1.
Conventional information science considers an information process but traditionally uses the probability measure for random events and Shannon's entropy measure as an uncertainty function of the states. Cutting the process into separated states discards the quantity of information concealed in the states' correlations, which holds hidden process information.

Up to now, the term “information process” has had neither a scientifically conclusive definition nor a description of its implicit structure.

The information process presented here performs logical operations on discrete information units (bits) to achieve a goal, integrating a sequence of mutually connected discrete symbols and extracting the process's hidden information within the structure of an information Observer. Probing time-space observations develop the units of a space-time geometry that memorizes logic.

The defined information process starts by generating observations of a random process through the logic of probing impulses, which sequentially cut the process's entropy measure and create discrete information units whose integration enfolds the Observer's geometrical information structure. The composite stages of the information process found here, together with the synthesized optimal process trajectory, minimize observation time in an artificially designed information Observer with intelligent searching logic. Analytical modeling, computer simulations, and experimental applications validate the results.


2.
The concept of a sensor with finite resolution is chosen as the cornerstone of a physical measurement theory. It is concluded that the concept of a band-limited channel is less appropriate for this purpose because it is defined with the help of specific mathematical operators on functions that are defined on an infinite interval.

It is shown that measurement in continuum physics requires transfer, transformation, and transport of energy to, by, and through a sensor: a non-equilibrium process. As theory requires equilibrium for any measurement, a paradox arises. It is indicated that experimental physics avoids this paradox with the help of a hierarchy of space and time scales. The definitions and propositions here are formulated in a similar way in order to avoid this equilibrium/non-equilibrium paradox.

The equivalence of the various forms of energy, as stated by the first law of thermodynamics and verified by experimental physics, renders interpretation of physical measurement relatively easy.

Signals are defined to be products of measurement. They are carriers of a finite amount of information. As the class of continuous functions generates an infinite amount of information, discrete functions are selected to represent sampled and digitized signals. If the sampling interval is of the order of the largest characteristic time scale of the total system, little information is lost by digitization. The number of significant digits is related to the resolution of the sensor and of the other parts of the measuring instrument. A measuring instrument is shown to be composed of a chain, or a set of chains, of sensors.
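The sampling-and-digitization step described here can be sketched in a few lines; the exponential test signal, sampling interval, and three-decimal resolution below are illustrative assumptions, not taken from the paper.

```python
import math

def sample_and_digitize(signal, t_end, dt, decimals):
    """Sample a continuous-time signal every dt seconds and round each
    sample to a fixed number of decimals, mimicking the finite
    resolution of a sensor and its digitizer."""
    samples = []
    t = 0.0
    while t <= t_end + 1e-12:  # small tolerance for float accumulation
        samples.append(round(signal(t), decimals))
        t += dt
    return samples

# A signal with characteristic time ~1 s, sampled every 0.5 s:
slow = lambda t: math.exp(-t)
readings = sample_and_digitize(slow, 2.0, 0.5, 3)
print(readings)  # [1.0, 0.607, 0.368, 0.223, 0.135]
```

With five samples of three decimals each, the digitized record carries only a finite number of bits, in line with the abstract's claim that signals are carriers of a finite amount of information.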


3.
In the practice of Japanese company-wide quality control, tools have been developed that help coordinate company-wide activities. One of the main functions of these tools is to organize the diversified information related to those activities.

This report discusses the basic recognition underlying the use of these tools, and presents, on a conceptual level, an essential philosophy and methodology for coordinating company-wide activities.

The basic idea of this report is to construct an additional subsystem that helps organize the diversified information of activities so that the activities can be coordinated easily.


4.
Many attempts [1, 7, 8, 35] have been made to overcome the limits imposed by the Turing Machine [34] on realising general mathematical functions and models of (physical) phenomena.

They center around the notion of computability.

In this paper we propose a new definition of computability which lays the foundations for a theory of cybernetic and intelligent machines in which the classical limits imposed by discrete algorithmic procedures are offset by the use of continuous operators on unlimited data. This data is supplied to the machine in a totally parallel mode, as a field or wave.

This theory of machines draws its concepts from category theory, Lie algebras, and general systems theory. It permits the incorporation of intelligent control into the design of the machine as a virtual element. The incorporated control can be realized in many (machine) configurations of which we give three:

a) a quantum mechanical realization appropriate to a possible understanding of the quantum computer and other models of the physical microworld,

b) a stochastic realization based on Kolmogorov-Gabor theory leading to a possible understanding of generalised models of the physical or thermodynamic macroworld, and lastly

c) a classical mechanical realization appropriate to the study of a new class of robots.

Particular applications at a fundamental level are cited in geometry, mathematics, biology, acoustics, aeronautics, quantum mechanics, general relativity, and Markov chains. The proposed theory therefore opens a new way towards understanding the processes that underlie intelligence.


5.
The last few years have seen the development of Discrete Event-Dynamic Net Systems [1, 2] as instruments for modeling complex systems. They are able to achieve the following objectives:

—formality of the modeling methodology

—ability to model static and dynamic aspects

—ability to pass between levels of differently rich structures by morphisms

—uniform representation of the communication process as

  —an information process

  —a decision process and

  —a control process

—homogeneity of the representation and modeling methods

—ability to derive qualitative and quantitative statements.

The foundation is provided by a Discrete Event-Dynamic Net System which includes the axiomatic declaration of general Petri nets. In order to calculate the structural and dynamic aspects, so-called Petri net machines are developed. It is shown that this approach can even be used to treat the following aspects:

—use of time during the process

—increase of costs during the generation and transportation of information

—augmentation, evaluation and transformation of information objects.

Recursive formulas are derived and some examples are calculated.
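A minimal token-game interpreter conveys the flavour of such Petri net machines; the two-place net below is a made-up example for illustration, not one of the paper's nets.

```python
def enabled(marking, pre):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking[p] >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume tokens from input places, produce
    tokens on output places, returning the new marking."""
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Toy net: place 'buf' feeds a transition (pre) that fills 'done' (post).
marking = {'buf': 2, 'done': 0}
pre, post = {'buf': 1}, {'done': 1}
while enabled(marking, pre):
    marking = fire(marking, pre, post)
print(marking)  # {'buf': 0, 'done': 2}
```

Time, cost, and information-object transformations, as listed above, would attach further annotations to transitions; the token game itself stays the same.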


6.
Because the range of mobile robot sensors is limited and navigation maps are not always accurate, autonomous navigation in dynamic and unknown environments is a big challenge. In this article, we propose two novel autonomous navigation algorithms, which are based on the analysis of three conditions for unobserved and uncertain environments during navigation.

The algorithm for dynamic environments uses the “known space” and “free space” conditions: it corrects false obstacles in the map when the conventionally planned path becomes blocked. The navigation algorithm for unknown environments uses the “unknown space” and “free space” conditions. We use the Monte Carlo method to evaluate the performance of our algorithms against other methods. Experimental results show that our autonomous navigation algorithms outperform the alternatives.
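Monte Carlo evaluation of a navigation method amounts to averaging outcomes over many randomized trials. The corridor model and blocking probability below are hypothetical stand-ins for the article's environments and algorithms.

```python
import random

def trial(rng, p_block, length):
    """One randomized trial: a corridor of `length` cells, each
    independently blocked with probability p_block; the run succeeds
    if every cell on the path is free."""
    return all(rng.random() > p_block for _ in range(length))

def monte_carlo_success(p_block, length, n_trials, seed=0):
    """Estimate the success rate by averaging over n_trials runs."""
    rng = random.Random(seed)  # seeded for reproducibility
    wins = sum(trial(rng, p_block, length) for _ in range(n_trials))
    return wins / n_trials

rate = monte_carlo_success(p_block=0.05, length=10, n_trials=10_000)
print(rate)  # close to the exact value 0.95**10 ≈ 0.599
```

The same scaffold extends to real planners: replace `trial` with one simulated navigation episode and compare estimated success rates across methods.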


7.
The authors introduce and explain core concepts of cybersecurity through six engaging practical scenarios. Presented as case studies, the scenarios illustrate how experts may reason through security challenges managing trust and information in the adversarial cyber world. The concepts revolve around adversarial thinking, including understanding the adversary; defining security goals; identifying targets, vulnerabilities, threats, and risks; and devising defenses. They also include dealing with confidentiality, integrity, availability (known as the “CIA triad”), authentication, key management, physical security, and social engineering. The authors hope that these scenarios will inspire students to explore this vital area more deeply.

The target audience is anyone who is interested in learning about cybersecurity, including those with little to no background in cybersecurity. This article will also interest those who teach cybersecurity and are seeking examples and structures for explaining its concepts. For students and educators, the authors include selected misconceptions they observed in student responses to scenarios. The contributions are novel educational case studies, not original technical research.

The scenarios comprise responding to an e-mail about lost luggage containing specifications of a new product, delivering packages by drones, explaining a suspicious database input error, designing a corporate network that separates public and private segments, verifying compliance with the Nuclear Test Ban Treaty, and exfiltrating a USB stick from a top-secret government facility.


8.
Like most criminal subsets of society, cybercriminals vary in motivation and capability. Cybercrime talent ranges from entrepreneurial “lone wolf” brute-force attackers such as Nathan Wyatt (who may or may not be affiliated with “The Dark Overlord”) to Evgeniy Mikhailovich Bogachev, the creator of the stealthy and sophisticated ZeuS Banking Trojan.

Law enforcement’s work in 2013 on dismantling the Citadel Banking Trojan, a variant of the ZeuS Banking Trojan, brought to light multiple tiers of cybercriminals. Dimitry Belorossov, a/k/a Rainerfox, was alleged to have operated a Citadel command-and-control server that ultimately controlled 7,000 victim computers. This cybercriminal essentially purchased Citadel; he was really nothing more than a “user.” One level up in cybercrime capability, we have the extradition and arrest of Mark Vartanyan, a/k/a Kolypto, who allegedly developed, improved, and maintained the Citadel Banking Trojan: a malware developer. Finally, at the top end of the cybercriminal scale is “Aquabox,” the alleged creator of the Citadel Banking Trojan. In 2013, the Citadel Banking Trojan was announced to be responsible for stealing $500 million. Aquabox is still at large.

This journal paper explores cybercriminal actors from a unique perspective: by the type of attack they conduct and the relationship of the malicious actor to the victim business. Understanding the relationship of the actor to the type of attack inflicted provides an understanding of the motivation of the individual or group. Understanding the motivation of these criminals can provide valuable insight into countering both insider and external threats.


9.
FUZZY SETS AND SYSTEMS*
The notion of fuzziness as defined in this paper relates to situations in which the source of imprecision is not a random variable or a stochastic process, but rather a class or classes which do not possess sharply defined boundaries, e.g., the “class of bald men,” or the “class of numbers which are much greater than 10,” or the “class of adaptive systems,” etc.

A basic concept which makes it possible to treat fuzziness in a quantitative manner is that of a fuzzy set, that is, a class in which there may be grades of membership intermediate between full membership and non-membership. Thus, a fuzzy set is characterized by a membership function which assigns to each object its grade of membership (a number lying between 0 and 1) in the fuzzy set.

After a review of some of the relevant properties of fuzzy sets, the notions of a fuzzy system and a fuzzy class of systems are introduced and briefly analyzed. The paper closes with a section dealing with optimization under fuzzy constraints in which an approach to problems of this type is briefly sketched.
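A membership function for the “class of numbers which are much greater than 10” might look as follows, together with the usual max/min/complement operations on grades; the particular grading curve is an illustrative choice, not taken from the paper.

```python
def much_greater_than_10(x):
    """Grade of membership in the fuzzy class 'numbers much greater
    than 10': 0 up to 10, then rising smoothly toward 1."""
    if x <= 10:
        return 0.0
    return 1.0 - 1.0 / (1.0 + (x - 10) ** 2 / 100.0)

# The usual operations on membership grades:
f_union = lambda a, b: max(a, b)   # union of fuzzy sets
f_inter = lambda a, b: min(a, b)   # intersection
f_compl = lambda a: 1.0 - a        # complement

print(much_greater_than_10(10))             # 0.0  (not a member)
print(much_greater_than_10(20))             # 0.5  (borderline)
print(round(much_greater_than_10(110), 2))  # 0.99 (nearly full member)
```

Unlike a classical set, every object gets a grade between 0 and 1 rather than a yes/no answer, which is exactly the quantitative handle on fuzziness the abstract describes.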


10.
The construction of automatic control and modelling environments has been attempted using shallow reasoning expert systems. The inadequacy of this approach for real-life systems has become apparent, and the need for deeper knowledge—which can only be obtained by extensive simulation—is now acknowledged.

The simulation process, which consists of model building and model selection, followed by the generation and execution of a software simulator, can be partially automated if an object-oriented methodology is adopted. In this article, a methodology is presented which is presently under investigation in the DESiRE (Dynamic Expert Systems in Robotic Experimentation) project.

In the modelling phase, a hierarchical, uniform way of describing and manipulating continuous and discrete models is needed, if the highly desirable reusability of submodels is to be achieved. This is only possible if a clear distinction is made between bare models (and information about their intrinsic coupling) and causal simulation experiment-related data.

In the simulation phase, before generating numerical simulation code, a symbolic reduction of the continuous parts of the model is performed, thus eliminating inaccuracy introduced by the untimely application of possibly unstable numerical algorithms.

Finally, from the reduced representation, executable simulator-objects are produced for use in a distributed environment.


11.
This paper is a study of an adaptive quality control system from the viewpoint of quality goals. The main purpose is to introduce a conceptual framework for setting quality goals that accord with the external environment and internal capacity of quality control systems. We apply a mathematical general systems approach.

The results of our paper are summarized as follows:

1) As an important piece of decision-making in an adaptive QCS, the decision-making for estimation of fitness and examination of attainability is proposed and formalized mathematically.

2) A refinement process is formalized in which models are revised according to changes in the market.

3) Basic steps for setting quality goals are obtained, based on the above formalization.


12.
Purpose: Identify location and intensity of discomfort experienced by healthy participants wearing cervical orthoses.

Method: Convenience sample of 34 healthy participants wore Stro II, Philadelphia, Headmaster, and AspenVista® cervical orthoses for four-hour periods. Participants reported discomfort level (scale 0–6) and location.

Results: Participants reported mean discomfort for all orthoses over the four-hour test between ‘a little discomfort’ and ‘very uncomfortable’ (mean discomfort score = 1.64, SD = 1.50). Seven participants prematurely stopped tests due to pain and six reported maximum discomfort scores. Significant linear increase in discomfort with duration of wear was found for all orthoses. Significantly less discomfort was reported with Stro II than Headmaster and Philadelphia. Age correlated with greater perceived discomfort. Orthoses differed in the location discomfort was experienced.

Conclusion: Existing cervical orthoses cause discomfort that is influenced by both design and duration of wear, with design the more significant factor. This work informed the design of a new orthosis and future orthosis developments.

Practitioner Summary: The purpose of this study was to gain greater knowledge about the discomfort caused by wearing existing neck orthoses, in order to inform the design and development of a new neck orthosis. This study gathers empirical data from a surrogate population and concludes that orthosis design is more influential than duration of wear.


13.
Managing chronic illness requires personal health information management (PHIM) to be performed by lay individuals. Paramount to understanding the PHIM process is understanding the sociotechnical system in which it frequently occurs: the home environment. We combined distributed cognition theory and the patient work system model to investigate how characteristics of the home interact with the cognitive work of PHIM. We used a 3D virtual reality CAVE that enabled participants who had been diagnosed with diabetes (N = 20) to describe how they would perform PHIM in the home context. We found that PHIM is distinctly cognitive work, and rarely performed ‘in the head’. Rather, features of the physical environment, tasks, people, and tools and technologies present, continuously shape and are shaped by the PHIM process. We suggest that approaches in which the individual (sans context) is considered the relevant unit of analysis overlook the pivotal role of the environment in shaping PHIM.

Practitioner Summary:

We examined how Personal Health Information Management (PHIM) is performed in the homes of diabetic patients. We found that approaches to studying cognition that focus on the individual, to the exclusion of their context, overlook the pivotal role of environmental, social, and technological features in shaping PHIM.


14.
The new discipline of reconstructability analysis has provided a powerful framework for the study of the relationships between parts and wholes. The concentration of effort has been on systems with probabilistic or possibilistic behavior functions. This paper extends aspects of reconstructability analysis to general functions which need not be behavior functions. We refer to a system with such a function as a g-system.

First, a g-system is transformed to a dimensionless form (borrowing a term from partial differential equations). A mathematical structure is then induced via a type of isomorphism onto this system which renders it amenable to analysis by established techniques in reconstructability analysis. Absolutely no restrictions are placed on the units or mathematical structure of the g-system. We refer to the system induced from the g-system as a Klir system or k-system. These systems are named in honor of the reconstructability analysis founder.

Finally, we explicate some uses of k-systems induced from given g-systems. k-systems have immediate applications (e.g. minimal storage of system information) but, more importantly, they render accessible new areas of system dynamics study.


15.
Possibilistic distributions admit both measures of uncertainty and (metric) distances defining their information closeness. For general pairs of distributions, these measures and metrics were first introduced in the form of integral expressions. Particularly important are pairs of distributions p and q which have consonant ordering: for any two events x and y in the domain of discourse, p(x) ⪋ p(y) if and only if q(x) ⪋ q(y). We call such distributions confluent and study their information distances.

This paper presents a discrete sum form of the uncertainty measures of arbitrary distributions, and uses it to obtain similar representations of metrics on the space of confluent distributions. Using these representations, a number of properties such as additivity, monotonicity, and a form of distributivity are proven. Finally, a branching property is introduced, which will serve (in a separate paper) to characterize possibilistic information distances axiomatically.
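As one concrete example of a discrete sum form, the sketch below computes the standard U-uncertainty of Higashi and Klir for a possibility distribution sorted in decreasing order; this is offered as an illustration of such a measure, not necessarily the exact expression derived in this paper.

```python
import math

def u_uncertainty(r):
    """U-uncertainty of a possibility distribution r:
    U(r) = sum_{i=2..n} r_i * log2(i / (i - 1)),
    with r sorted so that r_1 >= r_2 >= ... >= r_n."""
    r = sorted(r, reverse=True)
    return sum(r[i] * math.log2((i + 1) / i) for i in range(1, len(r)))

# Total ignorance over 4 alternatives carries log2(4) = 2 bits:
print(round(u_uncertainty([1, 1, 1, 1]), 6))      # 2.0
# A sharper (more informative) distribution carries less:
print(round(u_uncertainty([1.0, 0.5, 0.25]), 3))  # 0.646
```

The sum telescopes to log2(n) for total ignorance and vanishes for a crisp distribution, which is the behaviour one expects of a possibilistic uncertainty measure.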


16.
This article explains, demonstrates, and evaluates Chaum’s protocol for detecting a man-in-the-middle (MitM) of text-messaging network communications. MitM attacks pose serious risks to many network communications. Networks often mitigate these risks with robust protocols, such as TLS, which assume some type of public-key infrastructure that provides a mechanism for the authenticated exchange of public keys. By contrast, Chaum’s protocol aims to detect a MitM with minimal assumptions and technology, and in particular without assuming the authenticated exchange of public keys. Chaum assumes that the eavesdropper can “sound like” the communicants but that the eavesdropper cannot fabricate sensible conversations.

Using an encryption function and one-way function, Chaum’s protocol works in three phases. In Phase I, the communicants exchange their public keys. In Phase II, each communicant generates a random string. The first communicant cryptographically commits to that string, and sends the string to the other communicant after receiving the other’s string. In Phase III, using any of four different “scenarios” the communicants verify that each possesses the same two strings. The protocol forces any MitM to cause the communicants to possess different pairs of strings. The text-messaging scenario is similar to a forced-latency protocol proposed by Wilcox-O’Hearn in 2003.

This article implements and experimentally demonstrates the effectiveness of the third scenario, which uses timing to detect a MitM in text-messaging. Even assuming a MitM can send messages without any network latency, the protocol forces the MitM to cause delays noticeable by the communicants. This article is the first to explain, demonstrate, and evaluate Chaum’s protocol, which Chaum described only in an abandoned and nearly inscrutable patent application.
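The commit-then-reveal mechanism of Phase II can be sketched with a standard hash-based commitment; this particular construction is an illustrative assumption, since the description above does not fix a specific commitment scheme.

```python
import hashlib
import secrets

def commit(value: bytes):
    """Commit to `value` without revealing it: publish the digest
    H(nonce || value), keep (nonce, value) secret until the reveal."""
    nonce = secrets.token_bytes(16)
    digest = hashlib.sha256(nonce + value).hexdigest()
    return digest, nonce

def verify(digest: str, nonce: bytes, value: bytes) -> bool:
    """Reveal step: recompute the digest and compare."""
    return hashlib.sha256(nonce + value).hexdigest() == digest

# Phase II: Alice commits to her random string and sends the digest,
# revealing (nonce, string) only after receiving Bob's string.
alice_string = secrets.token_bytes(16)
digest, nonce = commit(alice_string)
print(verify(digest, nonce, alice_string))        # True
print(verify(digest, nonce, b"tampered string"))  # False
```

Because the digest binds Alice to her string before she sees Bob's, a man-in-the-middle cannot adapt one communicant's string to the other's after the fact, which is what forces the MitM to leave the two endpoints holding different string pairs.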


17.
The Problem

Internet of Things (IoT) is providing new services and insights by sensing contextual data, but users have growing concerns about privacy risks that need immediate attention.

The Reason

IoT devices and smart services can capture Personally Identifiable Information (PII) without the user's knowledge or consent. IoT technology has not yet reached the level of maturity needed to standardize security and privacy requirements.

The Solution

IoT Privacy by Design is a user-centric approach for enabling privacy, together with security and safety, as a ‘win-win’ positive outcome of IoT offerings, irrespective of business domain. The Proactive and Preventive Privacy (3P) Framework proposed in this paper should be adopted by IoT stakeholders to build end users’ trust and confidence in IoT devices and smart services.


18.
In this paper, we demonstrate how the proper modelling of general systems allows us to address the question of how the actions of many independent but interconnected agents contribute to a global behaviour. In particular, we apply the techniques of information theory to probabilistic automata to formalize and prove what has come to be known as the “Von Foerster conjecture”.

In the first part of this paper we describe Von Foerster's conjecture in its historical context. In the second part we restate it using formal definitions and prove it.
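Von Foerster's ("order from noise") conjecture is commonly phrased in terms of redundancy, R = 1 - H/H_max, with self-organization corresponding to R increasing over time. A sketch of that measure, using invented state distributions rather than the paper's automata:

```python
import math

def entropy(p):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def redundancy(p):
    """Redundancy R = 1 - H/H_max, with H_max = log2(n) for n states."""
    return 1.0 - entropy(p) / math.log2(len(p))

# As an automaton's state distribution sharpens, redundancy rises:
print(redundancy([0.25, 0.25, 0.25, 0.25]))        # 0.0 (no order)
print(round(redundancy([0.7, 0.1, 0.1, 0.1]), 3))  # 0.322 (order emerging)
```

A uniform distribution gives R = 0 (maximal disorder), while any concentration of probability mass raises R toward 1.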


19.
Book reviews     
Patent Harmonization Harold C. Wegner London, Sweet & Maxwell, 1993 xx + 376 pp., £70.00 (hardback)

Of Authors and Origins: essays on copyright law Brad Sherman & Alain Strowel (Eds) Oxford, Clarendon Press, 1994 xiv + 313 pp., £30.00 (hardback)

Future Air Navigation Systems Werner Guldimann & Stefan Kaiser Dordrecht, Martinus Nijhoff, 1993 Utrecht Studies in Air and Space Law, Vol. 13 viii + 281 pp., £76.50 (hardback)

The Law of Copyright Terence Prime London, Fourmat Publishing, 1992, xxiv + 313 pp., £30.00 (paperback)


20.
The problem of illegal waste burial is a serious threat to human health and ecosystems. These illegal activities occur most often in areas that have been heavily modified and are considered degraded, mainly quarries and landfills, even when these are licensed.

Identifying areas suspected of illegal waste burial often requires a great deal of time. In fact, the detection of suspicious areas is usually based on comparing available airborne images of high temporal and spatial resolution with field surveys, in order to capture significant signs of change such as deep soil disturbance in small excavation areas. These methods, however, take a long time to apply because large areas must be analysed sub-area by sub-area. Consequently, the wider phenomenon is lost from view: there is no complete picture of all the territory that has undergone heavy transformation, especially transformation due to illegal waste disposal. As a result, there are not enough data to evaluate strategically how to counteract such illegal activities.

Since these areas have already been heavily disturbed and subjected to continuous change over time, one or more detection techniques in a standardized procedure may be applicable only in some contexts. The image-comparison and survey method described above is certainly a useful tool for detecting areas affected by excavation or illegal waste disposal, and it also indicates where significant transformations have occurred in a short time.

The purpose of this work is to develop a processing procedure based on Landsat images, applicable to agro-ecosystems, that can produce results in heavily transformed areas where suspicion of illegal waste burial is high.
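At its core, detecting heavy soil transformation between two image dates is per-pixel change detection; the tiny synthetic grids and threshold below stand in for actual Landsat bands and the paper's procedure.

```python
def changed_pixels(before, after, threshold):
    """Flag pixels whose absolute difference between two co-registered
    image dates exceeds the given threshold."""
    return [
        [abs(a - b) > threshold for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(after, before)
    ]

# Two 3x3 synthetic reflectance grids; one corner pixel is 'excavated'.
before = [[10, 10, 10], [10, 10, 10], [10, 10, 10]]
after  = [[10, 10, 10], [10, 10, 10], [10, 10, 55]]
mask = changed_pixels(before, after, threshold=20)
n_changed = sum(v for row in mask for v in row)
print(n_changed)  # 1
```

Applied over a long Landsat time series, clusters of flagged pixels in quarry or landfill areas would mark the candidate sites where suspicion of illegal burial is high.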

