Similar Documents
20 similar documents found (search time: 31 ms)
1.
In this paper we deepen Mundici's analysis of the reducibility of the decision problem from infinite-valued Łukasiewicz logic Ł∞ to a suitable m-valued Łukasiewicz logic Łm, where m depends only on the length of the formulas to be proved. Using geometrical arguments we find a better upper bound for the least integer m such that a formula is valid in Ł∞ if and only if it is also valid in Łm. We also reduce the notion of logical consequence in Ł∞ to the same notion in a suitable finite set of finite-valued Łukasiewicz logics. Finally, we define an analytic and internal sequent calculus for infinite-valued Łukasiewicz logic.
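The m-valued semantics referred to here can be checked by brute force: in Łm the truth set is {0, 1/(m−1), …, 1}, negation is 1−x, and implication is min(1, 1−x+y). A minimal sketch (standard Łukasiewicz definitions, not the paper's geometric bound):

```python
from fractions import Fraction
from itertools import product

def implies(x, y):
    # Lukasiewicz implication: x -> y = min(1, 1 - x + y)
    return min(Fraction(1), 1 - x + y)

def strong_conj(x, y):
    # Lukasiewicz strong conjunction: x (x)y = max(0, x + y - 1)
    return max(Fraction(0), x + y - 1)

def valid_in_Lm(formula, num_vars, m):
    """Check validity of `formula` (a Python function on truth values)
    in the m-valued Lukasiewicz logic Lm, whose truth set is
    {0, 1/(m-1), ..., 1}: valid iff it evaluates to 1 everywhere."""
    values = [Fraction(i, m - 1) for i in range(m)]
    return all(formula(*vals) == 1
               for vals in product(values, repeat=num_vars))

# (p -> q) -> ((q -> r) -> (p -> r)) is a Lukasiewicz axiom: valid in every Lm
syllogism = lambda p, q, r: implies(implies(p, q),
                                    implies(implies(q, r), implies(p, r)))
# contraction p -> (p (x) p) already fails in L3 (take p = 1/2)
contraction = lambda p: implies(p, strong_conj(p, p))
```

Validity in Ł∞ can then be tested on the single Łm the paper's bound prescribes, rather than on all assignments over [0, 1].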

2.
A powerful methodology for scenario-based specification of reactive systems is described, in which the behavior is played in directly from the system's GUI or some abstract version thereof, and can then be played out. The approach is supported and illustrated by a tool, which we call the play-engine. As the behavior is played in, the play-engine automatically generates a formal version in an extended version of the language of live sequence charts (LSCs). As behaviors are played out, the play-engine causes the application to react according to the universal ("must") parts of the specification; the existential ("may") parts can be monitored to check their successful completion. Play-in is a user-friendly, high-level way of specifying behavior, and play-out is a rather surprising way of working with a fully operational system directly from its inter-object requirements. The ideas appear to be relevant to many stages of system development, including requirements engineering, specification, testing, analysis and implementation.

3.
In this paper, we propose a two-layer sensor fusion scheme for multiple-hypothesis multisensor systems. To reflect reality in decision making, uncertain decision regions are introduced into the hypothesis-testing process. The entire decision space is partitioned into distinct correct, uncertain and incorrect regions. The first layer of decisions is made by each sensor independently, based on a set of optimal decision rules. The fusion process is performed by treating the fusion center as an additional virtual sensor in the system. This virtual sensor makes decisions based on the decisions reached by the set of sensors in the system. The optimal decision rules are derived by minimizing the Bayes risk function. As a consequence, the performance of the system as well as of individual sensors can be quantified by the probabilities of correct, incorrect and uncertain decisions. Numerical examples of three-hypothesis, two- and four-sensor systems are presented to illustrate the proposed scheme.
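The two-layer structure can be sketched as follows. The thresholds and the majority rule below are illustrative stand-ins; the paper derives the actual rules by minimizing a Bayes risk function:

```python
UNCERTAIN = -1  # sentinel for the uncertain decision region

def sensor_decision(posteriors, threshold=0.7):
    """First layer (illustrative rule, not the Bayes-optimal one): each
    sensor picks the most probable hypothesis, but reports UNCERTAIN
    when no posterior clears the threshold."""
    best = max(range(len(posteriors)), key=lambda h: posteriors[h])
    return best if posteriors[best] >= threshold else UNCERTAIN

def fuse(decisions, num_hypotheses, quorum=0.5):
    """Second layer: the fusion center acts as a virtual sensor whose
    only input is the vector of first-layer decisions.  Here it votes;
    it too may land in the uncertain region."""
    votes = [0] * num_hypotheses
    for d in decisions:
        if d != UNCERTAIN:
            votes[d] += 1
    if sum(votes) == 0:
        return UNCERTAIN
    best = max(range(num_hypotheses), key=lambda h: votes[h])
    return best if votes[best] / len(decisions) > quorum else UNCERTAIN
```

With this shape, the probabilities of correct, incorrect and uncertain outcomes can be estimated per sensor and for the fusion center alike, mirroring the performance quantification described above.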

4.
The tensor language of system engineering is applied for formalized structural synthesis of a pipeline processor for measurement data. The formalized synthesis procedure has produced efficient data-processing block pipeline and block pipeline assembly structures.

Translated from Kibernetika i Sistemnyi Analiz, No. 6, pp. 29–45, November–December, 1991.

5.
We show that a mixed state ρ = Σmn amn|m⟩⟨n| can be realized by an ensemble of pure states {pk, |φk⟩}. Employing this form, we discuss the relative entropy of entanglement of Schmidt-correlated states. We also calculate the distillable entanglement of a class of mixed states. PACS: 03.67.-a; 03.65.Bz; 03.65.Ud
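The realization of a mixed state by a pure-state ensemble can be verified numerically. The sketch below uses the eigendecomposition, which gives one valid ensemble among infinitely many; it does not reproduce the paper's specific construction for Schmidt-correlated states:

```python
import numpy as np

def ensemble_from_eigen(rho):
    """One ensemble {p_k, |phi_k>} realizing rho: its spectral
    decomposition (eigenvalues as weights, eigenvectors as pure states)."""
    w, v = np.linalg.eigh(rho)
    return [(p, v[:, i]) for i, p in enumerate(w) if p > 1e-12]

def density_from_ensemble(ensemble):
    """Reassemble sum_k p_k |phi_k><phi_k| from an ensemble."""
    return sum(p * np.outer(phi, phi.conj()) for p, phi in ensemble)

# a 2x2 density matrix: Hermitian, positive, unit trace
rho = np.array([[0.7, 0.2], [0.2, 0.3]])
ens = ensemble_from_eigen(rho)
assert np.allclose(density_from_ensemble(ens), rho)
```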

6.
Self-Repairing Mechanical Systems
This paper reviews several types of self-repairing systems developed in the Mechanical Engineering Laboratory. We have developed a modular system capable of self-assembly and self-repair. The former means a set of units can form a given shape of the system without outside help; the latter means the system restores the original shape if an arbitrary part of the system is cut off. We show both two-dimensional and three-dimensional unit designs, and distributed algorithms for the units.

7.
We analyze four İnce Memed novels of Yaşar Kemal using six style markers: most frequent words, syllable counts, word type – or part of speech – information, sentence length in terms of words, word length in text, and word length in vocabulary. For analysis we divide each novel into five-thousand-word text blocks and count the frequencies of each style marker in these blocks. The style markers showing the best separation are most frequent words and sentence lengths. We use stepwise discriminant analysis to determine the best discriminators of each style marker. We then use these markers in cross-validation-based discriminant analysis. Further investigation based on multivariate analysis of variance (MANOVA) reveals how the attributes of each style marker group distinguish among the volumes.
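The block-based feature extraction described above can be sketched directly. The marker list here covers only two of the six markers (most-frequent-word rates and mean sentence length), and the chosen function words are placeholders, not the study's actual discriminators:

```python
from collections import Counter

def block_features(text, block_size=5000, top_words=("the", "and", "of")):
    """Split a text into consecutive `block_size`-word blocks and, per
    block, record relative frequencies of selected frequent words and a
    rough mean sentence length in words (periods as sentence ends)."""
    words = text.lower().split()
    blocks = [words[i:i + block_size]
              for i in range(0, len(words) - block_size + 1, block_size)]
    features = []
    for block in blocks:
        counts = Counter(block)
        rates = {w: counts[w] / len(block) for w in top_words}
        sentences = " ".join(block).count(".") or 1
        rates["mean_sentence_len"] = len(block) / sentences
        features.append(rates)
    return features
```

Each block then contributes one observation vector, which is what the stepwise discriminant analysis and MANOVA operate on.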

8.
This paper proposes the use of accessible information (data/knowledge) to infer inaccessible data in a distributed database system. Inference rules are extracted from databases by means of knowledge discovery techniques. These rules can derive inaccessible data due to a site failure or network partition in a distributed system. Such query answering requires combining incomplete and partial information from multiple sources. The derived answer may be exact or approximate. Our inference process involves two phases to reason with reconstructed information. One phase involves using local rules to infer inaccessible data. A second phase involves merging information from different sites. We shall call such reasoning processes cooperative data inference. Since the derived answer may be incomplete, new algebraic tools are developed for supporting operations on incomplete information. A weak criterion called toleration is introduced for evaluating the inferred results. The conditions that assure the correctness of combining partial results, known as sound inference paths, are developed. A solution is presented for terminating an iterative reasoning process on derived data from multiple knowledge sources. The proposed approach has been implemented on a cooperative distributed database testbed, CoBase, at UCLA. 
The experimental results validate the feasibility of the proposed concept, which can significantly improve the availability of distributed knowledge-base/database systems.

List of notation:
- Mapping
- --< Logical implication
- = Symbolic equality
- ==< Inference path
- Satisfaction
- Toleration
- Undefined (does not exist)
- Variable-null (may or may not exist)
- * Subtuple relationship
- * s-membership
- s-containment
- Open subtuple
- Open s-membership
- Open s-containment
- P Open base
- P Program
- I Interpretation
- DIP Data inference program
- t Tuples
- R Relations
- Ø Empty interpretation
- Open s-union
- Open s-interpretation
- Set of mappings from the set of objects to the set of closed objects
- W Set of attributes
- W Set of sound inference paths on the set of attributes W
- Set of relational schemas in a DB that satisfy MVD
- + Range closure of W wrt
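The local-rule phase of cooperative data inference can be sketched in miniature. The rule table, attribute names, and exact/approximate tagging below are hypothetical illustrations, not CoBase's actual API:

```python
# Rules of this kind would be mined offline from the accessible data by
# knowledge discovery techniques; here they map rank -> salary band.
RULES = {
    "assistant": "40k-60k",
    "associate": "60k-80k",
    "full": "80k-120k",
}

def answer(record, attribute, site_up):
    """Answer a query for `attribute`.  If the site holding it is down,
    fall back on a local inference rule over accessible attributes and
    tag the result approximate rather than exact."""
    if site_up:
        return record[attribute], "exact"
    if attribute == "salary" and record.get("rank") in RULES:
        return RULES[record["rank"]], "approximate"
    return None, "unknown"

rec = {"name": "Lee", "rank": "associate", "salary": "72k"}
assert answer(rec, "salary", site_up=True) == ("72k", "exact")
assert answer(rec, "salary", site_up=False) == ("60k-80k", "approximate")
```

The second phase, merging such partial answers across sites, is where the paper's toleration criterion and sound inference paths come in; this sketch covers only the single-site case.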

9.
The adaptiveness of agents is one of the basic conditions for autonomy. This paper describes an approach to adaptiveness for Monitoring Cognitive Agents based on the notion of generic spaces. This notion allows the definition of virtual generic processes, so that any particular actual process is then a simple configuration of the generic process, that is to say a set of parameter values. Consequently, a generic domain ontology containing the generic knowledge for solving problems concerning the generic process can be developed. This leads to the design of the Generic Monitoring Cognitive Agent, a class of agent in which the whole knowledge corpus is generic. In other words, modeling a process within a generic space becomes configuring a generic process, and adaptiveness becomes genericity, that is to say independence regarding technology. In this paper, we present an application of this approach to Sachem, a Generic Monitoring Cognitive Agent designed to help operators in operating a blast furnace. Specifically, the NeuroGaz module of Sachem is used to present the notion of a generic blast furnace. The adaptiveness of Sachem can then be seen in the low cost of deploying a Sachem instance on different blast furnaces and in the ability of NeuroGaz to solve problems and learn from various top gas instrumentation.

10.
This paper describes the redesign of a systems engineering language designed to specify and analyse industrial systems. The main objective of the redesign was to enable mathematical reasoning about specifications. We discuss the original language, the requirements and design decisions, and the resulting syntax and semantics of the new version of the language. In particular, we elaborate on semantic aspects of its time model.

11.
For a given polynomial-time computable honest function, the complexity of its max inverse function is compared with that of the other inverse functions. Two structural results are shown which suggest that the max inverse function is not the easiest.The preliminary version of this paper was presented at the International Symposium SIGAL 90 [WT]. Osamu Watanabe was supported in part by a Grant in Aid for Scientific Research of the Ministry of Education, Science and Culture of Japan under Grant-in-Aid for Co-operative Research (A) 02302047 (1990).

12.
Existing methods for exploiting flawed domain theories depend on the use of a sufficiently large set of training examples for diagnosing and repairing flaws in the theory. In this paper, we offer a method of theory reinterpretation that makes only marginal use of training examples. The idea is as follows: often a small number of flaws in a theory can completely destroy the theory's classification accuracy. Yet it is clear that valuable information is available even from such flawed theories. For example, an instance with several independent proofs in a slightly flawed theory is certainly more likely to be correctly classified as positive than an instance with only a single proof. This idea can be generalized to a numerical notion of "degree of provedness", which measures the robustness of proofs or refutations for a given instance. This degree of provedness can be easily computed using a soft interpretation of the theory. Given a ranking of instances based on the values so obtained, all that is required to classify instances is to determine some cutoff threshold above which instances are classified as positive. Such a threshold can be determined on the basis of a small set of training examples. For theories with a few localized flaws, we improve the method by rehardening: interpreting only parts of the theory softly, while interpreting the rest of the theory in the usual manner. Isolating those parts of the theory that should be interpreted softly can be done on the basis of a small number of training examples. Softening, with or without rehardening, can be used by itself as a quick way of handling theories with suspected flaws when few training examples are available. Additionally, softening and rehardening can be used in conjunction with other methods as a meta-algorithm for determining which theory revision methods are appropriate for a given theory.
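One plausible reading of such a soft interpretation (an illustrative choice, not necessarily the paper's exact definition) scores a conjunction by the product of its antecedents' degrees and combines multiple proofs of the same atom by noisy-OR, so that independent proofs reinforce each other:

```python
def degree(theory, atom, facts, depth=10):
    """Degree of provedness of `atom` under a soft interpretation:
    AND = product over a rule body, and the degrees of the several
    rules (proofs) for an atom combine by noisy-OR, so an instance
    with several independent proofs scores higher than one with a
    single proof.  `theory` maps each atom to a list of rule bodies."""
    if atom in facts:
        return facts[atom]
    if depth == 0 or atom not in theory:
        return 0.0
    no_proof = 1.0
    for body in theory[atom]:            # each body is one rule for atom
        d = 1.0
        for sub in body:
            d *= degree(theory, sub, facts, depth - 1)
        no_proof *= 1.0 - d
    return 1.0 - no_proof

theory = {"pos": [["a", "b"], ["c"]]}    # two independent proofs of "pos"
two_proofs = degree(theory, "pos", {"a": 0.9, "b": 0.9, "c": 0.9})
one_proof = degree(theory, "pos", {"a": 0.9, "b": 0.9, "c": 0.0})
assert two_proofs > one_proof            # more proofs -> more robust

classify = lambda d, cutoff=0.5: d >= cutoff   # cutoff fit on few examples
```

Rehardening would correspond to evaluating the trusted parts of `theory` with ordinary hard (0/1) semantics and applying the soft scoring only to the suspect rules.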

13.
This paper uses Thiele rational interpolation to derive a simple method for computing the Randles–Sevcik function π1/2χ(x), with relative error at most 1.9 × 10−5 for −∞ < x < ∞. We develop a piecewise approximation method for the numerical computation of π1/2χ(x) on the union (−∞, −10) ∪ [−10, 10] ∪ (10, ∞). This approximation is particularly convenient to employ in electrochemical applications, where four significant digits of accuracy are usually sufficient. Although this paper is primarily concerned with the approximation of the Randles–Sevcik function, some examples are included that illustrate how Thiele rational interpolation can be employed to generate useful approximations to other functions of interest in scientific work.
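Thiele interpolation builds a continued-fraction rational interpolant from inverted differences. A generic sketch of the method (the standard textbook algorithm, not the paper's tuned piecewise approximation):

```python
def thiele_coefficients(xs, fs):
    """Inverted differences a_k for the Thiele continued fraction
    f(x) ~ a0 + (x - x0)/(a1 + (x - x1)/(a2 + ...)).
    p[i][k] holds phi[x0, ..., x_{k-1}, x_i]."""
    n = len(xs)
    p = [[0.0] * n for _ in range(n)]
    for i in range(n):
        p[i][0] = fs[i]
    for k in range(1, n):
        for i in range(k, n):
            p[i][k] = (xs[i] - xs[k - 1]) / (p[i][k - 1] - p[k - 1][k - 1])
    return [p[k][k] for k in range(n)]

def thiele_eval(xs, a, x):
    """Evaluate the continued fraction from the innermost term outward."""
    val = a[-1]
    for k in range(len(a) - 2, -1, -1):
        val = a[k] + (x - xs[k]) / val
    return val

# 1/x is itself a rational function, so three nodes reproduce it exactly
xs, fs = [1.0, 2.0, 4.0], [1.0, 0.5, 0.25]
a = thiele_coefficients(xs, fs)
```

Because the interpolant is rational rather than polynomial, it can track functions with asymptotic or pole-like behavior with far fewer nodes, which is what makes it attractive for special functions like the one above.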

14.
In this paper, a high-speed digitally processed microscopic observation system for telemicrooperation is proposed, with a dynamic focusing system and a high-speed digital-processing system using the "depth from focus" criterion. In our previous work [10], we proposed a system that could simultaneously obtain an all-in-focus image as well as the depth of an object. In reality, in a microoperation it is not easy to obtain good visibility of objects with a microscope focused at a shallow depth, especially in microsurgery and DNA studies, among other procedures. In this sense, the all-in-focus image, which keeps an in-focus texture over the entire object, is useful for observing microenvironments with the microscope. However, one drawback of the all-in-focus image is that it carries no information about the object's depth. It is also important to obtain a depth map and show the 3D microenvironments from any view angle in real time to actuate the microobjects. Our earlier system with a dynamic focusing lens and a smart sensor could obtain the all-in-focus image and the depth in 2 s. To realize real-time microoperation, a system that could process at least 30 frames per second (60 times faster than the previous system) would be required. This paper briefly reviews the depth from focus criterion used to simultaneously achieve the all-in-focus image and the reconstruction of 3D microenvironments. After discussing the problem inherent in our earlier system, a frame-rate system constructed with a high-speed video camera and FPGA (field-programmable gate array) hardware is discussed. To adapt this system for use with the microscope, new criteria to solve the ghost problem in reconstructing the all-in-focus image are proposed. Finally, a microobservation demonstrates the validity of this system.

Received: 12 August 2001, Accepted: 17 July 2002, Published online: 12 November 2003. Correspondence to: Kohtaro Ohba
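The core of the depth-from-focus criterion can be sketched in a few lines: score every slice of a focal stack with a local contrast measure, then per pixel take the best-focused slice index as depth and the winning pixel values as the all-in-focus image. This generic sketch uses a Laplacian focus measure and ignores the paper's FPGA pipeline and ghost-suppression criteria:

```python
import numpy as np

def depth_from_focus(stack):
    """stack: array of shape (n_slices, H, W), one image per focus
    setting.  Returns (depth map of slice indices, all-in-focus image)."""
    stack = np.asarray(stack, dtype=float)
    # simple focus measure: magnitude of the discrete Laplacian
    # (periodic boundaries via roll, fine for an illustration)
    lap = np.abs(4 * stack
                 - np.roll(stack, 1, axis=1) - np.roll(stack, -1, axis=1)
                 - np.roll(stack, 1, axis=2) - np.roll(stack, -1, axis=2))
    depth = np.argmax(lap, axis=0)            # best-focused slice per pixel
    all_in_focus = np.take_along_axis(
        stack, depth[None, :, :], axis=0)[0]  # pick winning pixels
    return depth, all_in_focus
```

Running this at 30 frames per second over a swept-focus stack is exactly the workload the paper moves into high-speed camera plus FPGA hardware.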

15.
A central component of the analysis of panel clustering techniques for the approximation of integral operators is the so-called η-admissibility condition min{diam(τ), diam(σ)} ≤ 2η dist(τ, σ), which ensures that the kernel function is approximated only on those parts of the domain that are far from the singularity. Typical techniques based on a Taylor expansion of the kernel function require a subdomain to be far enough from the singularity that the parameter η has to be smaller than a given constant depending on properties of the kernel function. In this paper, we demonstrate that any η > 0 is sufficient if interpolation instead of Taylor expansion is used for the kernel approximation, which paves the way for grey-box panel clustering algorithms.
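The admissibility condition itself is a one-line geometric test. A minimal sketch for clusters given as point sets (brute-force diameters and distances, which real panel clustering codes replace with bounding-box estimates):

```python
import numpy as np

def admissible(tau, sigma, eta):
    """eta-admissibility: min{diam(tau), diam(sigma)} <= 2*eta*dist(tau, sigma)
    for two clusters given as point arrays of shape (n, d)."""
    def diam(pts):
        return max(np.linalg.norm(p - q) for p in pts for q in pts)
    dist = min(np.linalg.norm(p - q) for p in tau for q in sigma)
    return min(diam(tau), diam(sigma)) <= 2 * eta * dist

tau = np.array([[0.0, 0.0], [1.0, 0.0]])     # diam(tau) = 1
sigma = np.array([[5.0, 0.0], [6.0, 0.0]])   # dist(tau, sigma) = 4
assert admissible(tau, sigma, 0.5)           # 1 <= 2 * 0.5 * 4
assert not admissible(tau, sigma, 0.1)       # 1 >  2 * 0.1 * 4
```

The paper's point is visible here: with Taylor expansion the test only helps if η is small enough for the kernel at hand, whereas with interpolation any fixed η > 0 yields a convergent approximation on admissible blocks.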

16.
The regional and global environmental perturbations resulting from the effects of human economic activity on fundamental biological, chemical, and physical systems can no longer be ignored. Indeed, a complex and expanding set of statutory and regulatory responses, directed in large part at industrial and manufacturing activity, demonstrates society's increasing understanding of this process. These developments have driven the evolution of industry environmental compliance and management systems. In particular, integrated chemical management systems (ICMS), which consist of two subsystems (a data subsystem and a management subsystem), have become both more complex and more integrated into traditional business operating and management systems. Three stages in ICMS evolution are defined and described: stage I, the presystemic stage; stage II, the static system stage; and stage III, the interactive system stage.

17.
Summary. A framework is proposed for the structured specification and verification of database dynamics. In this framework, the conceptual model of a database is a many-sorted first-order linear tense theory whose proper axioms specify the update and triggering behaviour of the database. The use of conceptual modelling approaches for structuring such a theory is analysed. Semantic primitives based on the notions of event and process are adopted for modelling the dynamic aspects. Events are used to model both atomic database operations and communication actions (input/output). Nonatomic operations to be performed on the database (transactions) are modelled by processes in terms of trigger/reaction patterns of behaviour. The correctness of the specification is verified by proving that the desired requirements on the evolution of the database are theorems of the conceptual model. Besides the traditional data integrity constraints, requirements of the form "Under condition W, it is guaranteed that the database operation Z will be successfully performed" are also considered. Such liveness requirements have been ignored in the database literature, although they are essential to a complete definition of the database dynamics.

Notation

Classical logic symbols (Appendix 1):
- ∀ for all (universal quantifier)
- ∃ exists at least once (existential quantifier)
- ¬ no (negation)
- → implies (implication)
- ↔ is equivalent to (equivalence)
- ∧ and (conjunction)
- ∨ or (disjunction)

Tense logic symbols (Appendix 1):
- G always in the future
- G0 always in the future and now
- F sometime in the future
- F0 sometime in the future or now
- H always in the past
- H0 always in the past and now
- P sometime in the past
- P0 sometime in the past or now
- X in the next moment
- Y in the previous moment
- L always
- M sometime

Event specification symbols (Sects. 3 and 4.1):
- (x) means immediately after the occurrence of x
- (x) means immediately before the occurrence of x
- (x) means x is enabled, i.e., x may occur next
- {w1} x {w2} states that if w1 holds before the occurrence of x, then w2 will hold after the occurrence of x (change rule)
- [oa1, ..., oan] x states that only the object attributes oa1, ..., oan are modifiable by x (scope rule)
- {{w}} x states that if x may occur next, then w holds (enabling rule)

Process specification symbols (Sects. 5.3 and 5.4):
- :: for causal rules
- for behavioural rules

Transition-pattern composition symbols (Sects. 5.2 and 5.3):
- ; sequential composition
- ¦ choice composition
- parallel composition
- :| guarded alternative composition

Location predicates (Sect. 5.2):
- (z) means immediately after the occurrence of the last event of z (after)
- (z) means immediately before the occurrence of the first event of z (before)
- (z) means after the beginning of z and before the end of z (during)
- (z) means before the occurrence of an event of z (at)

18.
A version of topology's fundamental group is developed for digital images of dimension at most 3 in [7] and [8]. In the latter paper, it is shown that such a digital image X ⊂ Zk, k ≤ 3, has a continuous analog C(X) ⊂ Rk such that X has digital fundamental group isomorphic to π1(C(X)). However, the construction of the digital fundamental group in [7] and [8] does not greatly resemble the classical construction of the fundamental group of a topological space. In the current paper, we show how classical methods of algebraic topology may be used to construct the digital fundamental group. We construct the digital fundamental group based on the notions of digitally continuous functions presented in [10] and digital homotopy [3]. Our methods are very similar to those of [6], which uses different notions of digital topology. We show that the resulting theory of digital fundamental groups is related to that of [7] and [8] in that it yields isomorphic fundamental groups for the digital images considered in the latter papers (for certain connectedness types).

19.
Ward Elliott (from 1987) and Robert Valenza (from 1989) set out to find the true Shakespeare from among 37 anti-Stratfordian Claimants. As directors of the Claremont Shakespeare Authorship Clinic, Elliott and Valenza developed novel attributional tests, from which they concluded that most Claimants are not-Shakespeare. From 1990-4, Elliott and Valenza developed tests purporting further to reject much of the Shakespeare canon as not-Shakespeare (1996a). Foster (1996b) details extensive and persistent flaws in the Clinic's work: data were collected haphazardly; canonical and comparative text-samples were chronologically mismatched; procedural controls for genre, stanzaic structure, and date were lacking. Elliott and Valenza counter by estimating that the maximum erosion of the Clinic's findings comes to "five of our 54 tests, which can amount, at most, to half of one percent" (1998). This essay provides a brief history, showing why the Clinic foundered. Examining several of the Clinic's representative tests, I evaluate claims that Elliott and Valenza continue to make for their methodology. A final section addresses doubts about accuracy, validity and replicability that have dogged the Clinic's work from the outset.

20.
This paper reports on the status of The University of Texas at Arlington student effort to design, build and fly an Autonomous Aerial Vehicle. Both the 1991 entry into the First International Aerial Robotics Competition and the refinements being made for 1992 are described. Significant technical highlights include a real-time vision system for target objective tracking, a real-time ultrasonic locator system for position sensing, a novel mechanism for gradually moving from human to computer control, and a hierarchical control structure implemented on a 32-bit microcontroller. Detailed discussion of the design of multivariable automatic controls for stability augmentation is included. Position and attitude control loops are optimized according to combined H2 and H∞ criteria. We present a modification of a recently published procedure for recovering a desired open-loop transfer function shape within the framework of the mixed H2/H∞ problem. This work has led to a new result that frees a design parameter related to imposing the H∞ constraint. The additional freedom can be used to improve upon the performance and robustness characteristics of the system.
