Similar Documents
20 similar documents found (search time: 126 ms)
1.
This paper describes the redesign of a systems engineering language designed to specify and analyse industrial systems. The main objective of this redesign was to enable mathematical reasoning about specifications. We discuss the original language, the requirements and design decisions, and the resulting syntax and semantics of the new version of the language. In particular, we elaborate on semantical aspects of its time model.

2.
Let (X, #) be an orthogonality space such that the lattice C(X, #) of closed subsets of (X, #) is orthomodular, and consider the free orthogonality monoid over (X, #). Let C0 denote the subset of the lattice of closed subsets of this monoid consisting of all closures of bounded orthogonal sets. We show that C0 is a suborthomodular lattice, and we provide a necessary and sufficient condition for C0 to carry a full set of dispersion-free states. The work of the second author on this paper was supported by National Science Foundation Grant GP-9005.

3.
In this paper, we propose a two-layer sensor fusion scheme for multiple-hypothesis multisensor systems. To reflect reality in decision making, uncertain decision regions are introduced into the hypothesis testing process. The entire decision space is partitioned into distinct correct, uncertain and incorrect regions. The first layer of decisions is made by each sensor independently, based on a set of optimal decision rules. The fusion process is performed by treating the fusion center as an additional virtual sensor in the system; this virtual sensor makes its decision based on the decisions reached by the set of sensors in the system. The optimal decision rules are derived by minimizing the Bayes risk function. As a consequence, the performance of the system, as well as of the individual sensors, can be quantified by the probabilities of correct, incorrect and uncertain decisions. Numerical examples of three-hypothesis, two- and four-sensor systems are presented to illustrate the proposed scheme.
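As a rough illustration of the idea in this abstract, the sketch below implements a single-sensor decision rule with an explicit "uncertain" region and a naive Bayes fusion step over the local decisions. The likelihood matrices, thresholds, priors and example numbers are invented for illustration; they are not the optimal rules derived in the paper.

```python
import numpy as np

HYPOTHESES = 3  # H0, H1, H2

def local_decision(posterior, confidence=0.6):
    """First layer: a sensor declares H_k only if its posterior mass
    exceeds a confidence threshold; otherwise it reports 'uncertain' (-1)."""
    k = int(np.argmax(posterior))
    return k if posterior[k] >= confidence else -1

def fuse(decisions, decision_likelihood, prior):
    """Second layer: the fusion center acts as a virtual sensor whose
    'observation' is the tuple of local decisions. decision_likelihood[j][k, d]
    is P(sensor j reports d | H_k), with column index HYPOTHESES meaning 'uncertain'."""
    log_post = np.log(prior)
    for j, d in enumerate(decisions):
        col = HYPOTHESES if d == -1 else d
        log_post = log_post + np.log(decision_likelihood[j][:, col])
    post = np.exp(log_post - log_post.max())
    return post / post.sum()

# Example (all numbers made up): two sensors, three hypotheses.
prior = np.array([1/3, 1/3, 1/3])
# Row k: probabilities of reporting H0, H1, H2, or 'uncertain' under H_k.
L = np.array([[0.70, 0.05, 0.05, 0.20],
              [0.05, 0.70, 0.05, 0.20],
              [0.05, 0.05, 0.70, 0.20]])
decision_likelihood = [L, L]

sensor_posteriors = [np.array([0.55, 0.30, 0.15]),   # below threshold -> 'uncertain'
                     np.array([0.80, 0.15, 0.05])]   # confident in H0
decisions = [local_decision(p) for p in sensor_posteriors]
fused = fuse(decisions, decision_likelihood, prior)
print(decisions, local_decision(fused))
```

Counting how often the fused decision falls into the correct, incorrect or uncertain region over simulated trials would give the kind of performance probabilities the abstract mentions.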

4.
For branching-time temporal logic based on an Ockhamist semantics, we explore a temporal language extended with two additional syntactic tools. For reference to the set of all possible futures at a moment of time we use syntactically designated restricted variables called fan-names. For reference to all possible futures alternative to the actual one we use a modification of a difference modality, localized to the set of all possible futures at the actual moment of time. We construct an axiomatic system for this extended branching-time logic and prove its soundness and completeness with respect to bundle tree semantics. Finally, we show how our axiomatic system can be extended with a variety of important additional operators, such as Since and Until, a global difference operator, operators for undivided and divided histories, reference pointers, etc.

5.
When interpolating incomplete data, one can choose a parametric model, or opt for a more general approach and use a non-parametric model which allows a very large class of interpolants. A popular non-parametric model for interpolating various types of data is based on regularization, which looks for an interpolant that is both close to the data and also smooth in some sense. Formally, this interpolant is obtained by minimizing an error functional which is the weighted sum of a fidelity term and a smoothness term. The classical approach to regularization is: select optimal weights (also called hyperparameters) to assign to these two terms, and minimize the resulting error functional. However, using only the optimal weights does not guarantee that the chosen function will be optimal in some sense, such as the maximum likelihood criterion or the minimal square error criterion. For that, we have to consider all possible weights. The approach suggested here is to use the full probability distribution on the space of admissible functions, as opposed to the probability induced by using a single combination of weights. The reason is as follows: the weight actually determines the probability space in which we are working. For a given weight λ, the probability of a function f is proportional to exp(−λ ∫ f_uu² du), where f_uu denotes the second derivative (for the case of a function of one variable). For each different λ there is a different solution to the restoration problem; denote it by f_λ. Now, if we had known λ, it would not be necessary to use all the weights; however, all we are given are some noisy measurements of f, and we do not know the correct λ. Therefore, the mathematically correct solution is to calculate, for every λ, the probability that f was sampled from a space whose probability is determined by λ, and to average the different f_λ's weighted by these probabilities. The same argument holds for the noise variance, which is also unknown. Three basic problems are addressed in this work. (1) Computing the MAP estimate, that is, the function f maximizing Pr(f|D) when the data D is given; this problem is reduced to a one-dimensional optimization problem. (2) Computing the MSE estimate, defined at each point x as ∫ f(x) Pr(f|D) df; this problem is reduced to computing a one-dimensional integral. In the general setting, the MAP estimate is not equal to the MSE estimate. (3) Computing the pointwise uncertainty associated with the MSE solution; this problem is reduced to computing three one-dimensional integrals.
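To make the averaging over weights concrete, the sketch below works through a discrete 1-D version: it forms a second-difference smoothness penalty, computes the regularized solution f_λ and a Gaussian marginal likelihood for each λ on a grid, and then averages the f_λ's by those weights. The test signal, noise level, λ grid, flat prior over λ and the small ridge that makes the prior proper are all assumptions of this sketch, not choices made in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of an unknown 1-D function on a regular grid.
n = 60
x = np.linspace(0.0, 1.0, n)
sigma = 0.1                                   # assumed noise standard deviation
d = np.sin(2 * np.pi * x) + sigma * rng.standard_normal(n)

# Second-difference operator: discrete analogue of the f_uu smoothness term.
D2 = np.diff(np.eye(n), 2, axis=0)

def solve_for_lambda(lam, eps=1e-6):
    """Regularized solution f_lambda minimizing ||f - d||^2/sigma^2 + f' P f,
    with prior precision P = lam*D2'D2 + eps*I (eps makes the prior proper)."""
    P = lam * D2.T @ D2 + eps * np.eye(n)
    A = np.eye(n) / sigma**2 + P
    f_lam = np.linalg.solve(A, d / sigma**2)
    # Log marginal likelihood of d under  d ~ N(0, sigma^2 I + P^{-1}).
    C = sigma**2 * np.eye(n) + np.linalg.inv(P)
    _, logdet = np.linalg.slogdet(C)
    loglik = -0.5 * (d @ np.linalg.solve(C, d) + logdet + n * np.log(2 * np.pi))
    return f_lam, loglik

lambdas = np.logspace(0, 6, 25)                      # assumed grid of weights
sols, logliks = zip(*(solve_for_lambda(lam) for lam in lambdas))
w = np.exp(np.array(logliks) - max(logliks))         # flat prior over lambda assumed
w /= w.sum()

f_single_best = sols[int(np.argmax(logliks))]        # estimate from one "best" weight
f_averaged = sum(wi * fi for wi, fi in zip(w, sols)) # average over all weights
print(float(np.abs(f_averaged - f_single_best).max()))
```

The gap printed at the end illustrates the abstract's point that committing to a single weight and averaging over all weights generally give different estimates.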

6.
Summary: For a family of languages ℒ, CAL(ℒ) is defined as the family of images of ℒ under nondeterministic two-way finite state transducers, while FINITE·VISIT(ℒ) is the closure of ℒ under deterministic two-way finite state transducers; CAL^0(ℒ) = ℒ and, for n ≥ 0, CAL^(n+1)(ℒ) = CAL^n(CAL(ℒ)). For any semiAFL ℒ, if FINITE·VISIT(ℒ) is properly contained in CAL(ℒ), then the families CAL^n(ℒ) form a proper hierarchy and, for every n ≥ 0, FINITE·VISIT(CAL^n(ℒ)) is properly contained in CAL^(n+1)(ℒ), which is in turn properly contained in FINITE·VISIT(CAL^(n+1)(ℒ)). If ℒ is a SLIP semiAFL, a weakly k-iterative full semiAFL, or a semiAFL contained in any full bounded AFL, then FINITE·VISIT(ℒ) is properly contained in CAL(ℒ) and, in the last two cases, the containment of ℒ in FINITE·VISIT(ℒ) is also proper. If ℒ is a substitution-closed full principal semiAFL properly contained in FINITE·VISIT(ℒ), then FINITE·VISIT(ℒ) is properly contained in CAL(ℒ). If ℒ is a substitution-closed full principal semiAFL generated by a language without an infinite regular set and ℒ1 is a full semiAFL, then ℒ is contained in CAL^m(ℒ1) if and only if it is contained in ℒ1. Among the applications of these results are the following. For each of the following families ℒ, CAL^n(ℒ) forms a proper hierarchy: ℒ = INDEXED, ℒ = ETOL, and any semiAFL ℒ contained in CF. The family CF is incomparable with CAL^m(NESA), where NESA is the family of one-way nonerasing stack languages, and INDEXED is incomparable with CAL^m(STACK), where STACK is the family of one-way stack languages. This work was supported in part by the National Science Foundation under Grants No. DCR74-15091 and MCS-78-04725.

7.
A type inference system and a big-step operational semantics for expressions of the Object Constraint Language (OCL), the declarative and navigational constraint language for the Unified Modeling Language (UML), are provided; the account is mainly based on OCL 1.4/5, but also includes the main features of OCL 2.0. The formal systems are parameterised in terms of UML static structures and UML object models, which are treated abstractly. It is proved that the operational semantics satisfies a subject reduction property with respect to the type inference system. Proceeding from the operational semantics and providing a denotational semantics, pure OCL 2.0 expressions are shown to exactly represent the primitive recursive functions, whereas pure OCL 1.4/5 expressions are Turing complete.

8.
How to Pass a Turing Test   (Cited by: 1; self-citations: 0; citations by others: 1)
I advocate a theory of syntactic semantics as a way of understanding how computers can think (and how the Chinese-Room-Argument objection to the Turing Test can be overcome): (1) Semantics, considered as the study of relations between symbols and meanings, can be turned into syntax – a study of relations among symbols (including meanings) – and hence syntax (i.e., symbol manipulation) can suffice for the semantical enterprise (contra Searle). (2) Semantics, considered as the process of understanding one domain (by modeling it) in terms of another, can be viewed recursively: the base case of semantic understanding – understanding a domain in terms of itself – is syntactic understanding. (3) An internal (or narrow), first-person point of view makes an external (or wide), third-person point of view otiose for purposes of understanding cognition.

9.
The simple rational partial functions accepted by generalized sequential machines are shown to coincide with the compositions P⁻¹, where P consists of the prefix codings. The rational functions accepted by generalized sequential machines are proved to coincide with the compositions P⁻¹, where is the family of endmarkers and is the family of removals of endmarkers. (The compositions are read from left to right.) We also show that P⁻¹ is the family of the subsequential functions. This work was partially supported by the Esprit Basic Research Action Working Group No. 3166 ASMICS, the CNRS and the Academy of Finland.

10.
Harnad's proposed robotic upgrade of Turing's Test (TT), from a test of linguistic capacity alone to a Total Turing Test (TTT) of linguistic and sensorimotor capacity, conflicts with his claim that no behavioral test provides even probable warrant for attributions of thought, because there is no evidence of consciousness besides private experience. Intuitive, scientific, and philosophical considerations Harnad offers in favor of his proposed upgrade are unconvincing. I agree with Harnad that distinguishing real from as-if thought on the basis of (presence or lack of) consciousness (thus rejecting Turing (behavioral) testing as sufficient warrant for mental attribution) has the skeptical consequence Harnad accepts: there is in fact no evidence for me that anyone else but me has a mind. I disagree with his acceptance of it! It would be better to give up the neo-Cartesian faith in private conscious experience underlying Harnad's allegiance to Searle's controversial Chinese Room Experiment than to give up all claim to know that others think. It would be better to allow that (passing) Turing's Test evidences, even strongly evidences, thought.

11.
Using a form of ST-operational semantics we develop a noninterleaving semantic theory of processes based on testing. This operational semantics is based on the assumption that all actions have a non-zero duration, and the allowed tests can therefore distinguish between the beginning and the termination of actions. The result is a semantic theory in which concurrency is differentiated from nondeterminism. We show that the semantic preorder based on these tests is preserved by so-called stable action refinements and may be characterised as the largest such preorder contained in the standard testing preorder. This work has been supported by the ESPRIT/BRA CEDISYS project and the EPSRC project HI6357.

12.
This paper discusses some of the key drivers that will enable businesses to operate effectively on-line, and looks at how the notion of a website will evolve into that of an on-line presence supporting the main activities of an organisation. This is placed in the context of the development of the information society, which will allow individuals, as consumers or employees, quick, inexpensive and on-demand access to vast quantities of entertainment, services and information. The paper draws on an example of these developments in Australasia.

13.
Synthetic Domain Theory (SDT) is a constructive variant of Domain Theory in which all functions are continuous, following Dana Scott's idea of domains as sets. Recently, more abstract axiomatizations have been suggested that encompass alternative notions of domain theory such as, for example, stable domain theory. In this article a logical and axiomatic version of SDT capturing the essence of Domain Theory à la Scott is presented. It is based on a sufficiently expressive version of constructive type theory and is fully implemented in the proof checker Lego. On top of this core SDT, denotational semantics and program verification can be – and in fact have been – developed in a purely formal, machine-checked way. The version of SDT we have chosen for this purpose is based on work by Reus and Streicher and can be regarded as an axiomatization of complete extensional PERs. This approach is a modification of Phoa's complete Σ-spaces and uses axioms introduced by Taylor.

14.
A well-known problem in default logic is the ability of naive reasoners to explain both g and ¬g from a set of observations. This problem is treated in at least two different ways within that camp. One approach is to examine the various explanations and choose among them on the basis of various explanation comparators. A typical comparator is choosing the explanation that depends on the most specific observation, similar to the notion of the narrowest reference class. Others examine default extensions of the observations and choose whatever is true in any extension, or what is true in all extensions, or what is true in preferred extensions. Default extensions are sometimes thought of as acceptable models of the world that are discarded as more knowledge becomes available. We argue that the notions of specificity and extension lack clear semantics. Furthermore, we show that the problems these ideas were supposed to solve can be handled easily within a probabilistic framework.
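A toy worked example of the probabilistic alternative (the Tweety-style scenario and all numbers are illustrative, not taken from the paper): conditioning on everything that is known automatically gives the most specific reference class its due, so no separate specificity principle is needed.

```python
from itertools import product

P_BIRD = 0.3  # assumed marginal probability of being a bird

def joint(bird, penguin, flies):
    """Assumed joint distribution over (bird, penguin, flies):
    non-birds are never penguins and never fly; 5% of birds are penguins;
    penguins fly with probability 0.02, other birds with probability 0.95."""
    if not bird:
        return 0.0 if (penguin or flies) else 1.0 - P_BIRD
    p = P_BIRD * (0.05 if penguin else 0.95)
    p_fly = 0.02 if penguin else 0.95
    return p * (p_fly if flies else 1.0 - p_fly)

def prob(event, given=lambda w: True):
    """P(event | given), by summing the joint over all eight worlds."""
    worlds = list(product([False, True], repeat=3))
    num = sum(joint(*w) for w in worlds if given(w) and event(w))
    den = sum(joint(*w) for w in worlds if given(w))
    return num / den

flies = lambda w: w[2]
# Knowing only "bird": flying is likely.
print(prob(flies, given=lambda w: w[0]))            # about 0.90
# Knowing "bird and penguin": the more specific evidence dominates.
print(prob(flies, given=lambda w: w[0] and w[1]))   # 0.02
```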

15.
This paper presents generated enhancements for robust two and three-quarter dimensional meshing, including: (1) automated interval assignment by integer programming for submapped surfaces and volumes, (2) surface submapping, and (3) volume submapping. An introduction to the simplex method, an optimization technique of integer programming, is presented. Simplification of complex geometry is required for the formulation of the integer programming problem. A method of i-j unfolding is defined which explains how irregular geometry can be realigned into a simplified form that is suitable for submap interval assignment solutions. Also presented are the processes by which submapping eliminates the decomposition of surface geometry, through a pseudodecomposition process, producing suitable mapped meshes. The process of submapping involves the creation of interpolated virtual edges, user-defined vertex types and i-j-k space traversals. The creation of interpolated virtual edges is the method by which submapping automatically subdivides surface geometry. The interpolated virtual edge is formulated according to an interpolation scheme using the node discretization of curves on the surface. User-defined vertex types allow direct user control of surface decomposition and interval assignment by modifying i-j-k space traversals. Volume submapping takes the geometry decomposition to a higher level by using mapped virtual surfaces to eliminate decomposition of complex volumes.
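As a rough illustration of what interval assignment solves, the sketch below brute-forces a tiny instance: a mapped face whose opposite logical sides are split into curves, where the integer interval counts on opposite sides must sum to the same total while staying close to goals derived from a target element size. The tiny instance and the brute-force search stand in for the integer-programming formulation described in the paper.

```python
from itertools import product

# Goal intervals per curve (e.g. curve length / target element size),
# grouped by which logical side of the face each curve lies on (assumed data).
goals_plus_i = [4.2, 2.9]     # two curves on the +i side
goals_minus_i = [6.8]         # one curve on the -i side

def assign_intervals(goals_a, goals_b, slack=3):
    """Pick integer intervals x_c >= 1 minimizing total deviation from the
    goals, subject to sum(side a) == sum(side b) (the mapping constraint)."""
    best, best_cost = None, float("inf")
    ranges = [range(max(1, round(g) - slack), round(g) + slack + 1)
              for g in goals_a + goals_b]
    na = len(goals_a)
    for xs in product(*ranges):
        if sum(xs[:na]) != sum(xs[na:]):
            continue
        cost = sum(abs(x - g) for x, g in zip(xs, goals_a + goals_b))
        if cost < best_cost:
            best, best_cost = xs, cost
    return best[:na], best[na:]

print(assign_intervals(goals_plus_i, goals_minus_i))
# ((4, 3), (7,)): both logical sides receive 7 intervals in total.
```

A real solver replaces the exhaustive search with integer programming so that many coupled side-sum constraints over a whole submapped model can be satisfied at once.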

16.
The adaptiveness of agents is one of the basic conditions for autonomy. This paper describes an approach to adaptiveness for Monitoring Cognitive Agents based on the notion of generic spaces. This notion allows the definition of virtual generic processes, so that any particular actual process is then a simple configuration of the generic process, that is to say a set of parameter values. Consequently, a generic domain ontology containing the generic knowledge for solving problems concerning the generic process can be developed. This leads to the design of the Generic Monitoring Cognitive Agent, a class of agents in which the whole knowledge corpus is generic. In other words, modeling a process within a generic space becomes configuring a generic process, and adaptiveness becomes genericity, that is to say independence with regard to technology. In this paper, we present an application of this approach to Sachem, a Generic Monitoring Cognitive Agent designed to help operators in operating a blast furnace. Specifically, the NeuroGaz module of Sachem is used to present the notion of a generic blast furnace. The adaptiveness of Sachem can then be seen in the low cost of deploying a Sachem instance on different blast furnaces and in the ability of NeuroGaz to solve problems and learn from various top-gas instrumentation.

17.
Previous studies of A* tree-searching have modeled heuristics as random variables. The average number of nodes expanded is expressed asymptotically in terms of distance to goal. The conclusion reached is that A* complexity is an exponential function of heuristic error: polynomial error implies exponential complexity, and logarithmic accuracy is required for polynomial complexity. This paper eliminates simplifying assumptions of earlier studies. Error is replaced by a concept called discrepancy, a measure of the relative attractiveness to A* of a node for expansion when that node is compared with competing nodes on the solution path. According to our model, in order to have polynomial A* complexity, it is not necessary to have the logarithmic accuracy described in previous studies. Another way is for a heuristic's values to vary, or cluster, near a central function which grows at least as fast as distance to goal. Generally, logarithmic variation or less is adequate. For one class of heuristics considered, the faster this central function grows, the more variation from it is tolerated. This research has been funded by NCR Corporation. This research has been partially funded by a grant from NCR Corporation.
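For readers who want the baseline algorithm whose node-expansion count is being analyzed, here is a generic A* search on an explicit graph. It is the standard textbook procedure, not the probabilistic tree model used in the paper; the grid example and Manhattan heuristic are assumptions for illustration.

```python
import heapq

def a_star(start, goal, neighbors, h):
    """Generic A*: neighbors(n) yields (successor, edge_cost) pairs and
    h(n) estimates the remaining distance to goal.
    Returns (path, cost, nodes_expanded)."""
    frontier = [(h(start), 0.0, start, None)]
    parent, g_best, expanded = {}, {start: 0.0}, 0
    while frontier:
        f, g, node, par = heapq.heappop(frontier)
        if g > g_best.get(node, float("inf")):
            continue                      # stale queue entry
        parent[node] = par
        expanded += 1
        if node == goal:
            path = [node]
            while parent[path[-1]] is not None:
                path.append(parent[path[-1]])
            return path[::-1], g, expanded
        for succ, cost in neighbors(node):
            ng = g + cost
            if ng < g_best.get(succ, float("inf")):
                g_best[succ] = ng
                heapq.heappush(frontier, (ng + h(succ), ng, succ, node))
    return None, float("inf"), expanded

# Toy example: 4-connected moves on a 5x5 grid with a Manhattan heuristic.
def neighbors(p):
    x, y = p
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= x + dx < 5 and 0 <= y + dy < 5:
            yield (x + dx, y + dy), 1.0

goal = (4, 4)
print(a_star((0, 0), goal, neighbors,
             lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])))
```

The `expanded` counter returned here is exactly the quantity whose growth, as a function of distance to goal and heuristic quality, the abstract discusses.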

18.
A linear evolution equation for a thermodynamic variable F, odd under time-reversal, is obtained from the exact equation derived by Robertson from the Liouville equation for the information-theoretic phase-space distribution. One obtains an exact expression for τ, the relaxation time for F. For very short τ, τ is time-independent for t > τ if C(t) ≡ ⟨F exp(−iLt) F⟩₀, the equilibrium time correlation, decays exponentially for t > τ; here L is the Liouville operator. So long as C(t) is such that τ decays rapidly to a steady-state value, the t → ∞ limit of τ agrees with the fluctuation-dissipation theorem in applications to fluid transport.
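Written out, the quantities the abstract refers to take the following standard form; the symbols τ for the relaxation time and L for the Liouville operator are assumed here, the relaxation equation is only schematic, and the paper's exact expression for τ is not reproduced.

```latex
C(t) \;=\; \big\langle F\, e^{-i\mathcal{L}t}\, F \big\rangle_{0},
\qquad
C(t) \;\approx\; C(0)\, e^{-t/\tau} \quad \text{for } t > \tau,
\qquad
\frac{dF}{dt} \;=\; -\,\frac{F}{\tau}\quad\text{(schematic relaxation form)}.
```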

19.
Summary: A great deal of space is needed to store the values of all attribute instances of an attributed tree at the corresponding nodes; for that reason, global cells are often used to store the values of attribute instances. But these global cells must contain the right value at the right time and, therefore, not all evaluation sequences of attribute instances are admissible if one uses global cells. In this paper we first study the problems that arise during the construction of such admissible evaluation sequences for attributed trees if no special property of the underlying attribute grammar (AG) is presumed. This leads to a number of restrictions on the practically allowed use of global cells. After that, we provide a method for the construction of admissible evaluation sequences for arbitrary attributed trees of given attribute grammars, if global cells are used in this restricted sense. The proposed method is independent of special classes of attribute grammars and can be used with arbitrary evaluator generators.

20.
The derivative-based approach to solving the optimal toll problem is demonstrated in this paper for a medium-scale network. It is shown that although the method works for most small problems with only a few links tolled, it fails to converge for larger-scale problems. This failure led to the development of an alternative genetic algorithm (GA) based approach for finding optimal toll levels for a given set of chargeable links. A variation on the GA-based approach is used to identify the best toll locations, making use of the location indices suggested by Verhoef (2002).
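A minimal sketch of the GA idea for toll-level setting is given below. The welfare function is a stand-in placeholder (in a real study it would come from solving the tolled network equilibrium), and the population size, mutation scale and toll bounds are arbitrary choices for the example, not values from the paper.

```python
import random

N_LINKS = 5          # number of chargeable links (assumed)
TOLL_MAX = 3.0       # upper bound on a toll, in arbitrary currency units

def welfare(tolls):
    """Placeholder objective: a real evaluation would solve the
    user-equilibrium assignment under these tolls and return social welfare."""
    return -sum((t - 1.2) ** 2 for t in tolls)   # toy surrogate, optimum at 1.2

def evolve(pop_size=30, generations=60, mut_scale=0.3):
    pop = [[random.uniform(0, TOLL_MAX) for _ in range(N_LINKS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=welfare, reverse=True)
        parents = scored[: pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_LINKS)       # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(N_LINKS)            # mutate one toll level
            child[i] = min(TOLL_MAX, max(0.0, child[i] + random.gauss(0, mut_scale)))
            children.append(child)
        pop = parents + children
    return max(pop, key=welfare)

best = evolve()
print([round(t, 2) for t in best], round(welfare(best), 4))
```

Searching over which links to toll, rather than only over toll levels, would add a binary location vector to each chromosome, which is the flavour of the location variant mentioned in the abstract.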

