Similar Documents
Found 20 similar documents (search time: 78 ms)
1.
In this paper, an objective conception of contexts based loosely upon situation theory is developed and formalized. Unlike subjective conceptions, which take contexts to be something like sets of beliefs, contexts on the objective conception are taken to be complex, structured pieces of the world that (in general) contain individuals, other contexts, and propositions about them. An extended first-order language for this account is developed. The language contains complex terms for propositions, and the standard predicate ist that expresses the relation that holds between a context and a proposition just in case the latter is true in the former. The logic for the objective conception features a global classical predicate calculus, a local logic for reasoning within contexts, and axioms for propositions. The specter of paradox is banished from the logic by allowing ist to be nonbivalent in problematic cases: it is not in general the case, for any context c and proposition p, that either ist(c,p) or ist(c,¬p). An important representational capability of the logic is illustrated by proving an appropriately modified version of an illustrative theorem from McCarthy's classic Blocks World example.

2.
It is shown that the translation of an open default into a modal formula ∀x(Lα(x) ∧ LMβ₁(x) ∧ ... ∧ LMβₘ(x) → w(x)) gives rise to an embedding of open default systems into non-monotonic logics.

3.
Summary In this paper we study the generative capacity of EOL forms from two different points of view. On the one hand, we consider the generative capacity of special EOL forms which one could call linear-like and context-free-like, establishing the existence of a rich variety of non-regular sub-EOL language families. On the other hand, we propose the notion of a generator L of a language family ℒ. We mean by this that any synchronized EOL system generating L generates (if understood as an EOL form) all languages of ℒ. We characterize the generators of the family of regular languages, and prove that other well-known language families do not have generators. Partially supported under NSE Research Council of Canada, grant No. A-7700

4.
Dr. T. Ström 《Computing》1972,10(1-2):1-7
It is a commonly occurring problem to find good norms ‖·‖ or logarithmic norms μ(·) for a given matrix A, in the sense that they should be close to the spectral radius ρ(A) and the spectral abscissa α(A), respectively. Examples may be the certification that A is convergent, i.e. ρ(A) ≤ ‖A‖ < 1, or stable, i.e. α(A) ≤ μ(A) < 0. Often the ordinary norms do not suffice, and one would like to try simple modifications of them, such as applying an ordinary norm to a diagonally transformed matrix. This paper treats this problem for some of the ordinary norms.
Minimization of Norms and Logarithmic Norms by Diagonal Transformations
Summary A frequently occurring practical problem is the construction of good norms ‖·‖ and logarithmic norms μ(·) for a given matrix A. "Good" here means that ‖A‖ approximates the spectral radius ρ(A) = max |λᵢ| and μ(A) approximates the spectral abscissa α(A) = max Re λᵢ well. Examples arise for convergent matrices, where ρ(A) ≤ ‖A‖ < 1 is desired, and for stable matrices, where α(A) ≤ μ(A) < 0 is to be shown. We investigate here how far one can get with diagonal transformations and the most common norms.
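As a hedged numerical illustration of the inequality ρ(A) ≤ ‖A‖ and of how a diagonal similarity D⁻¹AD can pull an ordinary norm down toward the spectral radius, here is a minimal NumPy sketch; the matrix A and the scaling D are hand-picked assumptions for demonstration, not taken from the paper.

```python
import numpy as np

# Illustrative matrix; the data are assumptions, not from the paper.
A = np.array([[0.5, 10.0],
              [0.0,  0.6]])

rho = max(abs(np.linalg.eigvals(A)))      # spectral radius rho(A) = 0.6
norm_inf = np.linalg.norm(A, np.inf)      # ordinary norm ||A||_inf = 10.5, far above rho(A)

# A diagonal similarity D^-1 A D leaves the spectrum unchanged but can
# shrink the ordinary norm toward the spectral radius.
D = np.diag([1.0, 0.01])
A_scaled = np.linalg.inv(D) @ A @ D
norm_scaled = np.linalg.norm(A_scaled, np.inf)   # 0.6 < 1: certifies that A is convergent

def mu_inf(M):
    """Logarithmic norm associated with the infinity norm:
    mu_inf(M) = max_i ( m_ii + sum_{j != i} |m_ij| )."""
    n = M.shape[0]
    return max(M[i, i] + sum(abs(M[i, j]) for j in range(n) if j != i) for i in range(n))

print(rho, norm_inf, norm_scaled, mu_inf(A_scaled))
```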

5.
Consideration was given to scheduling by the criterion of uniform use of resources. It was assumed that each job is executed by a unit resource. Two types of inter-job dependences were studied: finish–finish (one job cannot be completed until the other is completed) and finish–start (one job cannot be started until the other is completed). To solve the problem, a geometrical method was proposed that reduces the solution to determining the shortest trajectory in a domain constructed from the network graph.

6.
This work is about a real-world application of automated deduction. The application is the management of documents (such as mathematical textbooks) as they occur in a readily available tool. In this Slicing Information Technology tool, documents are decomposed (sliced) into small units. A particular application task is to assemble a new document from such units in a selective way, based on the user's current interest and knowledge. It is argued that this task can be naturally expressed through logic, and that automated deduction technology can be exploited for solving it. More precisely, we rely on first-order clausal logic with some default negation principle, and we propose a model computation theorem prover as a suitable deduction mechanism. Beyond solving the task at hand as such, with this work we contribute to the quest for arguments in favor of automated deduction techniques in the real world. Also, we argue why we think that automated deduction techniques are the best choice here.

7.
Summary We present here an axiomatic approach which enables one to prove by formal methods that a program is totally correct (i.e., it terminates and is logically correct: it does what it is supposed to do). The approach is similar to Hoare's approach [3] for proving that a program is partially correct (i.e., that whenever it terminates it produces correct results). Our extension to Hoare's method lies in the possibility of proving both correctness and termination by one unified formalism. One can choose to prove total correctness by a single step, or by incremental proof steps, each step establishing more properties of the program.
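As an illustration of the partial-correctness/termination distinction (not the paper's formalism), here is a hedged Python sketch of an integer-division loop annotated with a Hoare-style invariant and a strictly decreasing variant; the asserts check both ingredients of total correctness at run time.

```python
def int_divide(x, y):
    """Compute quotient q and remainder r with x = q*y + r and 0 <= r < y.

    Partial correctness: the invariant x == q*y + r and r >= 0 holds at every
    iteration, so if the loop exits, the postcondition follows.
    Termination: the variant r is a non-negative integer that strictly
    decreases by y >= 1 each iteration, so the loop must exit.
    """
    assert x >= 0 and y >= 1                      # precondition
    q, r = 0, x
    while r >= y:
        assert x == q * y + r and r >= 0          # invariant
        variant_before = r
        q, r = q + 1, r - y
        assert r < variant_before                 # variant strictly decreases
    assert x == q * y + r and 0 <= r < y          # postcondition (total correctness)
    return q, r

print(int_divide(17, 5))   # (3, 2)
```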

8.
In many applications one has a set of discrete points at which some variable such as pressure or velocity is measured. In order to graphically represent and display such data (say, as contours of constant pressure), the discrete data must be represented by a smooth function. This continuous surface can then be evaluated at any point for graphical display. Sometimes data are arbitrarily located except that they occur along non-intersecting lines, an example occurring in wind tunnel tests where data are recorded at plug taps on an aircraft body. An algorithm is developed for this type of structured data problem and illustrated by means of color computer graphics.
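The paper develops its own algorithm for data structured along non-intersecting lines; as a generic illustration of turning scattered measurements into a smooth surface that can be contoured, here is a sketch using scipy.interpolate.griddata (a stand-in technique, not the authors' method; the sample points and the "pressure" field are synthetic assumptions).

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)

# Synthetic measurements: a scalar field sampled at scattered points (illustrative).
pts = rng.uniform(0.0, 1.0, size=(200, 2))
pressure = np.sin(3 * pts[:, 0]) * np.cos(3 * pts[:, 1])

# Evaluate a smooth interpolant on a regular grid for contour display.
gx, gy = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
surface = griddata(pts, pressure, (gx, gy), method='cubic')
# Note: grid points outside the convex hull of the samples come back as NaN.

print(surface.shape)          # (50, 50) grid ready for contour plotting
```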

9.
In a model for a measure of computational complexity Φ, for a partial recursive function t, let R_t denote all partial recursive functions having the same domain as t and computable within time t. Let 𝓡 = {R_t | t is recursive} and let 𝓡* = {R_Φ_i | Φ_i is actually the running time function of a computation}. 𝓡 and 𝓡* are partially ordered under set-theoretic inclusion. These partial orderings have been extensively investigated by Borodin, Constable and Hopcroft in [3]. In this paper we present a simple uniform proof of some of their results. For example, we give a procedure for easily calculating a model of computational complexity for which one of these orderings is not dense while the other is dense. In our opinion, our technique is so transparent that it indicates that certain questions of density are not intrinsically interesting for general abstract measures of computational complexity. (This is not to say that similar questions are necessarily uninteresting for specific models.) Supported by NSF Research Grants GP6120 and GJ27127.

10.
This paper describes an approach for tracking rigid and articulated objects using a view-based representation. The approach builds on and extends work on eigenspace representations, robust estimation techniques, and parameterized optical flow estimation. First, we note that the least-squares image reconstruction of standard eigenspace techniques has a number of problems, and we reformulate the reconstruction problem as one of robust estimation. Second, we define a subspace constancy assumption that allows us to exploit techniques for parameterized optical flow estimation to simultaneously solve for the view of an object and the affine transformation between the eigenspace and the image. To account for large affine transformations between the eigenspace and the image, we define a multi-scale eigenspace representation and a coarse-to-fine matching strategy. Finally, we use these techniques to track objects over long image sequences in which the objects simultaneously undergo both affine image motions and changes of view. In particular, we use this EigenTracking technique to track and recognize the gestures of a moving hand.
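As a minimal sketch of two ingredients mentioned here, reconstruction in a learned eigenspace and its robust reformulation, the fragment below fits PCA coefficients by iteratively reweighted least squares so that outlier pixels are down-weighted. The synthetic data, the Geman-McClure-style weights, and the omission of the affine warp are simplifying assumptions; this is not the paper's full EigenTracking formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic low-rank "image" data: 100 images of 256 pixels lying near a 10-dim subspace.
components = rng.normal(size=(10, 256))
coeffs = rng.normal(size=(100, 10))
train = coeffs @ components + 0.01 * rng.normal(size=(100, 256))

# Learn the eigenspace (PCA basis) from the training images.
mean = train.mean(axis=0)
U, s, Vt = np.linalg.svd(train - mean, full_matrices=False)
B = Vt[:10].T                                        # 256 x 10 basis with orthonormal columns

def robust_reconstruct(image, B, mean, sigma=0.5, iters=10):
    """Eigenspace reconstruction via iteratively reweighted least squares,
    down-weighting outlier pixels (Geman-McClure-style weights)."""
    r = image - mean
    w = np.ones_like(r)
    for _ in range(iters):
        Bw = B * w[:, None]                          # diag(w) @ B
        c = np.linalg.solve(Bw.T @ B, Bw.T @ r)      # weighted normal equations
        resid = r - B @ c
        w = sigma**2 / (sigma**2 + resid**2)**2      # large residuals get tiny weight
    return mean + B @ c

test = train[0].copy()
test[:20] += 5.0                                     # occlusion-like corruption of 20 pixels
rec_robust = robust_reconstruct(test, B, mean)
rec_ls = mean + B @ (B.T @ (test - mean))            # plain least-squares reconstruction
print(np.abs(rec_robust - train[0]).mean())          # small: corrupted pixels are down-weighted
print(np.abs(rec_ls - train[0]).mean())              # larger: the outliers bias the fit
```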

11.
We provide techniques to integrate resolution logic with equality in type theory. The results may be rendered as follows: a clausification procedure in type theory, equipped with a correctness proof, all encoded using higher-order primitive recursion; a novel representation of clauses in minimal logic such that the λ-representation of resolution steps is linear in the size of the premisses; a translation of resolution proofs into lambda terms, yielding a verification procedure for those proofs; and availability of the power of resolution theorem provers in interactive proof construction systems based on type theory.

12.
Semantics connected to some information-based metaphor are well known in the logic literature: a paradigmatic example is the Kripke semantics for Intuitionistic Logic. In this paper we start from the concrete problem of providing suitable logic-algebraic models for the calculus of attribute dependencies in Formal Contexts with information gaps, and we obtain an intuitive model based on the notion of passage of information, showing that Kleene algebras, semi-simple Nelson algebras, three-valued Łukasiewicz algebras and Post algebras of order three are, in a sense, naturally and directly connected to partially defined information systems. In this way we can provide for these logic-algebraic structures a raison d'être different from the original motivations concerning, for instance, computability theory.

13.
The temporal property to-always has been proposed for specifying progress properties of concurrent programs. Although the to-always properties are a subset of the leads-to properties for a given program, to-always has more convenient proof rules and in some cases more accurately describes the desired system behavior. In this paper, we give a predicate transformer wta, derive some of its properties, and use it to define to-always. Proof rules for to-always are derived from the properties of wta. We conclude by briefly describing two application areas, nondeterministic data flow networks and self-stabilizing systems, where to-always properties are useful.

14.
This paper deals with the issue of generating one Pareto optimal point that is guaranteed to be in a desirable part of the Pareto set in a given multicriteria optimization problem. A parameterization of the Pareto set based on the recently developed normal-boundary intersection technique is used to formulate a subproblem, the solution of which yields the point of maximum bulge, often referred to as the knee of the Pareto curve. This enables the identification of the good region of the Pareto set by solving one nonlinear programming problem, thereby bypassing the need to generate many Pareto points. Further, this representation extends the concept of the knee to problems with more than two objectives. It is further proved that this knee is invariant with respect to the scales of the multiple objective functions. The generation of this knee, however, requires the value of each objective function at the minimizer of every objective function (the pay-off matrix). The paper characterizes situations when approximations to the function values comprising the pay-off matrix would suffice in generating a good approximation to the knee. Numerical results are provided to illustrate this point. Further, a weighted sum minimization problem is developed based on the information in the pay-off matrix, by solving which the knee can be obtained.
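As a hedged illustration of the pay-off matrix and of recovering a knee-like point from a single weighted-sum minimization (the paper's actual construction uses normal-boundary intersection; the objectives, bounds, and weight rule below are assumptions for demonstration):

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Two illustrative objectives on [0, 2] (assumptions, not from the paper).
f1 = lambda x: x**2
f2 = lambda x: (x - 2.0)**2

# Pay-off matrix: every objective evaluated at the minimizer of each objective.
x1_star = minimize_scalar(f1, bounds=(0, 2), method='bounded').x
x2_star = minimize_scalar(f2, bounds=(0, 2), method='bounded').x
payoff = np.array([[f1(x1_star), f2(x1_star)],
                   [f1(x2_star), f2(x2_star)]])
print("pay-off matrix:\n", payoff)

# Weighted-sum surrogate: take weights from the normal to the line joining the
# two pay-off points, so one minimization aims at the region of maximum bulge.
direction = payoff[1] - payoff[0]
normal = np.array([-direction[1], direction[0]])
w = np.abs(normal) / np.abs(normal).sum()

knee_x = minimize_scalar(lambda x: w[0] * f1(x) + w[1] * f2(x),
                         bounds=(0, 2), method='bounded').x
print("approximate knee at x =", knee_x, "objectives:", (f1(knee_x), f2(knee_x)))
```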

15.
New algorithms for stochastic approximation under input disturbance are designed. For the multidimensional case, they are simple in form, generate consistent estimates for unknown parameters under almost arbitrary disturbances, and are easily incorporated in the design of quantum devices for estimating the gradient vector of a function of several variables.
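The abstract leaves the algorithms unspecified; as a generic sketch of stochastic approximation with a randomized (simultaneous-perturbation-style) gradient estimate under measurement disturbance, one might write something like the following. The objective, gain sequences, and noise model are all illustrative assumptions, not the paper's algorithms.

```python
import numpy as np

rng = np.random.default_rng(2)
target = np.array([1.0, -2.0, 0.5])

def noisy_f(x):
    """Only disturbed measurements of the objective are available."""
    return np.sum((x - target)**2) + rng.normal(scale=0.1)

x = np.zeros(3)
for k in range(1, 2001):
    a_k = 0.5 / k                                    # decreasing step size
    c_k = 0.1 / k**0.25                              # decreasing perturbation size
    delta = rng.choice([-1.0, 1.0], size=3)          # random +/-1 perturbation directions
    # Randomized two-measurement gradient estimate (simultaneous perturbation)
    g_hat = (noisy_f(x + c_k * delta) - noisy_f(x - c_k * delta)) / (2 * c_k) * delta
    x = x - a_k * g_hat

print(x)   # drifts toward the minimizer [1.0, -2.0, 0.5] despite the measurement disturbance
```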

16.
This paper describes a unified variational theory for design sensitivity analysis of the nonlinear dynamic response of structural and mechanical systems, covering shape, non-shape, material and mechanical property selection, as well as control problems. The concept of an adjoint system, the principle of virtual work, and a Lagrangian-Eulerian formulation to describe the deformations and the design variations are used to develop a unified viewpoint. A general formula for design sensitivity analysis is derived and interpreted for the usual performance functionals. Analytical examples are utilized to demonstrate the use of the theory and give insights for application to more complex problems that must be treated numerically. Derivatives: the comma notation for partial derivatives is used, i.e. G,u = ∂G/∂u. An upper dot represents the material time derivative, i.e. ü = ∂²u/∂t². A prime implies the derivative with respect to the time measured in the reference time domain, i.e. u′ = du/dτ.

17.
One major task in requirements specification is to capture the rules relevant to the problem at hand. Declarative, rule-based approaches have been suggested by many researchers in the field. However, when it comes to modeling large systems of rules, not only for the behavior of the computer system but also for the organizational environment surrounding it, current approaches have problems with limited expressiveness and flexibility, and poor comprehensibility. Hence, rule-based approaches may benefit from improvements in two directions: (1) improvement of the rule languages themselves and (2) better integration with other, complementary modeling approaches. In this article, both issues are addressed in an integrated manner. The proposal is presented in the context of the Tempora project on rule-based information systems development, but has also been integrated with PPP. Tempora has provided a rule language based on an executable temporal logic working on top of a temporal database. The rule language is integrated with static (ER-like) and dynamic (SA/RT-like) modeling approaches. In the current proposal, the integration with complementary modeling approaches is extended by including organization modeling (actors, roles), and the expressiveness of the rule language is increased by introducing deontic operators and rule hierarchies. The main contribution of the article is not seen as any one of the above-mentioned extensions, but as the resulting comprehensive modeling support. The approach is illustrated by examples taken from an industrial case study done in connection with Tempora. C. List of Symbols: ⊆ Subset of set; ⊈ Not subset of set; ∈ Element of set; ∉ Not element of set; ≡ Equivalent to; ≢ Not equivalent to; ¬ Negation; ∧ Logical and; ∨ Logical or; → Implication; ◆ Sometime in past; ◇ Sometime in future; ■ Always in past; □ Always in future; Just before; Just after; U Until; S Since; Trigger; Condition; State condition; Consequence; Action; State; Role; Actor; General deontic operator; O Obligatory; R Recommended; P Permitted; D Discouraged; F Forbidden; General rule; t_R Real time; t_M Model time

18.
Summary When searching unsuccessfully for a fixed element in a random binary search tree, the number of comparisons made whose result is less is independent of the number of comparisons whose result is greater. This principle can be used to compute the mean and variance of the total number of comparisons involved in both a successful and an unsuccessful search. This work was supported in part by a Hertz Graduate Fellowship and by the Xerox Palo Alto Research Center.
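As an empirical illustration of the independence claim (a simulation, not the paper's analysis), the sketch below builds random binary search trees over the keys 0..199, searches unsuccessfully for the fixed key 100.5, and tallies the "less" and "greater" comparisons separately; their sample covariance comes out near zero, and their sum is the total comparison count.

```python
import random
from statistics import mean

def insert(tree, key):
    """Insert key into a binary search tree stored as nested dicts."""
    if tree is None:
        return {'key': key, 'left': None, 'right': None}
    if key < tree['key']:
        tree['left'] = insert(tree['left'], key)
    else:
        tree['right'] = insert(tree['right'], key)
    return tree

def search_counts(tree, target):
    """Unsuccessful search; count comparisons answering 'less' and 'greater' separately."""
    less = greater = 0
    node = tree
    while node is not None:
        if target < node['key']:
            less += 1
            node = node['left']
        else:
            greater += 1
            node = node['right']
    return less, greater

random.seed(0)
n, trials, target = 200, 2000, 100.5          # fixed search key, never present in the tree
samples = []
for _ in range(trials):
    keys = list(range(n))
    random.shuffle(keys)                      # random insertion order -> random BST
    tree = None
    for k in keys:
        tree = insert(tree, k)
    samples.append(search_counts(tree, target))

less = [a for a, _ in samples]
greater = [b for _, b in samples]
cov = mean(a * b for a, b in samples) - mean(less) * mean(greater)
print(mean(less), mean(greater), cov)         # covariance near zero, consistent with independence
```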

19.
The adaptiveness of agents is one of the basic conditions for their autonomy. This paper describes an approach to adaptiveness for Monitoring Cognitive Agents based on the notion of generic spaces. This notion allows the definition of virtual generic processes, so that any particular actual process is then a simple configuration of the generic process, that is to say a set of values of parameters. Consequently, a generic domain ontology containing the generic knowledge for solving problems concerning the generic process can be developed. This leads to the design of the Generic Monitoring Cognitive Agent, a class of agent in which the whole knowledge corpus is generic. In other words, modeling a process within a generic space becomes configuring a generic process, and adaptiveness becomes genericity, that is to say independence regarding technology. In this paper, we present an application of this approach to Sachem, a Generic Monitoring Cognitive Agent designed to help operators in operating a blast furnace. Specifically, the NeuroGaz module of Sachem will be used to present the notion of a generic blast furnace. The adaptiveness of Sachem can then be seen in the low cost of deploying a Sachem instance on different blast furnaces and in the ability of NeuroGaz to solve problems and learn from various top-gas instrumentation.

20.
Summary Many reductions among combinatorial problems are known in the context of NP-completeness. These reductions preserve the optimality of solutions. However, they may change the relative error of approximate solutions dramatically. In this paper, we apply a new type of reduction, called continuous reduction. When one problem is continuously reduced to another, any approximation algorithm for the latter problem can be transformed into an approximation algorithm for the former. Moreover, the performance ratio is preserved up to a constant factor. We relate the problem Minimum Number of Inverters in CMOS-Circuits, which arises in the context of logic synthesis, to several classical combinatorial problems such as Maximum Independent Set and Deletion of a Minimum Number of Vertices (Edges) in Order to Obtain a Bipartite (Partial) Subgraph.
