Similar Documents
20 similar documents were retrieved for this query.
1.
2.
This paper presents an overview of the main results of the project Verification of ERLANG Programs, which is funded by the Swedish Business Development Agency (NUTEK) and by Ericsson within the ASTEC (Advanced Software TEChnology) initiative. Its main outcome is the ERLANG Verification Tool (EVT), a theorem prover which assists in obtaining proofs that ERLANG applications satisfy their correctness requirements formulated as behavioural properties in a modal logic with recursion. We give a summary of the verification framework as supported by EVT and discuss reasoning principles essential for successful proofs, such as inductive and compositional reasoning and an efficient treatment of side-effect-free code. The experiences of applying the tool in an industrial case study are summarised, and an approach for supporting verification in the presence of program libraries is outlined. EVT is essentially a classical proof assistant, or theorem-proving tool, requiring users to intervene in the proof process at crucial steps such as stating program invariants. However, the tool offers considerable support for automatic proof discovery through higher-level tactics tailored to the particular task of verifying ERLANG programs. In addition, a graphical interface permits easy navigation through proof tableaux, proof reuse, and meaningful feedback about the current proof state, to assist users in making informed proof decisions.
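The correctness requirements mentioned here are behavioural properties in a modal logic with recursion. As a purely generic illustration of what such a property looks like (this is a sketch of the usual greatest-fixed-point shape of a recursive safety property, not a formula from the EVT case study):

```latex
% "phi holds now and continues to hold after every transition the
% process can make" -- a greatest fixed point over the box modality.
\[
  \nu X.\; \varphi \;\wedge\; [\,\cdot\,]\, X
\]
```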

3.
Eliminating the substitution axiom from UNITY logic
The UNITY substitution axiom, which states that if (x=y) is an invariant of a program then x can be replaced by y in any property of the program, is problematic for several reasons. In this paper, dual predicate transformers sst and wst are introduced that allow the strongest invariant of a program to be expressed, and these are used to give new definitions for the temporal operators unless and ensures. With the new definitions, the substitution axiom is no longer needed and can be replaced by a derived rule of inference which is formally justified in the logic. One important advantage is that the effects of the initial conditions on the properties of a program are formally captured in a convenient way, and one can forget about substitution in formal treatments of the UNITY proof system while still having it available when it is desirable to use during the derivation of programs. Composability and completeness of the modified logic are also discussed.
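For readers unfamiliar with UNITY, the following sketch recalls how unless is classically defined over Hoare triples and how a strongest-invariant-based strengthening restricts attention to reachable states. The notation (SI for the strongest invariant, s ranging over the program's statements) and the exact formulation are our assumptions, not taken verbatim from the paper.

```latex
% Classical definition: once p holds, it persists until q holds,
% quantified over every statement s of the program.
\[
  p \;\mathbf{unless}\; q \;\equiv\; \forall s :\; \{\, p \wedge \neg q \,\}\; s\; \{\, p \vee q \,\}
\]
% Strengthened form restricted to reachable states, using the strongest
% invariant SI of the program (a sketch of the general idea):
\[
  p \;\mathbf{unless}\; q \;\equiv\; \forall s :\; \{\, \mathit{SI} \wedge p \wedge \neg q \,\}\; s\; \{\, p \vee q \,\}
\]
```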

4.
A compositional network proof theory to specify and verify properties of fault-tolerant real-time distributed systems with limited resources is presented. In this theory a conceptual scheduler grants the resource using on-line preemptive priority scheduling where the priority is a function of the initial priority and the time spent waiting for the resource. The method enables reasoning about responsive systems which must respond to external inputs in a timely, dependable, and predictable manner. It allows us to abstract from the precise nature and occurrence of faults and to focus on how they affect the externally visible input and output behaviour. To this end a failure hypothesis is formalized as a relation between the system's normal behaviour (i.e., the behaviour when no faults occur) and its acceptable behaviour, that is, the normal behaviour together with the exceptional behaviour (i.e., the behaviour whose abnormality should be tolerated). The proof theory is compositional to allow reasoning with the specifications of processes while ignoring their implementation details. Supported by the Dutch STW under grant number NWI88.1517: Fault Tolerance: Paradigms, Models, Logics, Construction.

5.
This paper provides a survey of possibilistic logic as a simple and efficient tool for handling nonmonotonic reasoning, with some emphasis on algorithmic issues. In our previous works, two well-known nonmonotonic systems have been encoded in the possibility theory framework: the preferential inference based on System P, and the rational closure inference proposed by Lehmann and Magidor, which relies on System P augmented with a rational monotony postulate. System P is known to provide reasonable but very cautious conclusions; in particular, preferential inference is blocked by the presence of irrelevant properties. When using Lehmann's rational closure, the inference machinery, which is then more productive, may still remain too cautious, or on the contrary provide counter-intuitive conclusions. The paper proposes an approach to overcome the cautiousness of System P and the problems encountered by the rational closure inference. This approach takes advantage of (contextual) independence assumptions of the form: the fact that c is true (or is false) does not affect the validity of the rule "normally, if a then b". The modelling of such independence assumptions is discussed in the possibilistic framework. Moreover, we show that when a counter-intuitive conclusion of a set of defaults can be inferred, it is always possible to repair the set of defaults by adding suitable information so as to produce the desired conclusions and block unsuitable ones.
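To make the cautiousness issue concrete, here is the standard penguin example rendered in a possibilistic style. The particular necessity weights are our own illustrative assumptions, chosen only so that more specific rules rank higher; they are not taken from the paper.

```latex
% Default base: birds normally fly, penguins are birds, penguins normally
% do not fly. A possibilistic-style encoding attaches necessity lower
% bounds to the material rules:
\[
  (b \rightarrow f,\; 0.6), \qquad (p \rightarrow b,\; 0.9), \qquad (p \rightarrow \neg f,\; 0.8)
\]
% System P alone, given only "birds normally fly", cannot conclude that
% red birds fly: the irrelevant property "red" blocks the inference.
```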

6.
7.
This paper contributes to the solution of several open problems with parallel programming tools and their integration with performance evaluation environments. First, we propose interactive compilation scenarios instead of the usual black-box-oriented use of compiler tools. In such scenarios, information gathered by the compiler and the compiler's reasoning are presented to the user in meaningful ways and on demand. Second, a tight integration of compilation and performance analysis tools is advocated. Many of the existing, advanced instruments for gathering performance results are being used in the presented environment, and their results are combined in integrated views with compiler information and data from other tools. Initial instruments that assist users in data mining this information are presented and the need for much stronger facilities is explained. The URSA Family provides two tools addressing these issues. URSA MINOR supports a group of users at a specific site, such as a research or development project. URSA MAJOR complements this tool by making the gathered results available to the user community at large via the World Wide Web. This paper presents objectives, functionality, experience, and next development steps of the URSA tool family. Two case studies are presented that illustrate the use of the tools for developing and studying parallel applications and for evaluating parallelizing compilers.

8.
The relationship between Lexical-Functional Grammar (LFG) functional structures (f-structures) for sentences and their semantic interpretations can be formalized in linear logic in a way that correctly explains the observed interactions between quantifier scope ambiguity, bound anaphora and intensionality. Our linear-logic formalization of the compositional properties of quantifying expressions in natural language obviates the need for special mechanisms, such as Cooper storage, in representing the scoping possibilities of quantifying expressions. Instead, the semantic contribution of a quantifier is recorded as a linear-logic formula whose use in a proof will establish the scope of the quantifier. Different proofs can lead to different scopes. In each complete proof, the properties of linear logic ensure that quantifiers are properly scoped. The interactions between quantified NPs and intensional verbs such as seek are also accounted for in this deductive setting. A single specification in linear logic of the argument requirements of intensional verbs is sufficient to derive the correct reading predictions for intensional-verb clauses both with nonquantified and with quantified direct objects. In particular, both de dicto and de re readings are derived for quantified objects. The effects of type-raising or quantifying-in rules in other frameworks just follow here as linear-logic theorems. While our approach resembles current categorial approaches in important ways (Moortgat, 1988, 1992a; Carpenter, 1993; Morrill, 1994), it differs from them in allowing the greater compositional flexibility of categorial semantics (van Benthem, 1991) while maintaining a precise connection to syntax. As a result, we are able to provide derivations for certain readings of sentences with intensional verbs and complex direct objects whose derivation in purely categorial accounts of the syntax-semantics interface appears to require otherwise unnecessary semantic decompositions of lexical entries.

9.
Agent-based technology has been identified as an important approach for developing next generation manufacturing systems. One of the key techniques needed for implementing such advanced systems will be learning. This paper first discusses learning issues in agent-based manufacturing systems and reviews related approaches, then describes how to enhance the performance of an agent-based manufacturing system through learning from history (based on distributed case-based learning and reasoning) and learning from the future (through system forecasting simulation). Learning from history is used to enhance coordination capabilities by minimizing communication and processing overheads. Learning from the future is used to adjust promissory schedules through forecasting simulation, by taking into account shop floor interactions, production and transportation time. Detailed learning and reasoning mechanisms are described and partial experimental results are presented.
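A minimal sketch of what "learning from history" via case-based retrieval can look like is given below. The feature encoding, similarity measure, and names are hypothetical illustrations, not the coordination mechanism described in the paper.

```python
from dataclasses import dataclass

@dataclass
class Case:
    """A remembered coordination episode: task features and the decision taken."""
    features: tuple        # e.g. (job_type, lot_size, due_slack) -- illustrative
    decision: str          # e.g. which machine agent handled the job

def similarity(a: tuple, b: tuple) -> float:
    """Illustrative similarity: fraction of matching features."""
    return sum(x == y for x, y in zip(a, b)) / max(len(a), 1)

def retrieve(case_base, query: tuple):
    """Return the most similar past case so its decision can be reused,
    avoiding a full negotiation round (and its communication overhead)."""
    return max(case_base, key=lambda c: similarity(c.features, query), default=None)

# Usage sketch
history = [Case(("milling", 10, "tight"), "machine_A"),
           Case(("drilling", 50, "loose"), "machine_B")]
best = retrieve(history, ("milling", 12, "tight"))
print(best.decision if best else "no precedent: negotiate")   # -> machine_A
```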

10.
In this paper we present a modal approach to contrastive logic, the logic of contrasts as these appear in natural language conjunctions such as but. We use a simple modal logic, which is an extension of the well-known S5 logic, and base the contrastive operators proposed by Francez in [2] on the basic modalities that appear in this logic. We thus obtain a logic for contrastive operators that is more in accord with the tradition of intensional logic, and that, moreover, we argue, has some more natural properties. Particular attention is paid to nesting contrastive operators. We show that nestings of but give quite natural results, and indicate how nestings of other contrastive operators can be done adequately. Finally, we discuss the example of the Hangman's Paradox and some similarities (and differences) with default reasoning. But but us no buts, as they say. Also partially supported by Nijmegen University, Toernooiveld, 6525 ED Nijmegen, The Netherlands.

11.
This article is the twenty-seventh of a series of articles discussing various open research problems in automated reasoning. The problem proposed for research asks one to find criteria that an automated reasoning program can apply to determine whether to attack a given question with reasoning by analogy. The imprecise term reasoning by analogy refers to a type of reasoning in which the type of proof being sought is sharply influenced by the style of proof that was successfully used to prove related theorems. This work was supported by the Office of Scientific Computing, U.S. Department of Energy, under Contract W-31-109-Eng-38.

12.
We show in this paper how procedures that update knowledge bases can naturally be adapted to a number of problems related to contextual reasoning. The fact that the update procedures are abductive in nature is favourably exploited to tackle problems related to human-computer dialogue systems. We consider as examples aspects of pronoun resolution, goal formulation, and the problem of restoring the consistency of a knowledge base after some knowledge update is carried out. We state these problems in terms of the update problem and abductive reasoning and show how procedures that update knowledge bases yield some interesting results. We also explain how these procedures can naturally be used to model various forms of hypothetical reasoning, such as hypothesizing inconsistencies and performing a look-ahead form of reasoning. We do not claim that the problems presented here are solved entirely within the update framework. However, we believe that the flexibility of the representation and of the problem-solving approach suggests that the problems could be solved by adding more details about each problem. What is most interesting in our understanding is that all the aforementioned problems are expressed and tackled within the same framework.

13.
A program analysis is compositional when the analysis result for a particular program fragment is obtained solely from the results for its immediate subfragments via some composition operator. This means the subfragments can be analyzed independently in any order. Many commonly used program analysis techniques (in particular, most abstract interpretations and most uses of the Hindley/Milner type system) are not compositional and require the entire text of a program for sound and complete analysis. System is a recent type system for the pure λ-calculus with intersection types and the new technology of expansion variables. System supports compositional analysis because it has the principal typings property, and an algorithm based on the new technology of β-unification has been developed that finds these principal typings. In addition, for each natural number k, typability in the rank-k restriction of System is decidable, so a complete and terminating analysis algorithm exists for the rank-k restriction. This paper presents new understanding that has been gained from working with multiple implementations of System and β-unification-based analysis algorithms. The previous literature on System presented the type system in a way that helped in proving its more important theoretical properties, but was not as easy to follow as it could be. This paper provides a presentation of many aspects of System that should be clearer, as well as a discussion of important implementation issues.
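To make the role of intersection types concrete, the following is the textbook example of a term that Hindley/Milner rejects but an intersection type system accepts; it is standard folklore rather than an example drawn from the paper.

```latex
% Self-application is untypable in Hindley/Milner, but an intersection type
% lets the bound variable be used at two different types:
\[
  \lambda x.\; x\, x \;:\; \bigl((\alpha \rightarrow \beta) \wedge \alpha\bigr) \rightarrow \beta
\]
```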

14.
The close association between higher order functions (HOFs) and algorithmic skeletons is a promising source of automatic parallelisation of programs. A theorem proving approach to discovering HOFs in functional programs is presented. Our starting point is proof planning, an automated theorem proving technique in which high-level proof plans are used to guide proof search. We use proof planning to identify provably correct transformation rules that introduce HOFs. The approach has been implemented in the Clam proof planner and tested on a range of examples. The work was conducted within the context of a parallelising compiler for Standard ML. Received February 2001; revised March 2004 and November 2004; accepted November 2004 by D. J. Cooke.
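The kind of transformation such HOF-introducing rules aim at can be pictured with a small sketch (in Python rather than Standard ML, with names of our own choosing): explicit recursion is rewritten as a fold, after which an associative operator can be mapped onto a parallel reduction skeleton.

```python
from functools import reduce

# Explicitly recursive sum: the recursion pattern is hidden in the definition.
def total(xs):
    if not xs:
        return 0
    return xs[0] + total(xs[1:])

# The same computation expressed via a fold (a HOF): once the program is in
# this form and the operator is associative, a compiler can map it onto a
# parallel reduction skeleton.
def total_fold(xs):
    return reduce(lambda acc, x: acc + x, xs, 0)

assert total([1, 2, 3, 4]) == total_fold([1, 2, 3, 4]) == 10
```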

15.
For over a decade, researchers in formal methods have tried to create formalisms that permit natural specification of systems and allow mathematical reasoning about their correctness. The availability of fully automated reasoning tools enables non-experts to use formal methods effectively: their responsibility reduces to specifying the model and expressing the desired properties. Thus, it is essential that these properties be represented in a language that is easy to use, sufficiently expressive and succinct. Linear-time temporal logic (LTL) is a formalism that has been used extensively by researchers for program specification and verification. One of the desired properties of LTL formulas is closure under stuttering. That is, we do not want the interpretation of formulas to change over traces where some states are repeated. This property is important from both practical and theoretical perspectives; all properties which are closed under stuttering can be expressed in LTL-X, a fragment of LTL without the next operator. However, it is often difficult to express properties in this fragment of LTL. Further, determining whether a given LTL property is closed under stuttering is PSPACE-complete. In this paper, we introduce a notion of edges of LTL formulas and present a formal theory of closure under stuttering. Edges allow natural modelling of systems with events. Our theory enables syntactic reasoning about whether the resulting properties are closed under stuttering. Finally, we apply the theory to the pattern-based approach of specifying temporal formulas.
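For reference, closure under stuttering can be stated as follows; the formulation is the standard one and the example formulas are our own, not the paper's edge-based characterisation.

```latex
% Two traces are stuttering-equivalent (written ~st) when each can be obtained
% from the other by repeating, or collapsing repetitions of, individual states.
% A property P is closed under stuttering when
\[
  \sigma \sim_{st} \sigma' \;\Longrightarrow\; \bigl(\sigma \models P \iff \sigma' \models P\bigr).
\]
% For example, G p and F p are closed under stuttering, whereas X p in general
% is not; the stutter-closed properties are exactly those expressible in LTL-X,
% the fragment without the next operator.
```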

16.
Vector and matrix clocks are extensively used in asynchronous distributed systems. This paper asks, how does the clock abstraction generalize? To address this problem, the paper motivates and proposes logical clocks of arbitrary dimensions. It then identifies and explores the conceptual link between such clocks and knowledge. It establishes the necessary and sufficient conditions on the size and dimension of clocks required to attain any specified level of knowledge about the timestamp of the most recent system state for which this is possible without using any messages in the clock protocol. The paper then gives algorithms to determine the timestamp of the latest system state about which a specified level of knowledge is attainable in a given system state, and to compute the timestamp of the earliest system state in which a specified level of knowledge about a given system state is attainable. The results are applicable to applications that deal with a certain class of properties, identified as monotonic properties. Received: 15 March 2002; accepted: 15 January 2004; published online: 9 June 2004. This material is based on work supported by the National Science Foundation under Grant No. 9875617. A conference version appeared in [14] and the full technical report is in [13].
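As background for the generalisation discussed above, here is a minimal sketch of the classical (one-dimensional) vector clock and its usual update rules; it is illustrative only and does not implement the paper's higher-dimensional clocks or knowledge operators.

```python
class VectorClock:
    """Classical vector clock for n processes (background for the paper's
    generalisation to matrix and higher-dimensional clocks)."""

    def __init__(self, n: int, pid: int):
        self.clock = [0] * n   # one counter per process
        self.pid = pid         # index of the owning process

    def local_event(self):
        """Rule 1: increment own component on every local event."""
        self.clock[self.pid] += 1

    def send(self) -> list:
        """Timestamp attached to an outgoing message."""
        self.local_event()
        return list(self.clock)

    def receive(self, ts: list):
        """Rule 2: component-wise maximum with the received timestamp,
        then count the receive as a local event."""
        self.clock = [max(a, b) for a, b in zip(self.clock, ts)]
        self.local_event()

def happened_before(u: list, v: list) -> bool:
    """u -> v iff u <= v component-wise and u != v."""
    return all(a <= b for a, b in zip(u, v)) and u != v
```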

17.
The stable revivals model provides a new semantic framework for the process algebra Csp. The model has recently been added to the realm of established Csp models. Within the Csp context, it enhances the analysis of systems with regards to properties such as responsiveness and stuckness. These properties are essential in component based system design. In this paper we report on the implementation of different variants of the model within Csp-Prover. Based on Isabelle/HOL, Csp-Prover is an interactive proof tool for Csp refinement, which is generic in the underlying Csp model. On the practical side, our encoding of the model provides semi-automatic proof support for reasoning on responsiveness and stuckness. On the theoretical side, our implementation also yields a machine verification of the model's soundness as well as of its expected properties.

18.
The paper sets out twenty proposals for the development and evaluation of Computer Assisted Language Learning (CALL) programs. These proposals emerge from special characteristics of language instruction and of the use of computers to assist in language instruction. We combine theoretically based assumptions with empirical findings drawn from investigation of language courseware for Hebrew speakers in Israel. We first list four unique features of language instruction: (1) the object-language-meta-language distinction; (2) computer as written medium vs. language as primary spoken medium; (3) teaching of second language skills vs. linguistics; (4) the computer as an electronic tool vs. the computer as a cognitive entity simulating the speaker. We then show how these unique characteristics of language instruction (mother-tongue and foreign language) give rise to special proposals for language courseware. These proposals should be observed in the development of language courseware and in the evaluation of such programs. Clearly, these proposals integrate with general courseware proposals. Michal Ephratt (Ph.D., computational linguistics) completed post-doctoral studies at the University of Rochester. She has been on the staff of the Dept. of Hebrew Linguistics, University of Haifa, since 1988. Some of her publications include Root-Pattern Array: The Main Tool of Hebrew Word Formation (Hebrew University, 1985); and What's in a Joke? in Advances in Artificial Intelligence, Natural Language and Knowledge-based Systems (Springer-Verlag, 1990). This paper is based on work the author did as a consultant in the National Courseware Evaluation Department of the Ministry of Education. I wish to thank Esther Diamant, head of the department, for making the study possible.

19.
Separation of concerns is a well-established principle in software engineering that supports reuse by hiding complexity through abstraction mechanisms. The Abstract Design Views model was created with reuse in mind and allows the designer to apply separation of concerns in a software system from the design to the implementation. In this model, viewed objects represent the basic concern, i.e., the algorithms that provide the essential functionality relevant to an application domain, and viewer objects represent the special concerns related to other software issues, such as user interface presentation, synchronization, and timing. In this paper we use a reuse taxonomy to analyze and validate this model. Using this analysis and the properties of the relationship between viewer and viewed objects, called views, we also indicate how to map the views-based designs into implementations based on design patterns that satisfy the views properties. Finally, we show how to apply the principles of our approach, using views and the design patterns, to design e-commerce applications.
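One plausible pattern-based realisation of the viewer/viewed split is the Observer pattern; the sketch below is our own illustration of that mapping (class names and methods are assumptions), not code taken from the Abstract Design Views model.

```python
class ShoppingCart:
    """Viewed object: carries only the essential, domain-level functionality."""

    def __init__(self):
        self._items = []
        self._viewers = []

    def attach(self, viewer):
        """Register a viewer (a separate concern such as UI presentation)."""
        self._viewers.append(viewer)

    def add_item(self, name: str, price: float):
        self._items.append((name, price))
        for viewer in self._viewers:   # notify the special concerns
            viewer.update(self)

    def item_count(self) -> int:
        return len(self._items)

    def total(self) -> float:
        return sum(price for _, price in self._items)


class CartDisplay:
    """Viewer object: keeps the presentation concern out of the viewed object."""

    def update(self, cart: ShoppingCart):
        print(f"Cart holds {cart.item_count()} item(s), total {cart.total():.2f}")


cart = ShoppingCart()
cart.attach(CartDisplay())
cart.add_item("book", 12.50)   # prints: Cart holds 1 item(s), total 12.50
```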

20.
Abstract argumentation
In this paper we explore the thesis that the role of argumentation in practical reasoning in general, and legal reasoning in particular, is to justify the use of defeasible rules to derive a conclusion in preference to the use of other defeasible rules to derive a conflicting conclusion. The defeasibility of rules is expressed by means of non-provability claims as additional conditions of the rules. We outline an abstract approach to defeasible reasoning and argumentation which includes many existing formalisms, including default logic, extended logic programming, non-monotonic modal logic and auto-epistemic logic, as special cases. We show, in particular, that the admissibility semantics for all these formalisms has a natural argumentation-theoretic interpretation and proof procedure, which seem to correspond well with informal argumentation. In the admissibility semantics there is only one way for one argument to attack another, namely by undermining one of its non-provability claims. In this paper, we show how other kinds of attack between arguments, specifically rebuttal and priority attacks, can be reduced to the undermining of non-provability claims.
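The admissibility semantics can be computed directly over a Dung-style abstract framework; the brute-force checker below is our own simplification (in the paper, attacks arise from undermining non-provability claims), and the example attack relation is invented for illustration.

```python
from itertools import combinations

def conflict_free(S, attacks):
    """No member of S attacks another member of S."""
    return not any((a, b) in attacks for a in S for b in S)

def defends(S, x, attacks, args):
    """Every attacker of x is itself attacked by some member of S."""
    return all(any((d, a) in attacks for d in S)
               for a in args if (a, x) in attacks)

def admissible_sets(args, attacks):
    """Enumerate all admissible sets: conflict-free and self-defending."""
    result = []
    for k in range(len(args) + 1):
        for S in map(set, combinations(args, k)):
            if conflict_free(S, attacks) and all(defends(S, x, attacks, args) for x in S):
                result.append(S)
    return result

# Example framework: a attacks b, b attacks a, b attacks c.
args = {"a", "b", "c"}
attacks = {("a", "b"), ("b", "a"), ("b", "c")}
print(admissible_sets(args, attacks))
# Admissible sets here: set(), {'a'}, {'b'}, {'a', 'c'}.
```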
