Similar Documents
20 similar documents found (search time: 31 ms).
1.
We present a framework for intensional reasoning in typed λ-calculus. In this family of calculi, called Modal Pure Type Systems (MPTSs), a propositions-as-types interpretation can be given for normal modal logics. MPTSs are an extension of the Pure Type Systems (PTSs) of Barendregt (1992). We show that they retain the desirable meta-theoretical properties of PTSs, and briefly discuss applications in the area of knowledge representation.
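For orientation, a Pure Type System in Barendregt's sense is specified by a triple (S, A, R) of sorts, axioms, and rules; the modal machinery of MPTSs is not reproduced here, but the standard PTS product rule, which such systems build on, reads as follows (a minimal reminder in the usual notation):

\[
\frac{\Gamma \vdash A : s_1 \qquad \Gamma,\, x{:}A \vdash B : s_2}{\Gamma \vdash (\Pi x{:}A.\, B) : s_3} \quad (s_1, s_2, s_3) \in R
\]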

2.
The theory of Timed Transition Systems developed by Henzinger, Manna, and Pnueli provides a formal framework for specifying and reasoning about real-time systems. In this paper, we report on some preliminary investigations into the mechanization of this theory using the HOL theorem prover. We review the main ideas of the theory and describe how it has been formally embedded in HOL. A graphical notation of timed transition diagrams and a real-time temporal logic for requirements have also been embedded in HOL using the embedding of timed transition systems. The proof rules proposed by Henzinger et al. have been verified formally, and we illustrate their use, as well as some problems we have encountered, by reference to a small example. More work is required on interfaces and proof methods to have a generally usable system.

3.
After Scott, mathematical models of the type-free lambda calculus are constructed by order-theoretic methods and classified into semantics according to the nature of their representable functions. Selinger [48] asked whether there is a lambda theory that is not induced by any non-trivially partially ordered model (the order-incompleteness problem). In terms of Alexandroff topology (the strongest topology whose specialization order is the order of the considered model), the problem of order-incompleteness can also be characterized as follows: a lambda theory T is order-incomplete if, and only if, every partially ordered model of T is partitioned by the Alexandroff topology into an infinite number of connected components (= minimal upper and lower sets), each one containing exactly one element of the model. Towards an answer to the order-incompleteness problem, we give a topological proof of the following result: there exists a lambda theory whose partially ordered models are partitioned by the Alexandroff topology into an infinite number of connected components, each one containing at most one λ-term denotation. This result implies the incompleteness of every semantics of lambda calculus given in terms of partially ordered models whose Alexandroff topology has a finite number of connected components (e.g. the Alexandroff topology of the models of the continuous, stable, and strongly stable semantics is connected).
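As a reminder of the notions used above (standard definitions, not specific to this paper): in the Alexandroff topology of a partially ordered set (D, ≤), the open sets are exactly the upper sets, and the connected component of a point is the least set containing it that is closed both upwards and downwards:

\[
U \subseteq D \text{ open} \iff (x \in U \wedge x \le y \Rightarrow y \in U),
\qquad
\mathrm{comp}(x) = \text{the least } C \ni x \text{ that is both an upper and a lower set,}
\]

i.e. comp(x) collects the points reachable from x by finite zig-zags of ≤ and ≥ steps.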

4.
The accumulation calculus (AC for short) is an interval-based temporal logic for specifying and reasoning about hybrid real-time systems. This paper presents a formal proof system for AC and proves that the system is complete relative to that of Interval Temporal Logic (ITL for short) on the real domain.

5.
This paper reports on the first steps towards the formal verification of correctness proofs of real-life protocols in process algebra. We show that such proofs can be verified, and partly constructed, by a general-purpose proof checker. The process algebra we use is μCRL, ACP augmented with data, which is expressive enough for the specification of real-life protocols. The proof checker we use is Coq, which is based on the Calculus of Constructions, an extension of simply typed lambda calculus. The focus is on the translation of the proof theory of μCRL and μCRL specifications to Coq. As a case study, we verified the Alternating Bit Protocol.

6.
The calculus of constructions of Coquand, which is a version of higher-order typed λ-calculus based on the dependent function type, is considered from the perspective of its use as the mathematical foundation for a proof development system. The paper considers formulations of the calculus, the underlying consistency of the formalism (i.e., the strong normalisation theorem), and the proof theory of adding assumptions for notions from logic and set theory. Proofs are not given, but references to them are. A preliminary version of this paper was presented at the Third Banff Higher Order Workshop, 23–28 September 1989.

7.
Intrusion Detection Systems (IDS) are automated cyber-security monitoring systems that sense malicious activities. Unfortunately, an IDS often generates both a considerable number of alerts and many false positives in its logs. Information visualization allows users to discover and analyze large amounts of information efficiently through visual exploration and interaction. Even with the aid of visualization, identifying attack patterns and recognizing false positives among a great number of alerts remain challenging. In this paper, a novel visualization framework, IDSRadar, is proposed for IDS alerts; it monitors the network and conveys an overall view of the security situation using a radial graph in real time. IDSRadar utilizes five categories of entropy functions to quantitatively analyze irregular behavioral patterns, and combines interaction, filtering, and drill-down to detect potential intrusions. Finally, IDSRadar is used to analyze the mini-challenges of the VAST Challenge 2011 and 2012.
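The abstract does not spell out the five entropy functions, but the underlying measure in such alert-analysis tools is ordinarily Shannon entropy over the distribution of some alert attribute (source IP, alert type, and so on). The following is a minimal Haskell sketch of that idea only; the attribute and the example data are hypothetical, not taken from IDSRadar.

import qualified Data.Map.Strict as M

-- Shannon entropy (in bits) of a list of observed category labels,
-- e.g. the source-IP column of an IDS log for one time window.
shannonEntropy :: Ord a => [a] -> Double
shannonEntropy xs
  | null xs   = 0
  | otherwise = negate (sum [ p * logBase 2 p | p <- probs ])
  where
    counts = M.elems (M.fromListWith (+) [ (x, 1 :: Int) | x <- xs ])
    n      = fromIntegral (length xs)
    probs  = [ fromIntegral c / n | c <- counts ]

main :: IO ()
main = do
  -- hypothetical alert sources in one window; low entropy hints at
  -- one machine dominating the alert stream
  let srcIPs = ["10.0.0.5", "10.0.0.5", "10.0.0.7", "10.0.0.9", "10.0.0.5"]
  print (shannonEntropy srcIPs)

A sudden drop or spike of such an entropy value over consecutive windows is the kind of irregular behavioral pattern a radial monitoring view can highlight.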

8.
The aim of this study is to look at the syntactic calculus of Bar-Hillel and Lambek, including semantic interpretation, from the point of view of constructive type theory. The syntactic calculus is given a formalization that makes it possible to implement it in a type-theoretical proof editor. Such an implementation combines formal syntax and formal semantics, and makes the type-theoretical tools of automatic and interactive reasoning available in grammar. In the formalization, the use of the dependent types of constructive type theory is essential. Dependent types are already needed in the semantics of the ordinary Lambek calculus. But they also suggest some natural extensions of the calculus, which are applied to the treatment of morphosyntactic dependencies and to an analysis of selectional restrictions. Finally, directed dependent function types are introduced, corresponding to the Π-types of constructive type theory. Two alternative formalizations are given: one using syntax trees, like Montague grammar, and one dispensing with them, like the theory called minimalistic by Morrill. The syntax-tree approach is presented as the main alternative, because it makes it possible to embed the calculus in a more extensive Montague-style grammar.
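For readers unfamiliar with the syntactic calculus, its two directed application rules are the standard ones below (background notation only, independent of the paper's type-theoretical formalization):

\[
\frac{A/B \qquad B}{A} \qquad\qquad \frac{B \qquad B\backslash A}{A}
\]

so that, for example, an intransitive verb of type NP\S combines with an NP on its left to yield a sentence S.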

9.
Craig interpolation has become a versatile tool in formal verification, used for instance to generate program assertions that serve as candidates for loop invariants. In this paper, we consider Craig interpolation for quantifier-free Presburger arithmetic (QFPA). Until recently, quantifier elimination was the only available interpolation method for this theory, which is, however, known to be potentially costly and inflexible. We introduce an interpolation approach based on a sequent calculus for QFPA that determines interpolants by annotating the steps of an unsatisfiability proof with partial interpolants. We prove our calculus to be sound and complete. We have extended the Princess theorem prover to generate interpolating proofs, and applied it to a large number of publicly available Presburger arithmetic benchmarks. The results document the robustness and efficiency of our interpolation procedure. Finally, we compare the procedure against alternative interpolation methods, both for QFPA and linear rational arithmetic.
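A standard textbook-style illustration (not one of the paper's benchmarks) of why QFPA interpolation is delicate: take A ≡ x = 2y and B ≡ x = 2z + 1. Then A ∧ B is unsatisfiable, and a Craig interpolant is

\[
I \;\equiv\; 2 \mid x \qquad (\text{equivalently } x \equiv 0 \pmod 2),
\]

since A implies I, I ∧ B is unsatisfiable, and I mentions only the shared variable x. Any interpolant here must hold of all even values of x and of no odd ones, so it needs a divisibility atom; no quantifier-free Presburger formula over x without divisibility can express that set.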

10.
Type expressions may be used to describe the functional behavior of untyped lambda terms. We present a general semantics of polymorphic type expressions over models of untyped lambda calculus and give complete rules for inferring types for terms. Some simplified typing theories are studied in more detail, and containments between types are investigated.

11.
LJQ is a focused sequent calculus for intuitionistic logic, with a simple restriction on the first premiss of the usual left introduction rule for implication. In a previous paper we discussed its history (going back to about 1950, or beyond) and presented its basic theory and some applications; here we discuss in detail its relation to call-by-value reduction in lambda calculus, establishing a connection between LJQ and the CBV calculus λC of Moggi. In particular, we present an equational correspondence between these two calculi forming a bijection between the two sets of normal terms, and allowing reductions in each to be simulated by reductions in the other.

12.
A lambda theory satisfies an equation between contexts, where a context is a λ-term with some “holes” in it, if all the instances of the equation fall within the lambda theory. In the main result of this paper it is shown that the equations (between contexts) valid in every lambda theory have an explicit finite equational axiomatization. The variety of algebras determined by the above equational theory is characterized as the class of isomorphic images of functional lambda abstraction algebras. These are algebras of functions and naturally arise as the “coordinatizations” of environment models or lambda models, the natural combinatory models of the lambda calculus. The main result of this paper is also applied to obtain a completeness theorem for the infinitary lambda calculus recently introduced by Berarducci.

13.
We show that linear-time self-interpretation of the pure untyped lambda calculus is possible, in the sense that interpretation has a constant overhead compared to direct execution under various execution models. The present paper shows this result for reduction to weak head normal form under call-by-name, call-by-value, and call-by-need. We use a self-interpreter based on previous work on self-interpretation and partial evaluation of the pure untyped lambda calculus. We use operational semantics to define each reduction strategy. For each of these we show a simulation lemma stating that each inference step in the evaluation of a term by the operational semantics is simulated by a sequence of steps in the evaluation of the self-interpreter applied to the term (using the same operational semantics). By assigning costs to the inference rules in the operational semantics, we can compare the cost of normal evaluation and self-interpretation. Three different cost measures are used: number of beta-reductions, cost of a substitution-based implementation (similar to graph reduction), and cost of an environment-based implementation. For call-by-need we use a non-deterministic semantics, which simplifies the proof considerably.
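To make the "number of beta-reductions" cost measure concrete, here is a minimal Haskell sketch of a substitution-based call-by-name reduction to weak head normal form that counts beta steps (de Bruijn indices; an illustration of the cost measure only, not the paper's self-interpreter):

-- Untyped lambda terms with de Bruijn indices.
data Term = Var Int | Lam Term | App Term Term deriving Show

-- shift d c t: add d to every index >= c (free-variable adjustment).
shift :: Int -> Int -> Term -> Term
shift d c (Var k)   = Var (if k >= c then k + d else k)
shift d c (Lam t)   = Lam (shift d (c + 1) t)
shift d c (App f a) = App (shift d c f) (shift d c a)

-- subst j s t: substitute s for index j in t, removing the binder.
subst :: Int -> Term -> Term -> Term
subst j s (Var k)
  | k == j    = s
  | k > j     = Var (k - 1)
  | otherwise = Var k
subst j s (Lam t)   = Lam (subst (j + 1) (shift 1 0 s) t)
subst j s (App f a) = App (subst j s f) (subst j s a)

-- Call-by-name reduction to weak head normal form,
-- returning the result together with the number of beta steps taken.
whnf :: Term -> (Term, Int)
whnf (App f a) =
  case whnf f of
    (Lam b, n) -> let (r, m) = whnf (subst 0 a b) in (r, n + m + 1)
    (f', n)    -> (App f' a, n)
whnf t = (t, 0)

main :: IO ()
main = print (whnf (App (Lam (Var 0)) (Lam (Var 0))))  -- (λx.x) (λx.x): one beta step

Running the same counter over a term and over a self-interpreter applied to (an encoding of) that term is the kind of comparison the constant-overhead claim is about.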

14.
We present in this paper an application of the ACL2 system to generate and reason about propositional satisfiability provers. For that purpose, we develop a framework in which we define a generic SAT prover based on transformation rules, and we formalize this generic framework in the ACL2 logic, carrying out a formal proof of its termination, soundness, and completeness. This generic framework can be instantiated to obtain a number of verified and executable SAT provers in ACL2, and this instantiation can be done in an automated way. Three instantiations of the generic framework are considered: semantic tableaux, sequent calculus, and Davis-Putnam-Logemann-Loveland methods.
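As a point of reference for the third instantiation, a bare-bones DPLL procedure looks like the following Haskell sketch (unit propagation plus naive splitting; an illustration of the method only, not the paper's ACL2 formalization or its transformation-rule framework):

-- Literals are nonzero Ints (DIMACS convention); a clause is a list of
-- literals, a formula a list of clauses.
type Literal = Int
type Clause  = [Literal]
type CNF     = [Clause]

-- Assign a literal true: drop satisfied clauses, delete the negated literal.
assign :: Literal -> CNF -> CNF
assign l cnf = [ filter (/= negate l) c | c <- cnf, l `notElem` c ]

-- Repeatedly propagate unit clauses.
unitProp :: CNF -> CNF
unitProp cnf = case [ l | [l] <- cnf ] of
  []      -> cnf
  (l : _) -> unitProp (assign l cnf)

-- DPLL: propagate, then split on the first literal of the first clause.
dpll :: CNF -> Bool
dpll cnf0
  | null cnf     = True       -- no clauses left: satisfiable
  | any null cnf = False      -- empty clause derived: conflict
  | otherwise    = dpll (assign l cnf) || dpll (assign (negate l) cnf)
  where
    cnf = unitProp cnf0
    l   = head (head cnf)

main :: IO ()
main = do
  print (dpll [[1, 2], [-1, 2], [-2]])   -- False: (x ∨ y) ∧ (¬x ∨ y) ∧ ¬y
  print (dpll [[1, -2], [2, 3]])         -- True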

15.
Coloring terms (rippling) is a technique developed for inductive theorem proving that uses syntactic differences of terms to guide the proof search. Annotations (colors) of symbol occurrences in terms are used to maintain this information. This technique has several advantages; for example, it is highly goal oriented and involves little search. In this paper we give a general formalization of coloring terms in a higher-order setting. We introduce a simply typed λ-calculus with color annotations and present appropriate algorithms for the general, pre-, and pattern unification problems. Our work is a formal basis for the implementation of rippling in a higher-order setting, which is required, for example, in the case of middle-out reasoning. Another application is in the construction of natural language semantics, where the color annotations rule out linguistically invalid readings that are possible using standard higher-order unification.

16.
The region calculus of Tofte and Talpin is a polymorphically typed lambda calculus with annotations that make memory allocation and deallocation explicit. It is intended as an intermediate language for implementing Hindley-Milner typed functional languages such as ML without traditional trace-based garbage collection. Static region and effect inference can be used to annotate a statically typed ML program with memory management primitives. Soundness of the calculus with respect to the region and effect system is crucial to guarantee safe deallocation of regions, i.e., deallocation should only take place for objects which are provably dead. The original soundness proof by Tofte and Talpin requires a complex co-inductive safety relation. In this paper, we present two small-step operational semantics for the region calculus and prove their type soundness with respect to the region and effect system. Following the standard syntactic approach of Wright, Felleisen, and Harper, we obtain simple inductive proofs. The first semantics is store-less. It is simple and elegant and gives rise to perspicuous proofs. The second semantics provides a store-based model for the region calculus. Albeit slightly more complicated, its additional expressiveness allows us to model operations on references with destructive update. A pure fragment of both small-step semantics is then proven equivalent to the original big-step operational approach of Tofte and Talpin. This leads to an alternative soundness proof for their evaluation-style formulation.
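As a schematic reminder of the flavor of the calculus (written in the usual Tofte–Talpin-style surface syntax, which may differ in detail from this paper's formulation): in

\[
\mathbf{letregion}\ \rho\ \mathbf{in}\ \big(\mathbf{let}\ f = (\lambda x.\ x + 1)\ \mathbf{at}\ \rho\ \mathbf{in}\ f\ 5\big)
\]

the closure for f is allocated in the region bound to ρ, and that region is deallocated as soon as the letregion body has been evaluated; soundness of the region and effect system is what guarantees that f is indeed dead at that point.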

17.
Applications of the Event Calculus in Reasoning about Actions
The event calculus is a theory of reasoning about actions based on first-order predicate calculus. It can serve as a tool for describing events, and in applications to reasoning about actions it has shown considerable expressive and implementation power. In the event calculus, actions can be axiomatized; the temporal properties, concurrency, continuous change, and knowledge associated with actions can be described; and the result can be implemented in Prolog. This paper introduces and discusses the basic concepts, ideas, and methods related to these applications, and uses a coffee-delivery example to show how they can be described and implemented with the event calculus.
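For orientation, the core axioms of the simplified event calculus (in Shanahan's standard formulation; the paper's own axiomatization may differ in details) state that a fluent holds if it was initiated by an earlier event and not clipped in between:

\[
\mathit{HoldsAt}(f, t) \leftarrow \mathit{Happens}(e, t_1) \wedge \mathit{Initiates}(e, f, t_1) \wedge t_1 < t \wedge \neg \mathit{Clipped}(t_1, f, t)
\]
\[
\mathit{Clipped}(t_1, f, t_2) \leftrightarrow \exists e, t\, [\, \mathit{Happens}(e, t) \wedge t_1 < t \wedge t < t_2 \wedge \mathit{Terminates}(e, f, t)\,]
\]

Axioms of this Horn-like shape are what make a direct Prolog implementation natural.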

18.
A new proof of the analogue of Böhm's Theorem in the typed lambda calculus with functional types is given.

19.
This paper presents a mechanisation of psi-calculi, a parametric framework for modelling various dialects of process calculi including (but not limited to) the pi-calculus, the applied pi-calculus, and the spi calculus. psi-calculi are significantly more expressive, yet their semantics is as simple in structure as the semantics of the original pi-calculus. Proofs of meta-theoretic properties for psi-calculi are more involved, however, not least because psi-calculi (unlike simpler calculi) utilise binders that bind multiple names at once. The mechanisation is carried out in the Nominal Isabelle framework, an interactive proof assistant designed to facilitate formal reasoning about calculi with binders. Our main contributions are twofold. First, we have developed techniques that allow efficient reasoning about calculi that bind multiple names in Nominal Isabelle. Second, we have adopted these techniques to mechanise substantial results from the meta-theory of psi-calculi, including congruence properties of bisimilarity and the laws of structural congruence. To our knowledge, this is the most extensive formalisation of process calculi mechanised in a proof assistant to date.

20.
A general reducibility method is developed for proving reduction properties of lambda terms typeable in intersection type systems with and without the universal type Ω. Sufficient conditions for its application are derived. This method leads to uniform proofs of confluence, standardization, and weak head normalization of terms typeable in the system with the type Ω. The method extends Tait's reducibility method for the proof of strong normalization of the simply typed lambda calculus, Krivine's extension of the same method for the strong normalization of the intersection type system without Ω, and Statman-Mitchell's logical relation method for the proof of confluence of βη-reduction on simply typed lambda terms. As a consequence, the confluence and standardization of all (untyped) lambda terms are obtained.
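For reference, Tait's original method, which the above generalizes, assigns to every simple type a set of "reducible" terms by induction on the type (standard definitions, not the paper's intersection-type version):

\[
\mathrm{Red}_{o} = \{\, t \mid t \text{ is strongly normalizing} \,\} \quad (o \text{ a base type}),
\qquad
\mathrm{Red}_{A \to B} = \{\, t \mid \forall s \in \mathrm{Red}_A.\ t\,s \in \mathrm{Red}_B \,\};
\]

one then proves that every reducible term is strongly normalizing and that every well-typed term is reducible, from which strong normalization follows.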
