Similar documents (20 results)
1.
A focused proof system provides a normal form to cut-free proofs in which the application of invertible and non-invertible inference rules is structured. Within linear logic, the focused proof system of Andreoli provides an elegant and comprehensive normal form for cut-free proofs. Within intuitionistic and classical logics, there are various different proof systems in the literature that exhibit focusing behavior. These focused proof systems have been applied to both the proof search and the proof normalization approaches to computation. We present a new, focused proof system for intuitionistic logic, called LJF, and show how other intuitionistic proof systems can be mapped into the new system by inserting logical connectives that prematurely stop focusing. We also use LJF to design a focused proof system LKF for classical logic. Our approach to the design and analysis of these systems is based on the completeness of focusing in linear logic and on the notion of polarity that appears in Girard’s LC and LU proof systems.
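For readers unfamiliar with focusing, the distinction the abstract builds on can be illustrated with two standard intuitionistic right rules (a generic illustration, not the actual LJF or LKF rules): the invertible rule can be applied eagerly without losing provability, whereas the non-invertible rule commits to a choice and is the kind of rule a focused system groups under a focus phase.

```latex
\frac{\Gamma \vdash A \qquad \Gamma \vdash B}{\Gamma \vdash A \wedge B}\;(\wedge R,\ \text{invertible})
\qquad\qquad
\frac{\Gamma \vdash A_i}{\Gamma \vdash A_1 \vee A_2}\;(\vee R_i,\ \text{non-invertible},\ i \in \{1,2\})
```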

2.
Relative least general generalization, proposed by Plotkin, is widely used for generalizing first-order clauses in Inductive Logic Programming, and this paper describes an extension of Plotkin’s work to allow various computation domains: Herbrand Universe, sets, numerical data, etc. The θ-subsumption in Plotkin’s framework is replaced by a more general constraint-based subsumption. Since this replacement is analogous to that of unification by constraint solving in Constraint Logic Programming, the resultant method can be viewed as a Constraint Logic Programming version of relative least general generalization. Constraint-based subsumption, however, leads to a search on an intractably large hypothesis space. We therefore provide meta-level constraints that are used as semantic bias on the hypothesis language. The constraints functional dependency and monotonicity are introduced by analyzing clausal relationships. Finally, the advantage of the proposed method is demonstrated through a simple layout problem, where geometric constraints used in space planning tasks are produced automatically.
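For context, here is a minimal sketch of ordinary θ-subsumption over function-free clauses, the operation that the paper generalizes to constraint-based subsumption (illustrative Python; the clause and literal encodings are assumptions, not the paper's notation):

```python
# Literals are tuples (predicate, arg1, arg2, ...); variables are strings
# starting with an upper-case letter, constants are lower-case strings.
# Clause C theta-subsumes clause D iff some substitution theta maps every
# literal of C onto a literal of D.

def is_var(term):
    return isinstance(term, str) and term[:1].isupper()

def match_term(pattern, target, theta):
    """One-way matching: only variables of `pattern` may be bound."""
    if is_var(pattern):
        if pattern in theta:
            return theta if theta[pattern] == target else None
        return {**theta, pattern: target}
    return theta if pattern == target else None

def match_literal(lit_c, lit_d, theta):
    if lit_c[0] != lit_d[0] or len(lit_c) != len(lit_d):
        return None
    for arg_c, arg_d in zip(lit_c[1:], lit_d[1:]):
        theta = match_term(arg_c, arg_d, theta)
        if theta is None:
            return None
    return theta

def subsumes(clause_c, clause_d, theta=None):
    """Backtracking search for a subsuming substitution."""
    theta = {} if theta is None else theta
    if not clause_c:
        return True
    first, rest = clause_c[0], clause_c[1:]
    for lit_d in clause_d:
        extended = match_literal(first, lit_d, theta)
        if extended is not None and subsumes(rest, clause_d, extended):
            return True
    return False

# C = parent(X, Y) subsumes D = {parent(ann, bob), female(ann)}
print(subsumes([('parent', 'X', 'Y')],
               [('parent', 'ann', 'bob'), ('female', 'ann')]))   # True
```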

3.
We present a new framework for amalgamating two successful programming paradigms: logic programming and object-oriented programming. From the former, we keep the declarative reading of programs. From the latter, we select two crucial notions: (i) the ability for objects to dynamically change their internal state during the computation; (ii) the structured representation of knowledge, generally obtained via inheritance graphs among classes of objects. We start with the approach, introduced in concurrent logic programming languages, which identifies objects with proof processes and object states with arguments occurring in the goal of a given process. This provides a clean, side-effect free account of the dynamic behavior of objects in terms of the search tree, the only dynamic entity in logic programming languages. We integrate this view of objects with an extension of logic programming, which we call Linear Objects, based on the possibility of having multiple literals in the head of a program clause. This contains within itself the basis for a flexible form of inheritance, and maintains the constructive property of Prolog of returning definite answer substitutions as output of the proof of non-ground goals. The theoretical background for Linear Objects is Linear Logic, a logic recently introduced to provide a theoretical basis for the study of concurrency. We also show that Linear Objects can be considered as a constructive restriction of full Classical Logic. We illustrate the expressive power of Linear Objects compared to Prolog by several examples from the object-oriented domain, but we also show that it can be used to provide elegant solutions for problems arising in the standard style of logic programming.

4.
New, simple proofs of soundness (every representable function lies in a given complexity class) for Elementary Affine Logic, LFPL and Soft Affine Logic are presented. The proofs are obtained by instantiating a semantic framework previously introduced by the authors and based on an innovative modification of realizability. The proofs are a notable simplification of the original, already semantic, proofs of soundness for the above-mentioned logical systems and programming languages. A new result made possible by the semantic framework is the addition of polymorphism and a modality to LFPL, thus allowing for an internal definition of inductive datatypes. The methodology presented proceeds by assigning both abstract resource bounds in the form of elements from a resource monoid and resource-bounded computations to proofs (respectively, programs).

5.
We describe barnacle: a co-operative interface to the clam inductive theorem proving system. For the foreseeable future, there will be theorems which cannot be proved completely automatically, so the ability to allow human intervention is desirable; for this intervention to be productive the problem of orienting the user in the proof attempt must be overcome. There are many semi-automatic theorem provers: we call our style of theorem proving co-operative, in that the skills of both human and automaton are used each to their best advantage, and used together may find a proof where other methods fail. The co-operative nature of the barnacle interface is made possible by the proof planning technique underpinning clam. Our claim is that proof planning makes new kinds of user interaction possible. Proof planning is a technique for guiding the search for a proof in automatic theorem proving. Common patterns of reasoning in proofs are identified and represented computationally as proof plans, which can then be used to guide the search for proofs of new conjectures. We have harnessed the explanatory power of proof planning to enable the user to understand where the automatic prover got to and why it is stuck. A user can analyse the failed proof in terms of clam's specification language, and hence override the prover to force or prevent the application of a tactic, or discover a proof patch. This patch might be to apply further rules or tactics to bridge the gap between the effects of previous tactics and the preconditions needed by a currently inapplicable tactic.
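As a rough illustration of the proof-planning idea (a toy sketch in Python, not clam's methods or specification language): a method carries an applicability test and produces subgoals, and a planner searches for a nested plan.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Method:
    name: str
    applicable: Callable[[str], bool]   # precondition on the goal
    apply: Callable[[str], List[str]]   # subgoals produced when applied

def plan(goal, methods, depth=10):
    """Depth-first proof planning: return a nested plan or None on failure."""
    if depth == 0:
        return None
    for m in methods:
        if m.applicable(goal):
            subplans = []
            for sub in m.apply(goal):
                p = plan(sub, methods, depth - 1)
                if p is None:
                    break
                subplans.append(p)
            else:                       # all subgoals planned successfully
                return (m.name, goal, subplans)
    return None

# A toy domain: goals are strings such as "nat(s(s(0)))".
METHODS = [
    Method("zero", lambda g: g == "nat(0)", lambda g: []),
    Method("succ",
           lambda g: g.startswith("nat(s(") and g.endswith("))"),
           lambda g: ["nat(" + g[len("nat(s("):-2] + ")"]),
]
print(plan("nat(s(s(0)))", METHODS))
# ('succ', 'nat(s(s(0)))', [('succ', 'nat(s(0))', [('zero', 'nat(0)', [])])])
```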

6.
Inspired by the Multiplicative Exponential fragment of Linear Logic, we define a framework called the prismoid of resources where each vertex is a language which refines the λ-calculus by using a different choice to make explicit or implicit (meta-level) the definition of the contraction, weakening, and substitution operations. For all the calculi in the prismoid we show simulation of β-reduction, confluence, preservation of β-strong normalisation and strong normalisation for typed terms. Full composition also holds for all the calculi of the prismoid handling explicit substitutions. The whole development of the prismoid is done by making the set of resources a parameter of the formalism, so that all the properties for each vertex are obtained as a particular case of the general abstract proofs.

7.
We describe two uses of meta-level inference: to control the search for a proof, and to derive new control information; we illustrate both in the domain of algebraic equation solving. The derivation of control information is the main focus of the paper. It involves the proving of theorems in the Meta-Theory of Algebra. These proofs are guided by meta-meta-level inference. We are developing a meta-meta-language to describe formulae and proof plans, and have built a program, IMPRESS, which uses these plans to build a proof. We describe one such proof plan in detail. IMPRESS will form part of a self-improving algebra system.
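To make the object level concrete, here is a toy isolation step of the kind such equation solvers apply (a hypothetical Python sketch; the representation and rule set are not IMPRESS's): the unknown is assumed to occur exactly once, on the left-hand side, and inverse operations are applied until it stands alone.

```python
def contains_x(term):
    """True if the unknown 'x' occurs in the term (tuples are compound terms)."""
    return term == 'x' or (isinstance(term, tuple)
                           and any(contains_x(t) for t in term[1:]))

def isolate(equation):
    """Isolate 'x' in ('eq', lhs, rhs); 'x' must occur exactly once, in lhs."""
    _, lhs, rhs = equation
    while lhs != 'x':
        op, a, b = lhs
        if op == 'plus':                      # a + b = rhs
            lhs, rhs = (a, ('minus', rhs, b)) if contains_x(a) else (b, ('minus', rhs, a))
        elif op == 'times':                   # a * b = rhs (divisor assumed non-zero)
            lhs, rhs = (a, ('div', rhs, b)) if contains_x(a) else (b, ('div', rhs, a))
        else:
            raise ValueError(f"no isolation rule for {op}")
    return ('eq', 'x', rhs)

# 3 * x + 4 = 19   ->   x = (19 - 4) / 3
print(isolate(('eq', ('plus', ('times', 3, 'x'), 4), 19)))
```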

8.
We extend the notion of anti-unification to cover equational theories and present a method based on regular tree grammars to compute a finite representation of E-generalization sets. We present a framework to combine Inductive Logic Programming and E-generalization that includes an extension of Plotkin's lgg theorem to the equational case. We demonstrate the potential power of E-generalization by three example applications: computation of suggestions for auxiliary lemmas in equational inductive proofs, computation of construction laws for given term sequences, and learning of screen editor command sequences.
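For comparison, the purely syntactic case handled by Plotkin's lgg can be sketched in a few lines (illustrative Python; the paper's contribution is computing E-generalization sets modulo an equational theory via regular tree grammars, which this does not do):

```python
def lgg(s, t, table=None):
    """Plotkin's least general generalization of two first-order terms.
    Constants are strings; compound terms are tuples (functor, arg1, ...)."""
    if table is None:
        table = {}
    if s == t:
        return s
    if (isinstance(s, tuple) and isinstance(t, tuple)
            and s[0] == t[0] and len(s) == len(t)):
        return (s[0],) + tuple(lgg(a, b, table) for a, b in zip(s[1:], t[1:]))
    # Distinct subterms generalize to a variable; the same pair of subterms
    # is always mapped to the same variable.
    if (s, t) not in table:
        table[(s, t)] = 'X' + str(len(table))
    return table[(s, t)]

print(lgg(('f', 'a', 'a'), ('f', 'b', 'b')))   # ('f', 'X0', 'X0')
print(lgg(('f', 'a', 'b'), ('f', 'b', 'a')))   # ('f', 'X0', 'X1')
```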

9.
Dijkstra argues that calculational proofs are preferable to traditional pictorial and/or verbal proofs. First, due to the calculational proof format, incorrect proofs are less likely. Second, syntactic considerations (letting the “symbols do the work”) have led to an impressive array of techniques for elegant proof construction. However, calculational proofs are not formal and are not flawless. Why not make them formal and check them mechanically?
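As a concrete illustration of the format under discussion (a standard example, not taken from the paper), here is a short calculation in the calculational style, with each step justified by a hint:

```latex
\begin{array}{cl}
       & p \land (p \Rightarrow q) \\
\equiv & \{\ \text{rewrite } p \Rightarrow q \text{ as } \lnot p \lor q\ \} \\
       & p \land (\lnot p \lor q) \\
\equiv & \{\ \land \text{ distributes over } \lor\ \} \\
       & (p \land \lnot p) \lor (p \land q) \\
\equiv & \{\ p \land \lnot p \equiv \mathit{false};\ \mathit{false} \text{ is the identity of } \lor\ \} \\
       & p \land q
\end{array}
```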

10.
Compositional verification aims at managing the complexity of the verification process by exploiting compositionality of the system architecture. In this paper we explore the use of a temporal epistemic logic to formalize the process of verification of compositional multi-agent systems. The specification of a system, its properties and their proofs are of a compositional nature, and are formalized within a compositional temporal logic: Temporal Multi-Epistemic Logic. It is shown that compositional proofs are valid under certain conditions. Moreover, the possibility of incorporating default persistence of information in a system is explored. A completion operation on a specific type of temporal theories, temporal completion, is introduced to be able to use classical proof techniques in verification with respect to non-classical semantics covering default persistence.

11.
This paper presents a formalization in Coq of Common Knowledge Logic and checks its adequacy on case studies. These case studies allow exploring the proof-theoretic side of Common Knowledge Logic experimentally. This work is original in that nobody has considered Higher Order Common Knowledge Logic from the point of view of proofs performed on a proof assistant. As a matter of fact, it is experimental by nature as it tries to draw conclusions from experiments.

12.
We define realizability semantics for Light Affine Logic (LAL) which has the property that denotations of functions are polynomial time computable by construction of the model. This gives a new proof of polytime-soundness of LAL which is considerably simpler than the standard proof based on proof nets and is entirely semantical in nature. The model construction uses a new instance of a resource monoid, a general method for interpreting systems based on Linear Logic introduced earlier by the authors.

13.
14.
Verification methods based on SAT, SMT, and theorem proving often rely on proofs of unsatisfiability as a powerful tool to extract information in order to reduce the overall effort. For example, a proof may be traversed to identify a minimal reason that led to unsatisfiability, to compute abstractions, or to derive Craig interpolants. In this paper we focus on two important aspects that concern efficient handling of proofs of unsatisfiability: compression and manipulation. First of all, since the proof size can be very large in general (exponential in the size of the input problem), it is indeed beneficial to adopt techniques to compress it for further processing. Secondly, proofs can be manipulated as a flexible preprocessing step in preparation for interpolant computation. Both these techniques are implemented in a framework that makes use of local rewriting rules to transform the proofs. We show that a careful use of the rules, combined with existing algorithms, can result in an effective simplification of the original proofs. We have evaluated several heuristics on a wide range of unsatisfiable problems deriving from SAT and SMT test cases.
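The objects being compressed here are resolution proofs; a minimal representation of such a proof DAG and of the resolution step itself is sketched below (illustrative Python, not the paper's framework or its rewriting rules):

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Clauses are frozensets of non-zero integer literals; -v is the negation of v.

@dataclass(frozen=True)
class Node:
    clause: frozenset
    premises: Optional[Tuple["Node", "Node"]] = None  # None for input clauses
    pivot: Optional[int] = None                       # variable resolved upon

def resolve(left: Node, right: Node, pivot: int) -> Node:
    """Resolve `left` (containing pivot) with `right` (containing -pivot)."""
    assert pivot in left.clause and -pivot in right.clause
    resolvent = (left.clause - {pivot}) | (right.clause - {-pivot})
    return Node(resolvent, (left, right), pivot)

# A tiny refutation of { {1}, {-1, 2}, {-2} }:
a, b, c = Node(frozenset({1})), Node(frozenset({-1, 2})), Node(frozenset({-2}))
root = resolve(resolve(a, b, 1), c, 2)
assert root.clause == frozenset()   # the empty clause: unsatisfiability proved
```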

15.
This paper proposes a new proof method for strong normalization of classical natural deduction and calculi with control operators. For this purpose, we introduce a new CPS-translation: the continuation and garbage passing style (CGPS) translation. We show that this CGPS-translation method gives simple proofs of strong normalization of λμ∧∨⊥, which was introduced by de Groote in [P. de Groote, Strong normalization of classical natural deduction with disjunction, in: S. Abramsky (Ed.), Typed Lambda Calculi and Applications, 5th International Conference, TLCA 2001, in: Lecture Notes in Comput. Sci., vol. 2044, Springer, Berlin, 2001, pp. 182-196] and corresponds to classical natural deduction with disjunction and permutative conversions.
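For orientation, here is an ordinary call-by-name CPS translation over untyped lambda terms (an illustrative Python sketch; the paper's CGPS translation additionally threads a garbage argument, which is not shown here):

```python
import itertools

_counter = itertools.count()

def fresh(prefix):
    return f"{prefix}{next(_counter)}"

# Terms: ('var', x), ('lam', x, body), ('app', fun, arg).

def cps(term):
    tag = term[0]
    if tag == 'var':                 # [x]     = \k. x k
        k = fresh('k')
        return ('lam', k, ('app', ('var', term[1]), ('var', k)))
    if tag == 'lam':                 # [\x. M] = \k. k (\x. [M])
        _, x, body = term
        k = fresh('k')
        return ('lam', k, ('app', ('var', k), ('lam', x, cps(body))))
    if tag == 'app':                 # [M N]   = \k. [M] (\m. m [N] k)
        _, fun, arg = term
        k, m = fresh('k'), fresh('m')
        return ('lam', k,
                ('app', cps(fun),
                 ('lam', m, ('app', ('app', ('var', m), cps(arg)), ('var', k)))))
    raise ValueError(f"unknown term: {term}")

# Example: translate (\x. x) y
print(cps(('app', ('lam', 'x', ('var', 'x')), ('var', 'y'))))
```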

16.
It is known that constant-depth Frege proofs of some tautologies require exponential size. No such lower bound result is known for more general proof systems. We consider tree-like sequent calculus proofs in which formulas can contain modular connectives and only the cut formulas are restricted to be of constant depth. Under a plausible hardness assumption concerning small-depth Boolean circuits, we prove exponential lower bounds for such proofs. We prove these lower bounds directly from the computational hardness assumption. We start with a lower bound for cut-free proofs and “lift” it so it applies to proofs with constant-depth cuts. By using the same approach, we obtain the following additional results. We provide a much simpler proof of a known unconditional lower bound in the case where modular connectives are not used. We establish a conditional exponential separation between the power of constant-depth proofs that use different modular connectives. We show that these tree-like proofs with constant-depth cuts cannot polynomially simulate similar dag-like proofs, even when the dag-like proofs are cut-free. We present a new proof of the non-finite axiomatizability of the theory of bounded arithmetic IΔ0(R). Finally, under a plausible hardness assumption concerning the polynomial-time hierarchy, we show that the hierarchy G_i^* of quantified propositional proof systems does not collapse.

17.
A class of mappings called abstractions is defined, and examples of abstractions are given. These functions map a set S of clauses onto a possibly simpler set T of clauses. Also, resolution proofs from S map onto possibly simpler resolution proofs from T. In order to search for a proof of a clause C from S, it suffices to search for a proof from T and attempt to invert the abstraction mapping to obtain a proof of C from S. Some theorem proving strategies based on this idea are presented. Most of these strategies are complete. A method of using more than one abstraction at the same time is also presented. This requires the use of ‘multiclauses’, which are multisets of literals, and associated ‘m-abstraction mappings’ on multiclauses. Certain abstractions are especially interesting, because they correspond to particular interpretations of the set S of clauses. The use of abstractions permits the advantages of set-of-support strategies to be realized in arbitrary complete non-set-of-support resolution strategies.
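A classic instance of such a mapping is the propositional abstraction that forgets the arguments of every literal; the sketch below is illustrative Python, with the encoding of literals an assumption rather than the paper's notation:

```python
# A literal is (sign, predicate, args); a clause is a set of literals.
# The abstraction keeps only the sign and predicate symbol, so the image of a
# clause set is propositional and usually much easier to search.

def propositional_abstraction(clause):
    return frozenset((sign, pred) for (sign, pred, _args) in clause)

# { P(f(a), X), ~Q(X) }  maps to  { P, ~Q }
clause = {('+', 'P', (('f', 'a'), 'X')), ('-', 'Q', ('X',))}
print(propositional_abstraction(clause))   # frozenset({('+', 'P'), ('-', 'Q')})
```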

18.
Information Systems, 2006, 31(4-5): 340-360
This paper introduces a method for automatic composition of Semantic Web services using Linear Logic (LL) theorem proving. The method uses a Semantic Web service language (DAML-S) for external presentation of Web services, while, internally, the services are presented by extralogical axioms and proofs in LL. LL, as a resource-conscious logic, enables us to capture the concurrent features of Web services formally (including parameters, states and non-functional attributes). We use a process calculus to present the process model of the composite service. The process calculus is attached to the LL inference rules in the style of type theory. Thus, the process model for a composite service can be generated directly from the complete proof. We introduce a set of subtyping rules that defines a valid dataflow for composite services. The subtyping rules that are used for semantic reasoning are presented with LL inference figures. We propose a system architecture where the DAML-S Translator, LL Theorem Prover and Semantic Reasoner can operate together. This architecture has been implemented in Java.
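To give a feel for the composition problem, here is a naive sketch in Python with made-up service names and data types (the paper instead runs Linear Logic proof search, which treats data as consumable resources and extracts a process-calculus term from the proof; this set-based chaining ignores both aspects):

```python
def compose(services, available, goal):
    """Greedy forward chaining: services map a name to (inputs, outputs),
    both sets of data-type names. Returns an ordered list of services that
    produces every type in `goal` from `available`, or None if impossible."""
    have, chain = set(available), []
    progress = True
    while progress and not goal <= have:
        progress = False
        for name, (inputs, outputs) in services.items():
            if name not in chain and inputs <= have and not outputs <= have:
                have |= outputs
                chain.append(name)
                progress = True
    return chain if goal <= have else None

services = {
    "geocode":  ({"address"}, {"coordinates"}),
    "forecast": ({"coordinates"}, {"weather"}),
}
print(compose(services, {"address"}, {"weather"}))   # ['geocode', 'forecast']
```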

19.
Metamathematics is a source of many interesting theorems and difficult proofs. This paper reports the results of an experiment to use the Boyer-Moore theorem prover to proof-check theorems in metamathematics. We describe a First Order Logic due to Shoenfield and outline some of the theorems that the prover was able to prove about this logic. These include the tautology theorem which states that every tautology has a proof. Such proofs can be used to add new proof procedures to a proof-checking program in a sound and efficient manner.
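The tautology theorem mentioned here states that every tautology has a proof; its semantic side, checking that a formula is a tautology by exhausting truth assignments, is easy to sketch (illustrative Python, unrelated to the Boyer-Moore formalization itself):

```python
from itertools import product

# Formulas: ('var', name), ('not', f), ('and', f, g), ('or', f, g), ('implies', f, g).

def variables(f):
    if f[0] == 'var':
        return {f[1]}
    return set().union(*(variables(sub) for sub in f[1:]))

def evaluate(f, env):
    tag = f[0]
    if tag == 'var':
        return env[f[1]]
    if tag == 'not':
        return not evaluate(f[1], env)
    if tag == 'and':
        return evaluate(f[1], env) and evaluate(f[2], env)
    if tag == 'or':
        return evaluate(f[1], env) or evaluate(f[2], env)
    if tag == 'implies':
        return (not evaluate(f[1], env)) or evaluate(f[2], env)
    raise ValueError(tag)

def is_tautology(f):
    vs = sorted(variables(f))
    return all(evaluate(f, dict(zip(vs, vals)))
               for vals in product([False, True], repeat=len(vs)))

# Peirce's law ((p -> q) -> p) -> p is a classical tautology.
p, q = ('var', 'p'), ('var', 'q')
print(is_tautology(('implies', ('implies', ('implies', p, q), p), p)))   # True
```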

20.
This paper explores locality in proofs of global safety properties of concurrent programs. Model checking on the full state space is often infeasible due to state explosion. A local proof, in contrast, is a collection of per-process invariants, which together imply the desired global safety property. Local proofs can be more compact than global proofs, but local reasoning is also inherently incomplete. In this paper, we present an algorithm for safety verification that combines local reasoning with gradual refinement. The algorithm gradually exposes facts about the internal state of components, until either a local proof or a real error is discovered. The refinement mechanism ensures completeness. Experiments show that local reasoning can have significantly better performance than the traditional reachability computation. Moreover, for some parameterized protocols, a local proof can be used as the basis of a correctness proof over all instances.
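A schematic example of a local proof (not taken from the paper): for two processes competing for a lock, each per-process invariant only mentions that process's control state and the lock, yet their conjunction implies the global mutual-exclusion property.

```latex
I_i \;\equiv\; \big(\mathit{in\_cs}_i \Rightarrow \mathit{lock} = i\big), \quad i \in \{1,2\}
\qquad\text{and}\qquad
I_1 \land I_2 \;\Rightarrow\; \lnot(\mathit{in\_cs}_1 \land \mathit{in\_cs}_2)
```

The implication holds because the lock cannot hold the values 1 and 2 at the same time.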
