Similar Documents
 20 similar documents retrieved (search time: 46 ms)
1.
There are two approaches to formalizing the syntax of typed object languages in a proof assistant or programming language. The extrinsic approach is to first define a type that encodes untyped object expressions and then make a separate definition of typing judgements over the untyped terms. The intrinsic approach is to make a single definition that captures well-typed object expressions, so ill-typed expressions cannot even be expressed. Intrinsic encodings are attractive and naturally enforce the requirement that metalanguage operations on object expressions, such as substitution, respect object types. The price is that the metalanguage types of intrinsic encodings and operations involve non-trivial dependency, adding significant complexity. This paper describes intrinsic-style formalizations of both simply-typed and polymorphic languages, and basic syntactic operations thereon, in the Coq proof assistant. The Coq types encoding object-level variables (de Bruijn indices) and terms are indexed by both type and typing environment. One key construction is the bootstrapping of definitions and lemmas about the action of substitutions in terms of similar ones for a simpler notion of renamings. In the simply-typed case, this yields definitions that are free of any use of type equality coercions. In the polymorphic case, some substitution operations do still require type coercions, which we at least partially tame by uniform use of heterogeneous equality.
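
The development in the paper is carried out in Coq. Purely as an illustration of the intrinsic style it describes, with variables and terms indexed by object type and typing environment, here is a minimal sketch in Haskell using GADTs; the names and the choice of a Boolean base type are ours, not the paper's.

    {-# LANGUAGE DataKinds, GADTs, KindSignatures, TypeFamilies, TypeOperators #-}

    -- Object-language types.
    data Ty = TBool | TArr Ty Ty

    -- Well-typed de Bruijn indices: a variable is a proof that its type
    -- occurs in the typing environment (a type-level list of object types).
    data Var (env :: [Ty]) (t :: Ty) where
      Here  :: Var (t ': env) t
      There :: Var env t -> Var (s ': env) t

    -- Intrinsically typed terms: ill-typed object expressions cannot be written.
    data Tm (env :: [Ty]) (t :: Ty) where
      V    :: Var env t -> Tm env t
      Lit  :: Bool -> Tm env 'TBool
      Lam  :: Tm (a ': env) b -> Tm env ('TArr a b)
      App  :: Tm env ('TArr a b) -> Tm env a -> Tm env b

    -- Meaning of object types as Haskell types.
    type family Sem (t :: Ty) where
      Sem 'TBool      = Bool
      Sem ('TArr a b) = Sem a -> Sem b

    -- Runtime environments mirror typing environments.
    data Env (env :: [Ty]) where
      Nil  :: Env '[]
      Cons :: Sem t -> Env env -> Env (t ': env)

    lookupVar :: Var env t -> Env env -> Sem t
    lookupVar Here      (Cons v _)  = v
    lookupVar (There x) (Cons _ vs) = lookupVar x vs

    -- A total evaluator: well-typedness holds by construction.
    eval :: Env env -> Tm env t -> Sem t
    eval env (V x)     = lookupVar x env
    eval _   (Lit b)   = b
    eval env (Lam b)   = \v -> eval (Cons v env) b
    eval env (App f a) = eval env f (eval env a)

In Coq the same indexing is expressed with dependent inductive families, and, as the abstract notes, it is the definition of substitution over such families (bootstrapped from renamings) that requires the real work.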

2.
A general reducibility method is developed for proving reduction properties of lambda terms typeable in intersection type systems with and without the universal type Ω. Sufficient conditions for its application are derived. This method leads to uniform proofs of confluence, standardization, and weak head normalization of terms typeable in the system with the type Ω. The method extends Tait's reducibility method for the proof of strong normalization of the simply typed lambda calculus, Krivine's extension of the same method for the strong normalization of the intersection type system without Ω, and Statman-Mitchell's logical relation method for the proof of confluence of βη-reduction on simply typed lambda terms. As a consequence, the confluence and the standardization of all (untyped) lambda terms are obtained.
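
For readers unfamiliar with the method being extended, Tait's reducibility predicate for strong normalization of the simply typed lambda calculus is defined by induction on types; a standard textbook formulation (not the paper's generalized conditions) is, in LaTeX notation:

    \mathrm{Red}_{\iota}    \;=\; \{\, t \mid t \text{ is strongly normalizing} \,\}
    \mathrm{Red}_{A \to B}  \;=\; \{\, t \mid \forall s \in \mathrm{Red}_{A}.\; t\,s \in \mathrm{Red}_{B} \,\}

The paper derives conditions on such predicates under which the same inductive construction also yields confluence, standardization, and weak head normalization in the intersection type setting.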

3.
The polymorphic environment calculus is a polymorphic lambda calculus which enables us to treat environments as first-class citizens. In the calculus, environments are formalized as explicit substitutions, and the substitutions are included in the set of terms of the calculus. First, we introduce an untyped environment calculus, and we present a semantics of the calculus as a translation into the lambda calculus. Second, we propose a polymorphic type system for the environment calculus based on Damas-Milner's ML-polymorphic type system. In ML, polymorphism is allowed only in let-expressions; in the polymorphic environment calculus, polymorphism is provided with environment compositions. We prove a subject-reduction theorem for the type system. Third, a type-inference algorithm is given for the polymorphic environment calculus, and we establish its soundness, termination, and a principal-typing theorem.
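
As a rough picture of what treating environments as first-class terms means syntactically, a λσ-style grammar in which explicit substitutions belong to the term language can be sketched in Haskell as below; this is a generic sketch under our own naming, not the authors' calculus.

    -- Terms and environments (explicit substitutions) defined together:
    -- environments are themselves first-class syntactic objects.
    data Term
      = Var Int          -- de Bruijn index
      | Lam Term
      | App Term Term
      | Clos Term Env    -- a term paired with an explicit environment

    data Env
      = Id               -- identity environment
      | Cons Term Env    -- extend the environment with a binding
      | Comp Env Env     -- environment composition

In the paper's type system it is environment composition, rather than let-expressions as in ML, that is the construct at which polymorphism is introduced.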

4.
We explore an axiomatized nominal approach to variable binding in Coq, using an untyped lambda-calculus as our test case. In our nominal approach, alpha-equality of lambda terms coincides with Coq's built-in equality. Our axiomatization includes a nominal induction principle and functions for calculating free variables and substitution. These axioms are collected in a module signature and proved sound using locally nameless terms as the underlying representation. Our experience so far suggests that it is feasible to work from such axiomatized theories in Coq and that the nominal style of variable binding corresponds closely with paper proofs. We are currently working on proving the soundness of a primitive recursion combinator and developing a method of generating these axioms and their proof of soundness from a grammar describing the syntax of terms and binding.
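
For comparison with the axiomatized operations, the free-variable and substitution functions look as follows on ordinary named terms; this is a standard textbook definition in Haskell, not the Coq development, and the renaming in the binder case is exactly the bookkeeping that nominal reasoning is meant to tame.

    import Data.List (delete, union)

    data Term = Var String | Lam String Term | App Term Term

    -- Free variables of a term.
    fv :: Term -> [String]
    fv (Var x)   = [x]
    fv (Lam x t) = delete x (fv t)
    fv (App t u) = fv t `union` fv u

    -- subst x s t replaces free occurrences of x in t by s, renaming the
    -- bound variable whenever it would capture a free variable of s.
    subst :: String -> Term -> Term -> Term
    subst x s (Var y)
      | y == x    = s
      | otherwise = Var y
    subst x s (App t u) = App (subst x s t) (subst x s u)
    subst x s (Lam y t)
      | y == x        = Lam y t
      | y `elem` fv s = let y' = fresh y (fv s `union` fv t)
                        in Lam y' (subst x s (subst y (Var y') t))
      | otherwise     = Lam y (subst x s t)

    -- A variable based on y that clashes with nothing in the given list.
    fresh :: String -> [String] -> String
    fresh y avoid = head [ v | n <- [0 :: Int ..], let v = y ++ show n, v `notElem` avoid ]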

5.
Existing meta-programming languages operate on encodings of programs as data. This paper presents a new meta-programming language, based on an untyped lambda calculus, in which structurally reflective programming is supported directly, without any encoding. The language features call-by-value and call-by-name lambda abstractions, as well as novel reflective features enabling the intensional manipulation of arbitrary program terms. The language is scope safe, in the sense that variables can neither be captured nor escape their scopes. The expressiveness of the language is demonstrated by showing how to implement quotation and evaluation operations, as proposed by Wand. The language’s utility for meta-programming is further demonstrated through additional representative examples. A prototype implementation is described and evaluated.

6.
We show that linear-time self-interpretation of the pure untyped lambda calculus is possible, in the sense that interpretation has a constant overhead compared to direct execution under various execution models. The present paper shows this result for reduction to weak head normal form under call-by-name, call-by-value and call-by-need. We use a self-interpreter based on previous work on self-interpretation and partial evaluation of the pure untyped lambda calculus. We use operational semantics to define each reduction strategy. For each of these we show a simulation lemma that states that each inference step in the evaluation of a term by the operational semantics is simulated by a sequence of steps in evaluation of the self-interpreter applied to the term (using the same operational semantics). By assigning costs to the inference rules in the operational semantics, we can compare the cost of normal evaluation and self-interpretation. Three different cost-measures are used: number of beta-reductions, cost of a substitution-based implementation (similar to graph reduction) and cost of an environment-based implementation. For call-by-need we use a non-deterministic semantics, which simplifies the proof considerably.
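
For orientation, call-by-value and call-by-name reduction to weak head normal form can each be presented as an environment-based big-step evaluator. The Haskell sketch below is ours and is deliberately partial on ill-formed programs; it corresponds to the environment-based cost measure, while the paper's semantics, its cost assignments, and the call-by-need case are more refined.

    data Term  = Var String | Lam String Term | App Term Term

    -- Weak head normal forms are closures: a lambda together with its environment.
    data Value = Closure String Term Env
    type Env   = [(String, Thunk)]

    -- Call-by-value binds variables to evaluated arguments,
    -- call-by-name binds them to unevaluated suspensions.
    data Thunk = Done Value | Suspended Term Env

    evalCBV :: Env -> Term -> Value
    evalCBV env (Var x)   = case lookup x env of
                              Just (Done v) -> v
                              _             -> error ("unbound variable: " ++ x)
    evalCBV env (Lam x b) = Closure x b env
    evalCBV env (App f a) =
      let Closure x b env' = evalCBV env f
          v                = evalCBV env a
      in evalCBV ((x, Done v) : env') b

    evalCBN :: Env -> Term -> Value
    evalCBN env (Var x)   = case lookup x env of
                              Just (Suspended t env') -> evalCBN env' t
                              Just (Done v)           -> v
                              Nothing                 -> error ("unbound variable: " ++ x)
    evalCBN env (Lam x b) = Closure x b env
    evalCBN env (App f a) =
      let Closure x b env' = evalCBN env f
      in evalCBN ((x, Suspended a env) : env') b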

7.
We demonstrate that in the context of statically-typed purely-functional lambda calculi without recursion, unchecked exceptions (e.g., SML exceptions) can be strictly more powerful than call/cc. More precisely, we prove that a natural extension of the simply-typed lambda calculus with unchecked exceptions is strictly more powerful than all known sound extensions of Girard's Fω (a superset of the simply-typed lambda calculus) with call/cc. This result is established by showing that the first language is Turing complete while the latter languages permit only a subset of the recursive functions to be written. We show that our natural extension of the simply-typed lambda calculus with unchecked exceptions is Turing complete by reducing the untyped lambda calculus to it by means of a novel method for simulating recursive types using unchecked-exception–returning functions. The result concerning extensions of Fω with call/cc stems from previous work of the author and Robert Harper.

8.
Following the ideas of type systems, this paper redefines type expressions for the machine language of the Intel/x86 architecture and builds a type system for it. Although machine code is an untyped binary encoding, its type information is implicit in the operational semantics of the instructions, so a type-inference algorithm built on top of the type system can statically reason about the safety of machine code. Since the machine language considered includes the principal instructions, such as jumps, function calls, and returns, this static checking method can also be applied widely to checking low-level code for other architectures.
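
As a purely illustrative toy, not the paper's system, a transfer function over register types already shows how type information that is implicit in instruction semantics supports static checking; all names below are invented for the example.

    data Ty  = TInt | TPtr Ty deriving (Eq, Show)
    data Reg = EAX | EBX      deriving (Eq, Show)

    type RegTypes = [(Reg, Ty)]

    data Instr
      = MovImm Reg Int   -- mov r, imm
      | Load   Reg Reg   -- mov r1, [r2]

    -- Given the register types before an instruction, compute the types
    -- after it, or reject the instruction as unsafe.
    step :: RegTypes -> Instr -> Maybe RegTypes
    step g (MovImm r _) = Just (set r TInt g)
    step g (Load r1 r2) = case lookup r2 g of
      Just (TPtr t) -> Just (set r1 t g)
      _             -> Nothing   -- r2 is not known to hold a pointer: unsafe load

    set :: Reg -> Ty -> RegTypes -> RegTypes
    set r t g = (r, t) : filter ((/= r) . fst) g

A load through a register that is not known to hold a pointer is rejected; the paper carries out this kind of inference for actual x86 instructions, including jumps, function calls, and returns.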

9.
We describe how a set-theoretic foundation for mathematics can be encoded in the new system Scunak. We then discuss an encoding of the construction of functions as functional relations in untyped set theory. Using the dependent type theory of Scunak, we can define object level application and lambda abstraction operators (in the spirit of higher-order abstract syntax) mediating between functions in the (meta-level) type theory and (object-level) functional relations. The encoding has also been exported to Automath and Twelf.

10.
Type systems play an important role in the theory of distributed systems. Once a polymorphic type system is introduced for the π-calculus, process equivalence has to be re-examined in the new setting. Under a polymorphic type system the environment knows only the abstract types of the channels in a process, not their concrete types; its discriminating power is therefore weakened, and the resulting bisimulation is coarser. Building on earlier work, this paper gives an axiom system for bisimulation in the polymorphic π-calculus and proves the soundness and completeness of the axiom system.

11.
This paper presents the first machine-checked verification of Milner's type inference algorithm W for computing the most general type of an untyped λ-term enriched with let-expressions. This term language is the core of most typed functional programming languages and is also known as Mini-ML. We show how to model all the concepts involved, in particular types and type schemes, substitutions, and the thorny issue of new variables. Only a few key proofs are discussed in detail. The theories and proofs are developed in Isabelle/HOL, the HOL instantiation of the generic theorem prover Isabelle.
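
The concepts listed, types, type schemes, and substitutions, can be modelled directly as data; a minimal Haskell rendering (ours, not the Isabelle/HOL theories) is:

    import qualified Data.Map as Map
    import qualified Data.Set as Set

    data Type   = TVar String | TFun Type Type    -- Mini-ML types
    data Scheme = Forall [String] Type            -- type schemes: forall a1 .. an. tau
    type Subst  = Map.Map String Type             -- substitutions on type variables

    applyT :: Subst -> Type -> Type
    applyT s (TVar a)   = Map.findWithDefault (TVar a) a s
    applyT s (TFun t u) = TFun (applyT s t) (applyT s u)

    -- A substitution must not touch the bound variables of a scheme.
    applyS :: Subst -> Scheme -> Scheme
    applyS s (Forall as t) = Forall as (applyT (foldr Map.delete s as) t)

    -- Free type variables, needed when deciding what may be generalized.
    freeT :: Type -> Set.Set String
    freeT (TVar a)   = Set.singleton a
    freeT (TFun t u) = freeT t `Set.union` freeT u

    freeS :: Scheme -> Set.Set String
    freeS (Forall as t) = freeT t `Set.difference` Set.fromList as

The "thorny issue of new variables" refers to the fresh type variables that algorithm W must invent, exactly the kind of informal convention a machine-checked proof has to make explicit.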

12.
We define a computational model for a logical extension of Scheme, and give a metacircular evaluator for it. This minimal extension incorporates two new features only, i.e. logical variables and clause expressions, which can be used to define predicates in exactly the same way as lambda expressions can be used to define functions. Higher-order properties of Scheme are preserved: predicates can be passed to and returned from function applications. Predicate applications can appear as terms in functions. On the other hand, function applications can appear as terms in predicates, and can be formal as well as actual arguments, but only as long as they can be evaluated according to the usual Scheme semantics prohibiting access to unbound variables (except for constructor applications).

13.
This paper argues for the idea that in describing language we should follow Haskell Curry in distinguishing between the structure of an expression and its appearance or manifestation. It is explained how making this distinction obviates the need for directed types in type-theoretic grammars and a simple grammatical formalism is sketched in which representations at all levels are lambda terms. The lambda term representing the abstract structure of an expression is homomorphically translated to a lambda term representing its manifestation, but also to a lambda term representing its semantics.
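
A drastically simplified first-order sketch of the idea, in Haskell with invented names (the paper uses lambda terms at every level): one abstract structure is translated homomorphically both to its surface manifestation and to its semantics.

    -- Abstract structure (Curry's "structure" of an expression).
    data Abs = John | Mary | Loves Abs Abs     -- Loves subject object

    -- One homomorphism yields the appearance, or manifestation ...
    surface :: Abs -> String
    surface John        = "John"
    surface Mary        = "Mary"
    surface (Loves s o) = surface s ++ " loves " ++ surface o

    -- ... and another yields the semantics (shown here as a formula string).
    semantics :: Abs -> String
    semantics John        = "john'"
    semantics Mary        = "mary'"
    semantics (Loves s o) = "love'(" ++ semantics s ++ ", " ++ semantics o ++ ")"

Because word order lives entirely on the manifestation side, the abstract structure itself needs no directed (slash) types, which is the point the paper argues.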

14.
Realizability interpretations of logics are given by saying what it means for computational objects of some kind to realize logical formulae. The computational objects in question might be drawn from an untyped universe of computation, such as a partial combinatory algebra, or they might be typed objects such as terms of a PCF-style programming language. In some instances, one can show that a particular untyped realizability interpretation matches a particular typed one, in the sense that they give the same set of realizable formulae. In this case, we have a very good fit indeed between the typed language and the untyped realizability model — we refer to this condition as (constructive) logical full abstraction. We give some examples of this situation for a variety of extensions of PCF. Of particular interest are some models that are logically fully abstract for typed languages including non-functional features. Our results establish connections between what is computable in various programming languages and what is true inside various realizability toposes. We consider some examples of logical formulae to illustrate these ideas, in particular their application to exact real-number computability.
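
For orientation, the clause that drives a realizability interpretation is the one for implication; over a partial combinatory algebra it is usually stated as follows (a standard definition, not something specific to this paper, in LaTeX notation):

    e \Vdash A \to B
      \quad\text{iff}\quad
    \forall a.\; \bigl( a \Vdash A \;\Rightarrow\; e \cdot a \text{ is defined and } e \cdot a \Vdash B \bigr)

Logical full abstraction then asks that the formulae realizable by such untyped elements and those realizable by terms of the typed language coincide.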

15.
The authors describe CTDNet, a data-driven reduction machine for the concurrent execution of applicative functional programs in the form of lambda calculus expressions. Such programs are stored as binary-tree-structured process graphs in which all processes maintain pointers to their immediate neighbors (i.e. ancestor and two children). Processes are of two basic types: master processes, which represent the original process graph, and slave processes, which carry out the actual executional work and are dynamically created and destroyed. CTDNet uses a distributed eager evaluation scheme with a modification to evaluate conditional expressions lazily, together with a form of distributed string reduction with some graphlike modifications.

16.
Higher-order representations of objects such as programs, proofs, formulas, and types have become important to many symbolic computation tasks. Systems that support such representations usually depend on the implementation of an intensional view of the terms of some variant of the typed lambda calculus. New notations have been proposed for the lambda calculus that provide an excellent basis for realizing such implementations. There are, however, several choices in the actual deployment of these notations the practical consequences of which are not currently well understood. We attempt to develop such an understanding here by examining the impact on performance of different combinations of the features afforded by such notations. Among the facets examined are the treatment of bound variables, eagerness and laziness in substitution and reduction, the ability to merge different structure traversals into one, and the virtues of annotations on terms that indicate their dependence on variables bound by external abstractions. We complement qualitative assessments with experiments conducted by executing programs in a language that supports an intensional view of lambda terms while varying relevant aspects of the implementation of the language. Our study provides insights into the preferred approaches to representing and reducing lambda terms and also exposes characteristics of computations that have a somewhat unanticipated effect on performance.
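
To make "treatment of bound variables" concrete: with bare de Bruijn indices, β-reduction already needs an index-shifting traversal and a substitution traversal, as in the textbook Haskell sketch below (ours); the notations the paper compares are, roughly, different ways of delaying, merging, and annotating such traversals.

    data Term = Var Int | Lam Term | App Term Term

    -- shift d c t adds d to every index in t that points outside the
    -- current binding depth c (the cutoff).
    shift :: Int -> Int -> Term -> Term
    shift d c (Var k)   = if k < c then Var k else Var (k + d)
    shift d c (Lam t)   = Lam (shift d (c + 1) t)
    shift d c (App t u) = App (shift d c t) (shift d c u)

    -- subst j s t replaces index j in t by s, adjusting s under binders.
    subst :: Int -> Term -> Term -> Term
    subst j s (Var k)   = if k == j then s else Var k
    subst j s (Lam t)   = Lam (subst (j + 1) (shift 1 0 s) t)
    subst j s (App t u) = App (subst j s t) (subst j s u)

    -- One beta step at the root: (\ t) u  reduces to  t[0 := u].
    beta :: Term -> Term
    beta (App (Lam t) u) = shift (-1) 0 (subst 0 (shift 1 0 u) t)
    beta t               = t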

17.
The LR parser generators that are bundled with many functional programming language implementations produce code that is untyped, needlessly inefficient, or both. We show that, using generalized algebraic data types, it is possible to produce parsers that are well-typed (so they cannot unexpectedly crash or fail) and nevertheless efficient. This is a pleasing result as well as an illustration of the new expressiveness offered by generalized algebraic data types.
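
The essence of the result fits in a few lines of Haskell (a toy, not the generated parser code): a GADT indexes the parser's semantic-value stack by the types of the values it holds, so a reduce action whose pattern disagrees with the grammar's types is rejected by the type checker instead of crashing at run time.

    {-# LANGUAGE DataKinds, GADTs, TypeOperators #-}

    -- The semantic-value stack of an LR parser, with its contents in its type.
    data Stack ts where
      Empty :: Stack '[]
      Push  :: a -> Stack ts -> Stack (a ': ts)

    -- A reduce action for a rule such as  E -> E '+' E :
    -- it applies only to a stack whose top three cells have the expected types.
    reduceAdd :: Stack (Int ': Char ': Int ': ts) -> Stack (Int ': ts)
    reduceAdd (Push e2 (Push _plus (Push e1 rest))) = Push (e1 + e2) rest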

18.
The general concern of the Jacopini technique is the question: “Is it consistent to extend a given lambda calculus with certain equations?” The technique was introduced by Jacopini in 1975 in his proof that in the untyped lambda calculus Ω is easy, i.e., Ω can be assumed equal to any other (closed) term without violating the consistency of the lambda calculus. The presentations of the Jacopini technique that are known from the literature are difficult to understand and hard to generalise. In this paper we generalise the Jacopini technique for arbitrary lambda calculi. We introduce the concept of proof-replaceability by which the structure of the technique is simplified considerably. We illustrate the simplicity and generality of our formulation of the technique with some examples. We apply the Jacopini technique to the λμ-calculus, and we prove a general theorem concerning the consistency of extensions of the λμ-calculus of a certain form. Many well known examples (e.g., the easiness of Ω) are immediate consequences of this general theorem.
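
For reference, the term and the notion involved are (standard definitions, recalled here only for readability, in LaTeX notation):

    \Omega \;=\; (\lambda x.\, x\,x)\,(\lambda x.\, x\,x)
    \qquad
    \Omega \text{ is easy} \quad\text{iff}\quad \lambda\beta + (\Omega = M) \text{ is consistent for every closed term } M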

19.
20.
This paper presents an algorithm for constructing natural deduction proofs in the propositional intuitionistic and classical logics, according to the analogy relating intuitionistic propositional formulas and natural deduction proofs, respectively, to the types and terms of simple type theory. Proofs are constructed as closed terms in the simply typed lambda calculus. The soundness and completeness of this method are proved.
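
Under the analogy the algorithm relies on (the Curry-Howard correspondence), finding a natural deduction proof of a propositional formula amounts to finding a closed simply typed term whose type is that formula. Two familiar intuitionistic examples, written here in Haskell with type variables standing for propositional atoms:

    -- A proof of  A -> (B -> A)
    proofK :: a -> b -> a
    proofK x _ = x

    -- A proof of  (A -> (B -> C)) -> ((A -> B) -> (A -> C))
    proofS :: (a -> b -> c) -> (a -> b) -> a -> c
    proofS f g x = f x (g x)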
