Similar Documents
20 similar documents found (search time: 15 ms)
1.
In an attempt to accommodate natural language phenomena involving nominalization and self-application, various researchers in formal semantics have proposed abandoning the hierarchical type system which Montague inherited from Russell, in favour of more flexible type regimes. We briefly review the main extant proposals, and then develop a new approach, based semantically on Aczel's notion of Frege structure, which implements a version of subsumption polymorphism. Nominalization is achieved by virtue of the fact that the types of predicative and propositional complements are contained in the type of individuals. Russell's paradox is avoided by placing a type-constraint on lambda-abstraction, rather than by restricting comprehension. Kamareddine is grateful to the Department of Mathematics and Computing Science, Technical University of Eindhoven, for their financial support and hospitality during the academic year 1991–92. Klein's work has been carried out as part of the research programmes of the dyana project (BR 3175 and BR 6852), funded by CEC esprit Basic Research Division, and of the Human Communication Research Centre, supported by the UK Economic and Social Research Council.

2.
Meyer and Wand established that the type of a term in the simply typed λ-calculus may be related in a straightforward manner to the type of its call-by-value CPS transform. This typing property may be extended to Scheme-like continuation-passing primitives, from which the soundness of these extensions follows. We study the extension of these results to the Damas-Milner polymorphic type assignment system under both the call-by-value and call-by-name interpretations. We obtain CPS transforms for the call-by-value interpretation, provided that the polymorphic let is restricted to values, and for the call-by-name interpretation with no restrictions. We prove that there is no call-by-value CPS transform for the full Damas-Milner language that validates the Meyer-Wand typing property and is equivalent to the standard call-by-value transform up to operational equivalence. This is a revised version of a paper presented at the ACM SIGPLAN Workshop on Continuations, San Francisco, June 1992. This work was sponsored by the Defense Advanced Research Projects Agency, CSTO, under the title The Fox Project: Advanced Development of Systems Software, ARPA Order No. 8313, issued by ESD/AVS under Contract No. F19628-91-C-0168. Supported by a National Science Foundation Graduate Fellowship.
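The Meyer-Wand typing property mentioned in this abstract can be illustrated at the type level. A minimal sketch (not taken from the paper; the type encoding is an assumption for illustration) of the call-by-value type translation, where a base type maps to itself and a function type A → B maps to A* → (B* → o) → o for an answer type o:

```python
# Hedged sketch of the Meyer–Wand call-by-value CPS type translation.
# Types are encoded as: a string for a base type (e.g. "int"), or a
# pair (A, B) for the function type A -> B. "o" is the answer type.

def cps_type(t):
    """Translate a simple type under the call-by-value CPS transform:
    base* = base, and (A -> B)* = A* -> (B* -> o) -> o."""
    if isinstance(t, str):          # base type maps to itself
        return t
    a, b = t                        # t encodes A -> B
    return (cps_type(a), ((cps_type(b), "o"), "o"))

def top_level(t):
    """A whole program of type A is transformed to type (A* -> o) -> o."""
    return ((cps_type(t), "o"), "o")
```

For example, `int -> int` translates to `int -> (int -> o) -> o`, matching the intuition that a CPS function takes its argument together with a continuation for the result.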

3.
Parametric overloading refers to the combination of parametric polymorphism and overloading of polymorphic operators. The formal basis for parametric overloading, proposed by Kaes and extended by Wadler and Blott, is based on type predicates. In this paper another approach to type-checking for parametric overloading is proposed. The resulting type system loosens some of the restrictions required of overload instance types for type-checking, while also providing fresh insight into type-checking for parametric overloading. In this system, the kind for a type variable characterizes the set of closed type expressions which may be substituted for that variable. A theory of equality and subkinding for this system is presented, and algorithms for emptiness-checking, subkinding and intersection are provided. This kind system is used as the basis for an extension of Milner's W algorithm for ML-style type inference to kinded type inference. Finally the kinded type system is verified to be sound and complete with respect to the system of type predicates proposed by Wadler and Blott. Received: 11 May 1993 / 28 November 1994

4.
The Spin model checker and its specification language Promela have been used extensively in industry and academia to check the logical properties of distributed algorithms and protocols. Model checking with Spin involves reasoning about a system via an abstract Promela specification, thus the technique depends critically on the soundness of this specification. Promela includes a rich set of data types including first-class channels, but the language syntax restricts the declaration of channel types so that it is not generally possible to deduce the complete type of a channel directly from its declaration. We present the design and implementation of Etch, an enhanced type checker for Promela, which uses constraint-based type inference to perform strong type checking of Promela specifications, allowing static detection of errors that Spin would not detect until simulation/verification time, or that Spin may miss completely. We discuss theoretical and practical problems associated with designing a type system and type checker for an existing language, and formalise our approach using a Promela-like calculus. To handle subtyping between base types, we present an extension to a standard unification algorithm to solve a system of equality and subtyping constraints, based on bounded substitutions.

5.
Lisp applications need to show a reasonable cost-benefit relationship between the offered expressiveness and their demand for storage and run-time. Drawbacks in efficiency, apparent in Lisp as a dynamically typed programming language, can be avoided by optimizations. Statically inferred type information can be decisive for the success of these optimizations. This paper describes a practical approach to type inference realized in a module and application compiler for EuLisp. The approach is partly related to Milner-style polymorphic type inference, but differs by describing functions with generic type schemes. Dependencies between argument and result types can be expressed more precisely by using generic type schemes of several lines than by using the common one-line type schemes. Generic type schemes contain types of a refined complementary lattice and bounded type variables. Besides standard and defined types, so-called strategic types (e.g. singleton, zero, number-list) are combined into the type lattice. Local, global and control flow inference using generic type schemes with refined types generate precise typings of defined functions. Due to module compilation, inferred type schemes of exported functions can be stored in export interfaces, so they may be reused when imported elsewhere. This work was supported by the German Federal Ministry for Research and Technology (BMFT) within the joint project APPLY. The partners in this project are the Christian Albrechts University Kiel, the Fraunhofer Institute for Software Engineering and Systems Engineering (ISST), the German National Research Centre for Computer Science (GMD), and VW-GEDAS.

6.
We study here Type Inference and Type Checking for queries over the execution traces of Business Processes. We define formal models for such execution traces, allowing us to capture various realistic scenarios of partial information about these traces. We then define corresponding notions of types, and the problems of type inference and type checking in this context. We further provide a comprehensive study of the decidability and complexity of these problems in various cases, and suggest efficient algorithms where possible.

7.
We present a graph grammar based type inference system for a totally graphic development language. NiMo (Nets in Motion) can be seen as a graphic equivalent to Haskell that acts as an on-line tracer and debugger. Programs are process networks that evolve giving total visibility of the execution state, and can be interactively completed, changed or stored at any step. In such a context, type inference must be incremental. During the net construction or modification only type safe connections are allowed. The user visualizes the type information evolution and, in case of conflict, can easily identify the causes. Though based on the same ideas, the type inference system has significant differences with its analogues in functional languages. Process types are a non-trivial generalization of functional types to handle multiple outputs and deferred arguments even in higher order parameters, partial application in any order and curried-uncurried coercion. Here we present the elements to model graphical inference, the notion of structural and non-structural equivalence of type graphs, and a graph unification and composition calculus for typing nets in an incremental way.

8.
Domain experts typically have detailed knowledge of the concepts that are used in their domain; however they often lack the technical skills needed to translate that knowledge into model-driven engineering (MDE) idioms and technologies. Flexible or bottom-up modelling has been introduced to assist with the involvement of domain experts by promoting the use of simple drawing tools. In traditional MDE the engineering process starts with the definition of a metamodel which is used for the instantiation of models. In bottom-up MDE example models are defined at the beginning, letting the domain experts and language engineers focus on expressing the concepts rather than spending time on technical details of the metamodelling infrastructure. The metamodel is then created manually or inferred automatically. The flexibility that bottom-up MDE offers comes with the cost of having nodes in the example models left untyped. As a result, concepts that might be important for the definition of the domain will be ignored while the example models cannot be adequately re-used in future iterations of the language definition process. In this paper, we propose a novel approach that assists in the inference of the types of untyped model elements using Constraint Programming. We evaluate the proposed approach in a number of example models to identify the performance of the prediction mechanism and the benefits it offers. The reduction in the effort needed to complete the missing types reaches up to 91.45% compared to the scenario where the language engineers had to identify and complete the types without guidance.

9.
Fuzzy inference, a data processing method based on fuzzy theory that has found wide use in the control field, is reviewed. Consumer electronics, which accounts for most current applications of this concept, does not require very high speeds. Although software running on a conventional microprocessor can perform these inferences, high-speed control applications require much greater speeds. A fuzzy inference data processor that operates at 200,000 fuzzy logic inferences per second and features 12-bit input and 16-bit output resolution is described.
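The inference scheme such a processor accelerates can be sketched in software. A minimal sketch (the function names and rule encoding are illustrative assumptions, not the processor's actual interface) of min-max (Mamdani-style) inference with triangular membership functions and centroid defuzzification:

```python
# Hedged sketch of one min-max fuzzy inference step: each rule's firing
# strength clips its output set (min), rules are combined by max, and the
# result is defuzzified by the centroid over a discretized output universe.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def infer(x, rules, universe):
    """rules: list of (input_mf_params, output_mf_params) triples of (a, b, c).
    Returns the centroid of the max-combined, min-clipped output sets."""
    num = den = 0.0
    for y in universe:
        mu = max(min(tri(x, *inp), tri(y, *out)) for inp, out in rules)
        num += mu * y
        den += mu
    return num / den if den else 0.0
```

A dedicated processor evaluates the min/max operations for all rules in parallel, which is where the speedup over a general-purpose microprocessor comes from.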

10.
We present a generalized let-polymorphic type inference algorithm, prove that any of its instances is sound and complete with respect to the Hindley/Milner let-polymorphic type system, and find a condition on two instance algorithms so that one algorithm should find type errors earlier than the other. By instantiating the generalized algorithm with different parameters, we can obtain not only the two opposite algorithms (the bottom-up standard algorithm W and the top-down algorithm M) but also other hybrid algorithms which are used in real compilers. Such instances' soundness and completeness follow automatically, and their relative earliness in detecting type errors is determined by checking a simple condition. The set of instances of the generalized algorithm is a superset of those used in the two most popular ML compilers: SML/NJ and OCaml. This work is supported by Creative Research Initiatives of the Korean Ministry of Science and Technology National Creative Research Initiative Center, http://ropas.kaist.ac.kr. Work done while the third author was associated with Korea Advanced Institute of Science and Technology. Hyunjun Eo: He is a Ph.D. candidate in the Department of Computer Science at KAIST (Korea Advanced Institute of Science and Technology). He received his bachelor's degree and master's degree in Computer Science from KAIST in 1996 and 1998, respectively. His research interest has been on static program analysis, fixpoint iteration algorithms, and higher-order and typed languages. From fall 1998, he has been a research assistant of the National Creative Research Initiative Center for Research on Program Analysis System. He is currently working on developing a tool for automatic generation of program analyzers. Oukseh Lee: He is a Ph.D. candidate in the Department of Computer Science at KAIST (Korea Advanced Institute of Science and Technology). He received his bachelor's and master's degrees in Computer Science from KAIST in 1995 and 1997, respectively. His research interest has been on static program analysis, type systems, programming language implementation, higher-order and typed languages, and program verification. From 1998, he has been a research assistant of the National Creative Research Initiative Center for Research on Program Analysis System. He is currently working on compile-time analyses and verification for the memory behavior of programs. Kwangkeun Yi, Ph.D.: His research interest has been on semantics-based program analysis and systems application of language technologies. After his Ph.D. from the University of Illinois at Urbana-Champaign he joined the Software Principles Research Department at Bell Laboratories, where he worked on various static analysis approaches for higher-order and typed programming languages. From 1995 to 2003 he was a faculty member in the Department of Computer Science, Korea Advanced Institute of Science and Technology. Since fall 2003, he has been a faculty member in the School of Computer Science and Engineering, Seoul National University.
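Both algorithm W and algorithm M, and every hybrid instance in between, rest on the same unification core. A minimal sketch of Robinson-style unification over a toy type representation (the string/tuple encoding is an assumption for illustration, not the paper's formalism):

```python
# Hedged sketch of the unification core shared by algorithms W and M.
# Types are strings (type variables start with "'"), or tuples like
# ("->", A, B) for the function type A -> B.

def resolve(t, s):
    """Follow substitution s until t is no longer a bound variable."""
    while isinstance(t, str) and t in s:
        t = s[t]
    return t

def occurs(v, t, s):
    """Occurs check: does variable v appear inside type t under s?"""
    t = resolve(t, s)
    if t == v:
        return True
    return isinstance(t, tuple) and any(occurs(v, u, s) for u in t[1:])

def unify(t1, t2, s):
    """Extend substitution s so t1 and t2 become equal, or raise TypeError."""
    t1, t2 = resolve(t1, s), resolve(t2, s)
    if t1 == t2:
        return s
    for a, b in ((t1, t2), (t2, t1)):
        if isinstance(a, str) and a.startswith("'"):
            if occurs(a, b, s):
                raise TypeError("occurs check failed")
            s[a] = b
            return s
    if isinstance(t1, tuple) and isinstance(t2, tuple) and t1[0] == t2[0]:
        for u, v in zip(t1[1:], t2[1:]):
            s = unify(u, v, s)
        return s
    raise TypeError(f"cannot unify {t1} and {t2}")
```

W and M differ only in *when* they call this core: W unifies bottom-up after visiting subterms, while M pushes an expected type down and unifies eagerly, which is why M can report certain errors earlier.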

11.
We present a new propositional calculus that has desirable natures with respect to both automatic reasoning and computational complexity: we introduce an inference rule, called permutation, into a cut-free Gentzen type propositional calculus. It allows us to obtain a system which (1) guarantees the subformula property and (2) has polynomial size proofs for hard combinatorial problems, such as pigeonhole principles. We also discuss the relative efficiency of our system. Frege systems polynomially prove the partial consistency of our system.

12.
The aim of this article is to reconsider the idea of human intelligence and machine intelligence as two separate entities. To do so, we defined a new concept that we call polymorphic intelligence. Such a concept comes up as a possible answer to many "false" paradigms and philosophical and conceptual orientations that for decades have pervaded many research fields, such as education, art, literature, psychology, pedagogy, science, technology, and AI. We believe, indeed, that in this exact moment of human history, it becomes necessary to clarify with a strong theoretical paradigm what is the real relationship between machines and humans. Therefore, we propose to abandon the mental scheme by which intelligence is an exclusive prerogative of humans in order to embrace the idea that machines have started to express a real collaborative and/or competitive force, that they are able to produce ideation and inspiration, and to contribute to the wealth of ideas. This work was presented in part at the 12th International Symposium on Artificial Life and Robotics, Oita, Japan, January 25–27, 2007.

13.
The organization of parallel inference in dynamic decision support systems (DDSS) of a semiotic type, oriented towards solving ill-formed problems in dynamic applied domains, is considered. As the knowledge representation model, production rules are used, reflecting expert knowledge about a problem domain, an environment, and decision-making processes. The main concepts and assertions defining the possibility and impossibility of executing production rules in parallel are given. Several types of parallelism in an inference process are introduced, and the corresponding algorithm of parallel inference is described. Thus, the purpose of this paper is to develop and research parallel inference methods and procedures that provide efficient processing of a large number of production rules for DDSS of a semiotic type.
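A standard way to decide when production rules may fire in parallel is a read/write conflict check. A minimal sketch under that assumption (the paper's own criteria may be richer; the rule encoding here is illustrative):

```python
# Hedged sketch: two production rules can fire in parallel only if neither
# writes a fact that the other reads or writes (a read/write conflict test).
# Each rule is encoded as (reads, writes), two sets of fact names.

def conflict(r1, r2):
    """True if firing r1 and r2 concurrently could interfere."""
    read1, write1 = r1
    read2, write2 = r2
    return bool(write1 & (read2 | write2) or write2 & read1)

def parallel_groups(rules):
    """Greedily partition enabled rules into conflict-free batches;
    each batch can be executed in parallel, batches run in sequence."""
    groups = []
    for r in rules:
        for g in groups:
            if all(not conflict(r, other) for other in g):
                g.append(r)
                break
        else:
            groups.append([r])
    return groups
```

The fewer conflicts among enabled rules, the fewer sequential batches remain, which is the source of the speedup such a parallel inference procedure targets.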

14.
15.
Defunctionalization is a program transformation that eliminates functions as first-class values. We show that defunctionalization can be viewed as a type-preserving transformation of an extension of System F with guarded algebraic data types into itself. We also suggest that defunctionalization is an instance of concretization, a more general technique that allows eliminating constructs other than functions. We illustrate this point by presenting two new type-preserving transformations that can be viewed as instances of concretization. One eliminates Rémy-style polymorphic records; the other eliminates the dictionary records introduced by the standard compilation scheme for Haskell's type classes. This is a revised and extended version of an earlier conference paper [30].
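The core of defunctionalization can be shown in a few lines. A minimal sketch (the two lambdas being eliminated are hypothetical, chosen only to illustrate the transformation): each closure becomes a tagged record of its free variables, and a single first-order `apply` function dispatches on the tag.

```python
# Hedged sketch of defunctionalization: first-class functions are replaced
# by tagged data (tag + captured free variables), and every call site goes
# through one first-order dispatcher.

def apply(closure, arg):
    tag, env = closure
    if tag == "add_n":          # was: lambda x: x + n   (captures n)
        return arg + env["n"]
    if tag == "compose":        # was: lambda x: f(g(x)) (captures f, g)
        return apply(env["f"], apply(env["g"], arg))
    raise ValueError(f"unknown closure tag: {tag}")

# Closure construction sites become data construction sites:
add3 = ("add_n", {"n": 3})
add5 = ("add_n", {"n": 5})
add8 = ("compose", {"f": add3, "g": add5})
```

Guarded algebraic data types enter the picture because they let the typed version of `apply` refine the result type per tag, which is what makes the transformation type-preserving.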

16.
17.
Along with the explosive increase of product data management systems (PDMs), integrating polymorphic PDMs is becoming one of the focuses in the field. After briefly describing several features of PDM, market requirements, and the performance required for integrating polymorphic PDMs, the paper discusses some key technologies involved in virtual-enterprise-oriented integrated development. Some implementing strategies and procedures are proposed.

18.
Program models allowing for polymorphism are proposed. The emphasis is on the functional-semantic aspect of polymorphism. The relationship of the polymorphic models with other, more abstract program models is examined. Translated from Kibernetika, No. 1, pp. 29–34, January–February, 1991.

19.
Collaborative Internet worm containment
Large-scale worm outbreaks that lead to distributed denial-of-service attacks pose a major threat to Internet infrastructure security. Fast worm containment is crucial for minimizing damage and preventing flooding attacks against network hosts.

20.
Nowadays, location-related information is highly accessible to mobile users via issuing Location-Dependent Spatial Queries (LDSQs) with respect to their locations wirelessly to Location-Based Service (LBS) servers. Due to the limited mobile device battery energy, scarce wireless bandwidth, and heavy LBS server workload, the number of LDSQs submitted over wireless channels to LBS servers for evaluation should be minimized as appropriate. In this paper, we exploit query containment techniques for LDSQs (called LDSQ containment) to enable mobile clients to determine whether the result of a new LDSQ Q′ is completely covered by that of another LDSQ Q previously answered by a server (denoted by Q′ ⊆ Q) and to answer Q′ locally if Q′ ⊆ Q. Thus, many LDSQs can be reduced from server evaluation. To support LDSQ containment, we propose a notion of containment scope, which represents a spatial area corresponding to an LDSQ result wherein all semantically matched LDSQs are answerable with the result. Through a comprehensive simulation, our proposed approach significantly outperforms existing techniques.
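For circular range LDSQs the containment test Q′ ⊆ Q reduces to simple geometry. A minimal sketch (the encoding of queries as circles and the cache layout are illustrative assumptions, and "containment scope" here is simplified to the old query's own circle):

```python
# Hedged sketch of LDSQ containment for circular range queries: a new query
# Q' = (center, radius) is answerable from a cached result of Q when Q's
# circle fully covers Q''s circle, i.e. dist(centers) + r_new <= r_old.
from math import dist

def contained(q_new, q_old):
    (c1, r1), (c2, r2) = q_new, q_old
    return dist(c1, c2) + r1 <= r2

def answer_locally(q_new, cache):
    """cache: list of (query, result_points). Returns the points of a
    covering cached result filtered to q_new's range, or None if no
    cached query contains q_new (the server must then be contacted)."""
    for q_old, result in cache:
        if contained(q_new, q_old):
            center, radius = q_new
            return [p for p in result if dist(p, center) <= radius]
    return None
```

Every query answered from the cache saves one wireless round trip and one unit of server work, which is exactly the saving the paper quantifies by simulation.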


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号