Similar Literature
 20 similar documents found.
1.
We claim that a continuation-style semantics of a programming language can provide a starting point for constructing its proof system. The basic idea is to see weakest preconditions as a particular instance of continuation-style semantics, and hence to interpret correctness assertions (e.g. Hoare triples {p} C {r}) as inequalities over continuations. This approach also shows a correspondence between labels in a program and annotations. Received July 1997 / Accepted in revised form August 1999
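To make the claim concrete, here is a minimal sketch (our notation, not the paper's) of reading a Hoare triple as an inequality once weakest preconditions are viewed as a continuation-style semantics: the postcondition r plays the role of the continuation that the semantics of C transforms.

```latex
% Minimal sketch; wp and the ordering are the usual predicate-transformer ones
% (partial vs. total correctness details elided), not necessarily the paper's.
\[
  \{p\}\; C\; \{r\} \quad\text{holds iff}\quad p \;\le\; \mathit{wp}(C, r),
\]
% where \le is the implication order on predicates and wp(C, -) is the
% denotation of C applied to its "continuation" r.
```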

2.
When programs are intended for parallel execution it becomes critical to determine whether the evaluations of two expressions can be carried out independently. We provide a scheme for making such determinations in a typed language with higher-order constructs and imperative features. The heart of our scheme is a mechanism for estimating the support of an expression, i.e., the set of global variables involved in its evaluation. This computation requires knowledge of all the aliases of an expression. The inference schemes are presented in a compositional fashion reminiscent of abstract interpretation. We prove the soundness of our estimates with respect to the standard semantics of the language. Supported by National Science Foundation Grant DCR-8602072.
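As a rough illustration of the idea (a toy sketch only, with hypothetical names; the paper's inference system is typed, compositional and alias-aware), the support of an expression can be over-approximated by walking its syntax and consulting alias information:

```python
# Toy sketch of "support" estimation: the set of global variables an
# expression's evaluation may involve. Illustration only, not the paper's
# inference system; GLOBALS and ALIASES are hypothetical inputs.

GLOBALS = {"g", "h"}          # hypothetical set of global variable names
ALIASES = {"p": {"g"}}        # hypothetical alias information: p may alias g

def support(expr):
    """Return an over-approximation of the globals involved in evaluating expr."""
    kind = expr[0]
    if kind == "var":                       # ("var", name)
        name = expr[1]
        s = {name} if name in GLOBALS else set()
        return s | ALIASES.get(name, set())
    if kind == "const":                     # ("const", value)
        return set()
    if kind == "app":                       # ("app", fun_expr, arg_expr)
        return support(expr[1]) | support(expr[2])
    if kind == "assign":                    # ("assign", name, rhs_expr)
        lhs = {expr[1]} if expr[1] in GLOBALS else ALIASES.get(expr[1], set())
        return lhs | support(expr[2])
    raise ValueError(kind)

# Two expressions whose estimated supports are disjoint can be evaluated independently.
e1 = ("assign", "g", ("const", 1))
e2 = ("app", ("var", "f"), ("var", "x"))
print(support(e1), support(e2), support(e1).isdisjoint(support(e2)))
```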

3.
We develop a denotational semantics for POOL, a parallel object-oriented programming language. The main contribution of this semantics is an accurate mathematical model of the most important concept in object-oriented programming: the object. This is achieved by structuring the semantics in layers working at three different levels: for statements, objects and programs. For each of these levels we define a specialized mathematical domain of processes, which we use to assign a meaning to each language construct. This is done in the mathematical framework of complete metric spaces. We also define operators that translate between these domains. At the program level we give a precise definition of the observable input/output behaviour of a particular program, which could be used at a later stage to decide the issue of full abstractness. We illustrate our semantic techniques by first applying them to a toy language similar to CSP. This paper describes work done in ESPRIT Basic Research Action 3020, Integration.

4.
We present a theoretical basis for supporting subjective and conditional probabilities in deductive databases. We design a language that allows a user greater expressive power than classical logic programming. In particular, a user can express the fact that A is possible (i.e. A has non-zero probability), B is possible, but (A ∧ B) as a whole is impossible. A user can also freely specify probability annotations that may contain variables. The focus of this paper is to study the semantics of programs written in such a language in relation to probability theory. Our model theory, which is founded on the classical one, captures the uncertainty described in a probabilistic program at the level of Herbrand interpretations. Furthermore, we develop a fixpoint theory and a proof procedure for such programs and present soundness and completeness results. Finally we characterize the relationships between probability theory and the fixpoint, model, and proof theory of our programs.
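A one-line worked instance of the expressiveness claim (ours, for illustration): the three constraints below are jointly satisfiable in probability theory, which is what lets a user assert them in such a language.

```latex
% Worked illustration of the expressiveness claim.
\[
  P(A) > 0, \qquad P(B) > 0, \qquad P(A \wedge B) = 0 .
\]
% E.g. a single fair coin toss with A = "heads" and B = "tails":
% P(A) = P(B) = 1/2, yet P(A \wedge B) = 0.
```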

5.
We describe Chisel, a tool that synthesizes a program slicer directly from a given algebraic specification S of a programming language's operational semantics. S is assumed to be a rewriting logic specification, given in Maude, while the program is a ground term of this specification. Chisel takes S and synthesizes language constructs, i.e., instructions, that produce features relevant for slicing, e.g., data dependency. We implement syntheses adjusted to each feature as model checking properties over an abstract representation of S. The synthesis results are used by a traditional interprocedural slicing algorithm that we parameterize by the synthesized language features. We present the tool on two language paradigms: a high-level imperative language and a low-level assembly language. Computing program slices for these languages allows for extracting traceability properties in standard compilation chains and makes our tool well suited to the validation of embedded system designs. Chisel's slicing benchmark evaluation is based on benchmarks used in avionics.
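For readers unfamiliar with slicing, the sketch below illustrates the notion Chisel targets on a throwaway Python fragment; it is our own illustration of backward slicing and data/control dependence, not Chisel input or output (Chisel operates on Maude specifications of an operational semantics).

```python
# Generic illustration of backward slicing (not Chisel itself).
# Slicing criterion: the value of `total` at the final print.

n = 5
total = 0            # in the slice: defines total
unused = []          # NOT in the slice: no data/control dependence on total
for i in range(n):   # in the slice: controls the update of total
    total += i       # in the slice: data dependence on total and i
    unused.append(i)     # NOT in the slice
print(total)         # slicing criterion
```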

6.
Determining points-to sets is an important static-analysis problem. Most of the classic static analyses (used, e.g., by compilers or in programming environments) rely on knowing which variables might be used or defined by each expression in a program. In the presence of pointers, the use/def set of an expression like *p = *q can only be determined given (safe) points-to sets for p and q. Previous work has shown that both precise flow-sensitive and precise flow-insensitive pointer analysis is NP-hard, even when restricted to single-procedure programs with no dynamic memory allocation. In this paper, we show that it is not even possible to compute good approximations to the precise solutions (i.e., to compute points-to sets whose sizes are within a constant factor of the sizes of the precise points-to sets) unless P=NP. Received: 1 November 2001 / 4 February 2002
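The dependence of use/def information on points-to sets can be made concrete with a small, hypothetical sketch (variable names and sets are ours, not from the paper):

```python
# Hedged sketch: given (safe) points-to sets, compute the def/use sets of the
# statement `*p = *q`.

points_to = {"p": {"x"}, "q": {"y", "z"}}   # hypothetical points-to map

def def_use_of_star_assign(lhs_ptr, rhs_ptr, pts):
    """def/use sets of `*lhs_ptr = *rhs_ptr` under points-to map `pts`."""
    defs = set(pts[lhs_ptr])                       # anything p may point to may be defined
    uses = {lhs_ptr, rhs_ptr} | set(pts[rhs_ptr])  # both pointers plus q's targets are used
    return defs, uses

print(def_use_of_star_assign("p", "q", points_to))
# -> ({'x'}, {'p', 'q', 'y', 'z'}), up to set ordering
```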

7.
Preference logic programming (PLP) is an extension of logic programming for declaratively specifying problems requiring optimization or comparison and selection among alternative solutions to a query. PLP essentially separates the programming of a problem itself from the specification of the criteria for selecting among its solutions. In this paper we present a declarative method for specifying preference logic programs. The method introduces a precise formalization for the syntax and semantics of PLP. The syntax of a preference logic program contains two disjoint sets of definite clauses, separating a core program specifying a general computational problem from its preference rules for optimization; the semantics of PLP is given based on the Herbrand model and fixed point theory, where how preferences affect the least Herbrand model of a logic program is interpreted as a sequence of meta-level mapping operations. In addition, we present an operational semantics based on a new resolution strategy and a memoized recursive algorithm for computing strictly stratified logic programs with well-formed preferences, and we further show that the operational semantics of such a preference logic program is consistent with its declarative semantics.
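Schematically (in illustrative notation, not necessarily the paper's concrete syntax), the separation looks like a core program that enumerates all answers plus a preference part that selects among them, e.g. shortest paths:

```latex
% Schematic example of the two-part structure; notation is ours.
\begin{align*}
&\text{Core program:}\\
&\qquad \mathit{path}(X,Y,C) \leftarrow \mathit{edge}(X,Y,C).\\
&\qquad \mathit{path}(X,Y,C) \leftarrow \mathit{edge}(X,Z,C_1),\ \mathit{path}(Z,Y,C_2),\ C = C_1 + C_2.\\
&\text{Preference rule: prefer } \mathit{path}(X,Y,C_1) \text{ over } \mathit{path}(X,Y,C_2) \text{ whenever } C_1 < C_2.
\end{align*}
```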

8.
A refinement calculus for the development of real-time systems is presented. The calculus is based upon a wide-spectrum language called TAM (the Temporal Agent Model), within which both functional and timing properties can be expressed in either abstract or concrete terms. A specification-oriented semantics is given for the language. Program development is considered as a refinement process, i.e. the calculation of a structured program from an unstructured specification. An example program is developed.

9.
A formal model of analogy is introduced in the logic programming setting, and an analogical reasoning program (called DIANA, i.e. Declarative Inference by ANAlogy) is developed in accordance with precise procedural and declarative semantics. Given the source and target domains of analogy as two logic programs P_s and P_t, together with a specification S of the analogical correspondence between predicate symbols, atoms involving these symbols are analogically derived from P = P_s ∪ P_t given S, which are not derivable from P_s or P_t or P_s ∪ P_t alone. In this paper, the requirements of the analogical process are first stated. The declarative semantics of analogy is then given, by defining the least analogical model of P as an extension of the classical semantics of Horn clauses. A procedural semantics is also described, in terms of an extension of SLD resolution. Both semantics rely on implicit analogical axioms defining the kind of analogical reasoning envisaged. The implementation of DIANA has been done in Reflective Prolog, a metalogic programming language previously developed by the first two authors. It is shown that analogical axioms can be viewed as an instance of reflection axioms used in Reflective Prolog. By exploiting this feature, the implementation of DIANA is argued to be sound w.r.t. the defined semantics. Examples of analogical reasoning in DIANA are also described. By comparison with the AI literature on analogy, it is claimed that this is the first approach which gives a declarative semantics to analogical reasoning, thanks to the possibility of carrying over in this field the basic logic programming concepts.

10.
We present the preliminary design of a programming model for building reliable systems with distributed state from collections of potentially unreliable components. Our transactor model provides constructs for maintaining consistency among the states of distributed components. Our intention is that transactors should support key aspects of both traditional distributed transactions, e.g., for electronic commerce, and systems with weaker consistency requirements, e.g., peer-to-peer file- and process-sharing systems. In this paper, we motivate the need for language support for maintenance of distributed state, describe the design goals for the transactor model, provide an operational semantics for a simple transactor calculus, and provide several examples of applications of the transactor model in a higher-level language. The authors would like to thank James Leifer for detailed comments on previous drafts of this paper, and the anonymous FOCLASA referees for helpful feedback.

11.
In a previous paper (Blair et al. 2001), the authors showed that the mechanism underlying Logic Programming can be extended to handle the situation where the atoms are interpreted as subsets of a given space X. The view of a logic program as a one-step consequence operator along with the concepts of supported and stable model can be transferred to such situations. In this paper, we show that we can further extend this paradigm by creating a new one-step consequence operator by composing the old one-step consequence operator with a monotonic idempotent operator (miop) in the space of all subsets of X, 2^X. We call this extension set based logic programming. We show that such a set based formalism for logic programming naturally supports a variety of options. For example, if the underlying space has a topology, one can insist that the new one-step consequence operator always produces a closed set or always produces an open set. The flexibility inherent in the semantics of set based logic programs is due to both the range of natural choices available for specifying the semantics of negation, as well as the role of monotonic idempotent operators (miops) as parameters in the semantics. This leads to a natural type of polymorphism for logic programming, i.e. the same logic program can produce a variety of outcomes depending on the miop associated with the semantics. We develop a general framework for set based programming involving miops. Among the applications, we obtain integer-based representations of real continuous functions as stable models of a set based logic program.
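A toy sketch of the mechanism (hypothetical program, space and miop, written in Python purely for illustration): composing the one-step consequence operator with a monotonic idempotent operator changes which set is computed as the least fixed point, which is the "polymorphism" referred to above.

```python
# Toy sketch: a one-step consequence operator T over subsets of a space X,
# composed with a monotonic idempotent operator (miop). Only the composition
# mechanism is the point; the program and miop are made up.

X = set(range(10))

def T(s):
    """Toy one-step consequence: assert 0, and from n derive n + 2 (inside X)."""
    return {0} | {n + 2 for n in s if n + 2 in X}

BLOCKS = [set(range(0, 5)), set(range(5, 10))]   # a fixed partition of X

def miop(s):
    """Saturate s to a union of partition blocks: monotone and idempotent."""
    return set().union(*[b for b in BLOCKS if b & s]) if s else set()

def lfp(op):
    """Least fixed point by iteration from the empty set (op is monotone)."""
    s = set()
    while op(s) != s:
        s = op(s)
    return s

print(sorted(lfp(T)))                      # [0, 2, 4, 6, 8]: plain semantics
print(sorted(lfp(lambda s: miop(T(s)))))   # [0, 1, ..., 9]: miop-composed semantics
```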

12.
The paper studies connections between denotational and operational semantics for a simple programming language based on LCF. It begins with the connection between the behaviour of a program and its denotation. It turns out that a program denotes ⊥ in any of several possible semantics if it does not terminate. From this it follows that if two terms have the same denotation in one of these semantics, they have the same behaviour in all contexts. The converse fails for all the semantics. If, however, the language is extended to allow certain parallel facilities, behavioural equivalence does coincide with denotational equivalence in one of the semantics considered, which may therefore be called “fully abstract”. Next a connection is given which actually determines the semantics up to isomorphism from the behaviour alone. Conversely, by allowing further parallel facilities, every r.e. element of the fully abstract semantics becomes definable, thus characterising the programming language, up to interdefinability, from the set of r.e. elements of the domains of the semantics.
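The best-known example of such a parallel facility is a parallel disjunction ("parallel or"); the sketch below states its defining clauses (standard material, included here only to make "certain parallel facilities" concrete):

```latex
% Parallel or: monotone, hence denotable in the domain model, yet not
% computed by any sequential program, since neither argument may be
% evaluated first without risking divergence.
\[
  \mathit{por}(\mathit{tt}, \bot) = \mathit{tt}, \qquad
  \mathit{por}(\bot, \mathit{tt}) = \mathit{tt}, \qquad
  \mathit{por}(\mathit{ff}, \mathit{ff}) = \mathit{ff}.
\]
```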

13.
Augmenting a conceptual model with geospatiotemporal annotations
While many real-world applications need to organize data based on space (e.g., geology, geomarketing, environmental modeling) and/or time (e.g., accounting, inventory management, personnel management), existing conventional conceptual models do not provide a straightforward mechanism to explicitly capture the associated spatial and temporal semantics. As a result, it is left to database designers to discover, design, and implement - on an ad hoc basis - the temporal and spatial concepts that they need. We propose an annotation-based approach that allows a database designer to focus first on the nontemporal and nongeospatial aspects (i.e., "what") of the application and, subsequently, augment the conceptual schema with geospatiotemporal annotations (i.e., "when" and "where"). Via annotations, we enable a supplementary level of abstraction that succinctly encapsulates the geospatiotemporal data semantics and naturally extends the semantics of a conventional conceptual model. An overarching assumption in conceptual modeling has always been that expressiveness and formality need to be balanced with simplicity. We posit that our formally defined annotation-based approach is not only expressive, but also straightforward to understand and implement.

14.
Static analyses based on denotational semantics can naturally model functional behaviours of the code in a compositional and completely context- and flow-sensitive way. But they only model the functional, i.e., input/output, behaviour of a program P, which is not enough if one needs P’s internal behaviours, i.e., the behaviours from the input to some internal program points. This is, however, a frequent requirement for a useful static analysis. In this paper, we overcome this limitation, for the case of mono-threaded Java bytecode, with a technique used up to now for logic programs only. Namely, we define a program transformation that adds new magic blocks of code to the program P, whose functional behaviours are the internal behaviours of P. We prove the transformation correct w.r.t. an operational semantics and define an equivalent denotational semantics, devised for abstract interpretation, whose denotations for the magic blocks are hence the internal behaviours of P. We implement our transformation and instantiate it with abstract domains modelling sharing of two variables, non-cyclicity of variables, nullness of variables, class initialisation information and size of the values bound to program variables. We get a static analyser for full mono-threaded Java bytecode that is faster and scales better than another operational pair-sharing analyser. It has the same speed but is more precise than a constraint-based nullness analyser. It makes a polyhedral size analysis of Java bytecode scale up to 1300 methods in a couple of minutes and a zone-based size analysis scale to still larger applications.

15.
Obliq is a lexically scoped, distributed, object-based programming language. In Obliq, the migration of an object is proposed as creating a clone of the object at the target site, whereafter the original object is turned into an alias for the clone. Obliq has only an informal semantics, so there is no proof that this style of migration is safe, i.e., transparent to object clients. In previous work, we introduced Ø, an abstraction of Obliq, where, by lexical scoping, sites have been abstracted away. We used Ø in order to exhibit how the semantics behind Obliq's implementation renders migration unsafe. We also suggested a modified semantics that we conjectured instead to be safe. In this paper, we rewrite our modified semantics of Ø in terms of the π-calculus, and we use it to formally prove the correctness of object surrogation, the abstraction of object migration in Ø.

16.
Non-deterministic data types: models and implementations
The model-theoretic basis for (abstract) data types is generalized from algebras to multi-algebras in order to cope with non-deterministic operations. A programming-oriented definition and a model-theoretic criterion (called simulation) for implementation of data types are given. To justify the criterion w.r.t. the definition, an abstract framework linking denotational semantics of programming languages and model theory of data types is set up. A set of constraints on a programming language semantics is derived which guarantees that simulation implies implementation. It is argued that any language supporting data abstraction does fulfill these constraints. As an example a simple but expressive language L is defined and it is formally proved that L does conform to these restrictions.

17.
The Texture Synthesis Language (TSL) is a new high-level graphics language which provides tools for defining and generating regular and random (irregular) synthetic textures. The textures are used to fill in planar regions or can be mapped onto other surfaces. The building block for generating textures is a texture tile, i.e., a rectangular matrix of texels (texture elements). The programmer constructs texture tiles utilizing predefined constant tiles, user-defined tiles, and texel-based operations. Tiles can be transformed and combined in various ways, and can then be used to tessellate planar polygons. This work was partially supported by the National Science Foundation under grant DCR-86-03603.
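The tile/texel idea itself is easy to state outside TSL; the Python sketch below (illustrative only, TSL's own constructs and syntax differ) builds a 2x2 tile and tessellates a 4x8 region with it:

```python
# Illustration of the tile/texel idea only; TSL is a dedicated graphics
# language, not Python. A tile is a small rectangular matrix of texels;
# a region is filled by tessellating (repeating) the tile.

def make_tile(rows):
    """A texture tile as a list of equal-length rows of texel values."""
    assert len({len(r) for r in rows}) == 1
    return rows

def tessellate(tile, height, width):
    """Fill a height x width region by repeating the tile in both directions."""
    th, tw = len(tile), len(tile[0])
    return [[tile[y % th][x % tw] for x in range(width)] for y in range(height)]

checker = make_tile([[0, 1],
                     [1, 0]])
for row in tessellate(checker, 4, 8):
    print(row)
```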

18.
In this paper, we propose a semantic framework to debug synchronous message-passing-based concurrent programs, which are increasingly useful as parallel computing and distributed systems become more and more pervasive. We first design a concurrent programming language model to uniformly represent existing concurrent programming languages. Compared to sequential programming languages, this model contains communication statements, i.e., sending and receiving statements, and a concurrent structure to represent communication and concurrency. We then propose a debugging process consisting of a tracing and a locating procedure. The tracing procedure re-executes a program with a failed test case and uses specially designed data structures to collect useful execution information for locating bugs. We provide for the tracing procedure a structural operational semantics to represent synchronous communication and concurrency. The locating procedure works backwards to locate the ill-designed statement using information obtained in the tracing procedure, generates a fix equation, and tries to fix the bug by solving the fix equation. We also propose a structural operational semantics for the locating procedure. We supply two examples to test our proposed operational semantics.

19.
This paper addresses complexity issues for important problems arising with disjunctive logic programming. In particular, the complexity of deciding whether a disjunctive logic program is consistent is investigated for a variety of well-known semantics, as well as the complexity of deciding whether a propositional formula is satisfied by all models according to a given semantics. We concentrate on finite propositional disjunctive programs with as well as without integrity constraints, i.e., clauses with empty heads; the problems are located in appropriate slots of the polynomial hierarchy. In particular, we show that the consistency check is Σ₂ᵖ-complete for the disjunctive stable model semantics (in the total as well as partial version), the iterated closed world assumption, and the perfect model semantics, and we show that the inference problem for these semantics is Π₂ᵖ-complete; analogous results are derived for the answer set semantics of extended disjunctive logic programs. Besides, we generalize previously derived complexity results for the generalized closed world assumption and other more sophisticated variants of the closed world assumption. Furthermore, we use the close ties between the logic programming framework and other nonmonotonic formalisms to provide new complexity results for disjunctive default theories and disjunctive autoepistemic literal theories. Parts of the results in this paper appeared in the form of an abstract in the Proceedings of the Twelfth ACM SIGACT-SIGMOD-SIGART Symposium on Principles of Database Systems (PODS-93), pp. 158–167. Other parts appeared in shortened form in the Proceedings of the International Logic Programming Symposium, Vancouver, October 1993 (ILPS-93), pp. 266–278, MIT Press.
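For orientation, the complexity classes involved sit at the second level of the polynomial hierarchy (standard definitions, not specific to this paper):

```latex
\[
  \Sigma_2^p = \mathrm{NP}^{\mathrm{NP}}, \qquad
  \Pi_2^p = \mathrm{co}\Sigma_2^p = \mathrm{coNP}^{\mathrm{NP}} .
\]
% Intuitively, a \Sigma_2^p computation may guess a candidate (e.g. a model)
% and then use an NP oracle to check a coNP-style condition (e.g. minimality).
```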

20.
Semantics of EqL     
The formal semantics of a novel language, called EqL, is presented for first-order functional and Horn logic programming. An EqL program is a set of conditional pattern-directed rules, where the conditions are expressed as a conjunction of equations. The programming paradigm provided by this language may be called equational programming. The declarative semantics of equations is given in terms of their complete set of solutions, and the operational semantics for solving equations is an extension of reduction, called object refinement. The correctness of the operational semantics is established through the soundness and completeness theorems. Examples are given to illustrate the language and its semantics.
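As a generic illustration of the "complete set of solutions" semantics (our notation and example, not EqL's concrete syntax), consider solving a single equation over list terms with the usual append:

```latex
% The query
\[
  \mathit{append}(X, Y) \doteq [1,2]
\]
% has exactly three solutions, and the declarative semantics is this whole set:
\[
  \{X \mapsto [\,],\ Y \mapsto [1,2]\}, \quad
  \{X \mapsto [1],\ Y \mapsto [2]\}, \quad
  \{X \mapsto [1,2],\ Y \mapsto [\,]\}.
\]
```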
