Similar Literature
20 similar documents found (search time: 31 ms)
1.
2.
The semantics of PROLOG programs is usually given in terms of the model theory of first-order logic. However, this does not adequately characterize the computational behavior of PROLOG programs. PROLOG implementations typically use a sequential evaluation strategy based on the textual order of clauses and literals in a program, as well as nonlogical features like cut. In this work we develop a denotational semantics that captures the computational behavior of PROLOG. We present a semantics for “cut-free” PROLOG, which is then extended to PROLOG with cut. For each case we develop a congruence proof that relates the semantics to a standard operational interpreter. As an application of our denotational semantics, we show the correctness of some standard “folk” theorems regarding transformations on PROLOG programs.
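To ground the operational side that the congruence proof relates to, here is a hedged sketch of such a sequential interpreter (a minimal SLD resolution in Python, not the paper's denotational semantics; the term encoding and names are assumptions for illustration). Clauses are tried in textual order and literals left to right, which is exactly the behavior the denotational semantics must capture:

```python
# A minimal sketch of a sequential "cut-free" Prolog interpreter.
# Variables are capitalized strings; compound terms are tuples
# ('functor', arg1, ...). No occurs check, as in real Prolog systems.
def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, s):
    while is_var(t) and t in s:
        t = s[t]
    return t

def unify(a, b, s):
    a, b = walk(a, s), walk(b, s)
    if a == b:
        return s
    if is_var(a):
        return {**s, a: b}
    if is_var(b):
        return {**s, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) \
            and len(a) == len(b) and a[0] == b[0]:
        for x, y in zip(a[1:], b[1:]):
            s = unify(x, y, s)
            if s is None:
                return None
        return s
    return None

def rename(t, n):                       # freshen clause variables per use
    if is_var(t):
        return f"{t}_{n}"
    if isinstance(t, tuple):
        return (t[0],) + tuple(rename(x, n) for x in t[1:])
    return t

FRESH = [0]

def solve(goals, program, s):
    """Yield answer substitutions in Prolog's sequential search order."""
    if not goals:
        yield s
        return
    goal, rest = goals[0], goals[1:]
    for head, body in program:          # textual clause order
        FRESH[0] += 1
        n = FRESH[0]
        s2 = unify(goal, rename(head, n), s)
        if s2 is not None:              # leftmost literal selected first
            yield from solve([rename(b, n) for b in body] + rest, program, s2)

def resolve(t, s):                      # apply a substitution fully
    t = walk(t, s)
    if isinstance(t, tuple):
        return (t[0],) + tuple(resolve(x, s) for x in t[1:])
    return t

# append([],Ys,Ys).   append([X|Xs],Ys,[X|Zs]) :- append(Xs,Ys,Zs).
prog = [(('append', 'nil', 'Ys', 'Ys'), []),
        (('append', ('.', 'X', 'Xs'), 'Ys', ('.', 'X', 'Zs')),
         [('append', 'Xs', 'Ys', 'Zs')])]
ab = ('.', 'a', ('.', 'b', 'nil'))
for s in solve([('append', 'A', 'B', ab)], prog, {}):
    print(resolve('A', s), resolve('B', s))
```

Running the query append(A, B, [a,b]) enumerates the three splits of [a,b] in the deterministic order a real Prolog would produce; adding cut would prune this search, which is what the extended semantics models.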

3.
The relationship between programs and the set of partial correctness assertions that they satisfy constitutes a Galois connection. The topology resulting from this Galois connection is closely related to the Lindenbaum topology for the language in which these partial correctness assertions are stated. This relationship provides us with a tool for understanding the incompleteness of Hoare Logics and for answering certain natural questions about the connection between the relational semantics and the partial correctness assertion semantics for programs, especially in connection with the question of modularity of programs. Two questions to which we shall find topological answers in this paper are “When is a language expressive for a program?”, and “When can we have rules of inference which are adequate to infer the properties of the complex program α#β from those of its components α, β?” We also obtain a natural answer to the question “What can the set {(A, B) | {A}α{B} is true} look like for an arbitrary α?”.
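For concreteness, the Galois connection in question can be sketched as follows (Th and Mod are assumed names for the two maps, not notation from the paper):

```latex
% A minimal sketch of the (antitone) Galois connection between sets of
% programs and sets of partial correctness assertions.
\[
  \mathrm{Th}(P) = \{(A,B) \mid \{A\}\,\alpha\,\{B\} \text{ holds for all } \alpha \in P\},
  \qquad
  \mathrm{Mod}(T) = \{\alpha \mid \{A\}\,\alpha\,\{B\} \text{ holds for all } (A,B) \in T\}.
\]
\[
  T \subseteq \mathrm{Th}(P) \iff P \subseteq \mathrm{Mod}(T).
\]
```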

4.
A generalized problem is defined in terms of functions on sets and illustrated in terms of the computational geometry of simple planar polygons. Although its apparent time complexity is O(n²), the problem is shown to be solvable for several cases of interest (maximum and minimum distance, intersection detection and reporting) in O(n log n), O(n) or O(log n) time, depending on the nature of a specialized “selection” function. (Some of the cases can also be solved by the Voronoi diagram method, but time complexity increases with that approach.) A new use of monotonicity and a new concept of “locality” in set mappings contribute significantly to the derivation of the results.
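As a hedged illustration of the "functions on sets" formulation (the names and the quadratic baseline below are assumptions, not the paper's algorithms), the generalized problem can be read as reducing all pairs drawn from two vertex sets with a selection function; the specialized cases replace this O(n²) loop with the O(n log n), O(n) or O(log n) methods:

```python
# A minimal sketch of the O(n^2) baseline that specialized "selection"
# functions improve on. All names here are illustrative assumptions.
from itertools import product
from math import dist

def pairwise_select(P, Q, value, select):
    """Apply `value` to every pair from P x Q and reduce with `select`."""
    return select(value(p, q) for p, q in product(P, Q))

P = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
Q = [(3.0, 0.0), (3.0, 2.0)]

max_distance = pairwise_select(P, Q, dist, max)   # maximum-distance case
min_distance = pairwise_select(P, Q, dist, min)   # minimum-distance case
print(max_distance, min_distance)
```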

5.
The use of sets in declarative programming has been advocated by several authors in the literature. A representation often chosen for finite sets is that of scons, parallel to the list constructor cons. The logical theory for such constructors is usually tacitly assumed to be some formal system of classical set theory. However, classical set theory is formulated for a general setting, dealing with both finite and infinite sets, and not making any assumptions about particular set constructors. In giving logical-consequence semantics for programs with finite sets, it is important to know exactly what connection exists between sets and set constructors. The main contribution of this paper lies in establishing these connections rigorously. We give a formal system, called SetAx, designed around the scons constructor. We distinguish between two kinds of set constructors, scons(x, y) and dscons(x, y), where both represent {x} ∪ y, but x ∈ y is possible in the former, while x ∉ y holds in the latter. Both constructors find natural uses in specifying sets in logic programs. The design of SetAx is guided by our choice of scons as a primitive symbol of our theory rather than as a defined one, and by the need to deduce non-membership relations between terms, to enable the use of dscons. After giving the axioms of SetAx, we justify it as a suitable theory for finite sets in logic programming by (i) showing that the set constructors indeed behave like finite sets; (ii) providing a framework for establishing the correctness of set unification; and (iii) defining a Herbrand structure and providing a basis for discussing logical consequence semantics for logic programs with finite sets. Together, these results provide a rigorous foundation for the set constructors in the context of logical semantics.
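A hedged executable reading of the two constructors (Python frozensets stand in for the finite sets; this models only their intended behavior, not the axioms of SetAx):

```python
# A minimal sketch: both constructors build {x} ∪ y, but dscons additionally
# requires the non-membership x ∉ y as a precondition.
def scons(x, y: frozenset) -> frozenset:
    return frozenset({x}) | y            # x may already be in y

def dscons(x, y: frozenset) -> frozenset:
    if x in y:                           # non-membership must be deducible
        raise ValueError("dscons requires x ∉ y")
    return frozenset({x}) | y

s = scons(1, scons(1, frozenset()))      # {1}: duplicates collapse
d = dscons(2, dscons(1, frozenset()))    # {1, 2}: each step adds a new element
print(s, d)
```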

6.
《Computers & Graphics》1987,11(2):121-140
Because of its convenience and growing supporting technology, a form of raster graphics called bit-mapped graphics is becoming increasingly pervasive. A typical bit-mapped graphics environment supports the efficient manipulation of high-resolution images over a small intensity space. As such, a unique set of bit-map concepts and operations has evolved. This paper argues that a rigorous consideration of bit-mapped graphics is important and useful. A semantics of bit-mapped graphics is presented, which carefully distinguishes among scenes, bit-maps, and images. The semantic framework is then used to prove some interesting results regarding the 2D rendering, or rasterisation, of scenes into images. We also introduce the useful notion of “stable” lines, and consider the conditions under which a geometric transformation is “faithful” to a bit-map. Apart from their intrinsic interest, the results reported here constitute a first step towards arriving at a definitive understanding of the relationship between bit-mapped graphics and other computer graphics technologies.
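For a concrete instance of the rasterisation being formalized, here is a minimal sketch (standard Bresenham scan conversion, used only as an illustration; the paper's semantics is not reproduced): a line of a scene is rendered into a bit-map, i.e. a grid over a two-valued intensity space:

```python
# A minimal sketch: rasterising a line into a bit-map (a 2D boolean grid),
# illustrating the scene -> bit-map -> image pipeline the abstract separates.
def rasterise_line(x0, y0, x1, y1, width, height):
    bitmap = [[False] * width for _ in range(height)]
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx, sy = (1 if x0 < x1 else -1), (1 if y0 < y1 else -1)
    err = dx + dy
    while True:
        if 0 <= x0 < width and 0 <= y0 < height:
            bitmap[y0][x0] = True
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy
    return bitmap

for row in rasterise_line(0, 0, 7, 3, 8, 4):
    print("".join("#" if b else "." for b in row))
```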

7.
Consider the connection between denotational semantics for a language with goto statements and flow diagrams for programs in such a language. The main point of interest is that the denotational semantics uses a recursively defined environment to give the meaning of labels, while a flow diagram merely has a jump to the appropriate program point. A simple reduction called “indirection elimination” strips away the environment from the denotational semantics and extracts an expression with cycles that is very close to the flow diagram of a program. The same idea applies to associating bodies with recursive procedures, or to any construct whose semantics is not wedded to the syntax. In addition to being a useful data structure and conceptual device, expressions with cycles are well-defined mathematical objects: their semantics can be given by unfolding them into infinite structures that have been well studied. The practicality of the elimination of environments has been tested by constructing a trial implementation, which serves as the front end of a semantics directed compiler generator. The implementation takes a denotational semantics of a language and constructs a “black box” that maps programs in the language into an intermediate representation. The intermediate representation is a circular expression.
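A hedged miniature of indirection elimination (the node representation is an assumption for illustration, not the trial implementation): replacing each goto's label lookup by a direct reference turns the recursively defined environment into a circular expression:

```python
# A minimal sketch: label environments vs. circular expressions.
# Program:  L: if x > 0 then (x := x - 1; goto L)
class Node:
    def __init__(self, kind, **fields):
        self.kind = kind
        self.__dict__.update(fields)

# Environment-based form: the goto refers to label "L" indirectly.
env = {"L": Node("if", cond="x > 0",
                 then=Node("seq",
                           first=Node("assign", text="x := x - 1"),
                           second=Node("goto", label="L")))}

def eliminate(node, env):
    """Replace goto nodes by direct references; this introduces a cycle.
    (A full implementation would track visited nodes; this tiny example
    never revisits one.)"""
    if node.kind == "goto":
        return env[node.label]          # indirection eliminated
    if node.kind == "seq":
        node.first = eliminate(node.first, env)
        node.second = eliminate(node.second, env)
    if node.kind == "if":
        node.then = eliminate(node.then, env)
    return node

entry = eliminate(env["L"], env)
print(entry.then.second is entry)       # True: the expression is circular
```

Following entry.then.second leads back to entry itself: the cycle in the expression plays the role of the jump in the flow diagram, and its meaning can be given by unfolding it into an infinite structure.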

8.
We are considering knowledge discovery from data describing a piece of the real or an abstract world. The induced patterns bring to light laws hidden in the data. The most natural representation of such pattern-laws is by “if..., then...” decision rules relating conditions with decisions. The same representation of patterns is used in multi-attribute classification; thus the data searched for discovery of these patterns can be seen as classification data. We adopt the classification perspective to present an original methodology of inducing general laws from data and representing them by so-called monotonic decision rules. Monotonicity concerns relationships between values of condition and decision attributes, e.g. the greater the mass (condition attribute), the greater the gravity (decision attribute), which is a specific feature of decision rules discovered from data using the Dominance-based Rough Set Approach (DRSA). While in DRSA one has to suppose a priori the presence or absence of positive or negative monotonicity relationships which hold in the whole evaluation space, in this paper we show that DRSA can be adapted to discover rules from any kind of input classification data, exhibiting monotonicity relationships which are unknown a priori and hold in some parts of the evaluation space only. This requires a proper non-invasive transformation of the classification data, permitting representation of both positive and negative monotonicity relationships that are to be discovered by the proposed methodology. Reported results of a computational experiment confirm that the proposed methodology leads to decision rules whose predictive ability is similar to the best classification predictors. It has, however, a unique advantage over all competitors, because the monotonic decision rules can be read as laws characterizing the analyzed phenomena in terms of easily understandable “if..., then...” decision rules, while other predictor models have no such straightforward interpretation.
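A hedged sketch of what a positive monotonic decision rule looks like operationally (the attribute names and data are invented for illustration; the DRSA induction procedure itself is not reproduced):

```python
# A minimal sketch of a positive monotonic decision rule of the form
#   "if mass >= 10, then decision class >= 2"
# and of checking its coverage and consistency on classification data.
from dataclasses import dataclass

@dataclass
class Rule:
    thresholds: dict       # condition attribute -> lower bound
    decision: int          # minimal decision class implied

    def fires(self, obj: dict) -> bool:
        return all(obj[a] >= t for a, t in self.thresholds.items())

rule = Rule(thresholds={"mass": 10.0}, decision=2)
data = [
    {"mass": 12.0, "cls": 3},
    {"mass": 11.0, "cls": 2},
    {"mass": 4.0,  "cls": 1},
]
covered = [o for o in data if rule.fires(o)]
consistent = all(o["cls"] >= rule.decision for o in covered)
print(len(covered), consistent)    # 2 True
```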

9.
In this work we propose a generalization of the notion of directional monotonicity. Instead of considering increasingness or decreasingness along rays, we allow more general paths defined by curves in the n-dimensional space. These considerations lead us to the notion of α-monotonicity, where α is the corresponding curve. We study several theoretical properties of α-monotonicity and relate it to other notions of monotonicity, such as weak monotonicity and directional monotonicity.
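A hedged numerical reading (the sampling check below is an illustration, not the formal definition): F is α-monotone when t ↦ F(α(t)) is non-decreasing, and directional monotonicity is recovered when α is a ray:

```python
# A minimal sketch: checking α-monotonicity of F along a curve α by sampling.
import numpy as np

def alpha_monotone(F, alpha, t0=0.0, t1=1.0, samples=100):
    ts = np.linspace(t0, t1, samples)
    vals = [F(*alpha(t)) for t in ts]
    return all(a <= b + 1e-12 for a, b in zip(vals, vals[1:]))

F = lambda x, y: 0.5 * (x + y)                        # arithmetic mean
ray = lambda t: (0.2 + 0.5 * t, 0.3 + 0.5 * t)        # ray: directional case
curve = lambda t: (0.2 + 0.5 * t, 0.3 + 0.5 * t**2)   # a genuine curve

print(alpha_monotone(F, ray), alpha_monotone(F, curve))  # True True
```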

10.
We empirically investigated the difficulty of finding stable models for logic programs using backtracking, by trying to identify what makes random instances easy or hard. Additionally, we empirically investigated the effectiveness of the 4-valued Kripke–Kleene semantics (4KK) and the 4-valued well-founded semantics (4WF) in Niemelä and Simons’ backtracking algorithm, smodels, for finding stable models. We studied the behavior of 4KK and 4WF in a parameterized distribution of random propositional logic programs of fixed rule-length k. In all of our experiments, 4KK and 4WF (both modified to extend an input partial truth assignment) were computed with respect to a fixed percentage of proposition letters (randomly chosen) initially assigned TRUE and a fixed percentage (randomly chosen) initially assigned FALSE. There exists a region, R, in the parameter space of our distribution where smodels required a large number of recursive calls to determine if programs generated in this region have any stable models. Hence, the “hardest” programs for smodels to determine if a stable model exists lie in R. Additionally, there exists a subregion of R where smodels made significantly fewer recursive calls when using 4WF as a pruning technique than when using 4KK. To gain a deeper insight into the causes for the “hardness” of programs in R and the differences between 4WF and 4KK as pruning techniques in smodels, we examined more closely the behavior of 4KK and 4WF. There exists a region in which a very small percentage of inconsistent models were produced by both 4KK and 4WF, thereby providing very little information useful for smodels to immediately backtrack. This region roughly corresponded to the above region where smodels required a large number of recursive calls. Also, there exists a region in which both 4KK and 4WF produced a high percentage of inconsistent models, thereby providing information useful for smodels to immediately backtrack.

11.
Blair et al. (2001) developed an extension of logic programming called set based logic programming. In the theory of set based logic programming the atoms represent subsets of a fixed universe X, and one is allowed to compose the one-step consequence operator with a monotonic idempotent operator O so as to ensure that the analogue of stable models in the theory are always closed under O. Marek et al. (1992; Ann Pure Appl Logic 96:231–276, 1999) developed a generalization of Reiter’s normal default theories that can be applied to both default theories and logic programs, and which is based on an underlying consistency property. In this paper, we show how to extend the normal logic programming paradigm of Marek, Nerode, and Remmel to set based logic programming. We also show how one can obtain a new semantics for set based logic programming based on a consistency property.
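A hedged sketch of the composition at the heart of the construction (the rules, the universe, and the interval-closure operator below are toy assumptions; Blair et al.'s formalism is richer): the one-step consequence operator is composed with a monotonic, idempotent operator O and iterated, so the fixpoint reached is automatically closed under O:

```python
# A minimal sketch: composing a one-step consequence operator with a
# monotonic idempotent operator O, so fixpoints are closed under O.
def T_P(I, rules):
    """Inflationary one-step consequence: add heads of fired rules."""
    out = set(I)
    for body, head in rules:
        if body <= I:
            out |= head
    return out

def O(I):
    """A toy monotonic, idempotent operator: interval closure over ints."""
    return set(range(min(I), max(I) + 1)) if I else set()

def closed_fixpoint(rules):
    I = set()
    while True:
        J = O(T_P(I, rules))    # compose consequence with O
        if J == I:
            return I
        I = J

rules = [(set(), {1, 4}), ({2}, {7})]    # facts 1, 4; if 2 then 7
print(closed_fixpoint(rules))            # {1, 2, 3, 4, 5, 6, 7}
```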

12.
Many natural systems exhibit a hybrid behavior characterized by a set of continuous laws which are switched by discrete events. Such behaviors can be described in a very natural way by a class of automata called hybrid automata. Their evolutions are represented by both dynamical systems on dense domains and discrete transitions. Once a real system is modeled in such a framework, one may want to analyze it by applying automatic techniques, such as Model Checking or Abstract Interpretation. Unfortunately, the discrete/continuous evolutions not only give hybrid automata great flexibility, but they are also at the root of many undecidability phenomena. This paper addresses issues regarding the decidability of the reachability problem for hybrid automata (i.e., “can the system reach a state a from a state b?”) by proposing an “inaccurate” semantics. In particular, after observing that dense sets are often abstractions of real world domains, we suggest, especially in the context of biological simulation, giving up the ability to distinguish between values whose distance is less than a fixed ε. On the grounds of the above considerations, we propose a new semantics for first-order formulæ which guarantees the decidability of reachability. We conclude by providing a paradigmatic biological example showing that the new semantics mimics the real world behavior better than the precise one.
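A hedged toy of the ε-semantics idea (the discrete-time flow and the snapping are illustrative assumptions; the paper works with first-order formulæ over hybrid automata): once values closer than ε are identified, only finitely many distinguishable states remain in a bounded region, so exploration terminates:

```python
# A minimal sketch: reachability where values closer than ε are not
# distinguished, checked on the finite quotient of a bounded interval.
def snap(x, eps):
    return round(x / eps)              # collapse ε-close values to one cell

def reachable(x0, target, step, eps, lo=-10.0, hi=10.0):
    """Discrete-time flow x' = step(x); explore the ε-quotient of [lo, hi]."""
    seen, frontier = set(), [x0]
    while frontier:
        x = frontier.pop()
        cell = snap(x, eps)
        if cell in seen or not (lo <= x <= hi):
            continue
        seen.add(cell)
        if abs(x - target) < eps:      # target reached up to ε
            return True
        frontier.append(step(x))
    return False

print(reachable(0.0, 4.0, step=lambda x: x + 0.3, eps=0.1))  # True
```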

13.
This paper presents a methodology to extract information from a set of r labeled counts with constant sum N called an “Organ Pipe Diagram (OPD)”. A histogram is a particular OPD with an order relation on its count labels. As an application of this methodology, if a global image of radiometries is scanned by a window, a “module” value and a “state” can be defined for the local histogram. The corresponding maps of modules and states yield useful local spatial information closely related to texture. The methodology is interesting in itself, as it shows how a geometry can be built up from a set of labeled counts whose space of definition is shown to be a simplex.
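A hedged sketch of the application (the particular "module" and "state" chosen below, deviation from the mean count and the majority label, are hypothetical stand-ins; the paper defines its own): a window scans the image, each local histogram is an OPD with constant sum N equal to the window area, and module/state maps are produced:

```python
# A minimal sketch: scan an image with a window, build the local histogram
# (an OPD with constant sum N = window area), map each position to a
# module/state pair. Module and state definitions here are hypothetical.
import numpy as np

def local_maps(img, win=3, bins=4):
    h, w = img.shape
    modules = np.zeros((h - win + 1, w - win + 1))
    states = np.zeros_like(modules, dtype=int)
    for i in range(modules.shape[0]):
        for j in range(modules.shape[1]):
            patch = img[i:i + win, j:j + win]
            hist = np.bincount(patch.ravel(), minlength=bins)  # constant sum
            modules[i, j] = np.abs(hist - hist.mean()).sum()   # hypothetical
            states[i, j] = hist.argmax()                       # hypothetical
    return modules, states

img = np.random.default_rng(0).integers(0, 4, size=(6, 6))
m, s = local_maps(img)
print(m.shape, s.shape)   # (4, 4) (4, 4)
```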

14.
The formalism of nonmonotonic reasoning has been integrated into logic programming to define semantics for logic programs with negation. Because a Petri net provides a uniform model for both the logic of knowledge and the control of inference, the class of high-level Petri nets called predicate/transition nets (PrT-nets) has been employed to study production rule based expert systems and Horn clause logic programs. We show that a PrT-net can implement the nonmonotonicity associated with a logic program with negation as well as the monotonicity of Horn clause logic programs. In particular, we define a semantics for a normal logic program and implement it with a PrT-net. We demonstrate that in the presence of inconsistency in a normal logic program, the semantics still works well by deducing meaningful answers. The variations and potential applications of the PrT-net are also addressed.
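For the monotonic (Horn) half, the Petri-net reading can be sketched as follows (an ordinary place/transition net over ground atoms, a deliberate simplification; the PrT-net machinery for negation and nonmonotonicity is not reproduced here):

```python
# A minimal sketch: places stand for ground atoms, transitions for Horn
# rules; firing a transition when all input places are marked mimics
# monotonic forward inference.
def forward_closure(facts, rules):
    """rules: list of (body_atoms, head_atom). Returns derivable atoms."""
    marking = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in marking and all(a in marking for a in body):
                marking.add(head)       # transition fires, token on head
                changed = True
    return marking

rules = [({"bird"}, "can_fly"), ({"penguin"}, "bird")]
print(forward_closure({"penguin"}, rules))  # {'penguin', 'bird', 'can_fly'}
```

Handling negation as failure needs more than this monotone firing rule, e.g. inhibitor-style tests for the absence of tokens, which is the part the PrT-net construction addresses.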

15.
In the literature on logics of imperfect information it is often stated, incorrectly, that the Game-Theoretical Semantics of Independence-Friendly (IF) quantifiers captures the idea that the players of semantical games are forced to make some moves without knowledge of the moves of other players. We survey here the alternative semantics for IF logic that have been suggested in order to enforce this “epistemic reading” of sentences. We introduce some new proposals, and a more general logical language which distinguishes between “independence from actions” and “independence from strategies”. New semantics for IF logic can be obtained by choosing embeddings of the set of IF sentences into this larger language. We compare all the semantics proposed and their purported game-theoretical justifications, and disprove a few claims that have been made in the literature.

16.
17.
Beam search is a heuristic search algorithm that explores a state-space graph by expanding the w most promising nodes at each level (depth) of the graph, where w is called the beam width and is taken as input from the user. The quality of the solution produced by beam search does not always improve monotonically with increasing beam width, making it difficult to choose an appropriate beam width for effective use. We present an algorithm called Incremental Beam Search (IncB) which guarantees monotonicity and is also anytime in nature. Experimental results on the sliding-tile puzzle, the traveling salesman, and the single-machine scheduling problems show that IncB significantly outperforms basic monotonic methods such as iterative widening beam search, as well as some of the state-of-the-art anytime heuristic search algorithms, in terms of both the quality of the solution produced at the end and the anytime performance.
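For reference, a hedged sketch of the basic (non-incremental) algorithm that IncB improves on; the toy problem and names are illustrative:

```python
# A minimal sketch of basic beam search: at each level only the w best
# nodes by heuristic score are kept. IncB itself is not reproduced here.
import heapq

def beam_search(start, expand, h, is_goal, w):
    """expand(n) -> successor nodes; h(n) -> heuristic (lower is better)."""
    level = [start]
    while level:
        for n in level:
            if is_goal(n):
                return n
        successors = [s for n in level for s in expand(n)]
        level = heapq.nsmallest(w, successors, key=h)   # prune to beam width
    return None

# Toy example: reach 10 by repeatedly adding 1, 2, or 3 starting from 0.
result = beam_search(
    start=0,
    expand=lambda n: [n + 1, n + 2, n + 3] if n < 10 else [],
    h=lambda n: abs(10 - n),
    is_goal=lambda n: n == 10,
    w=2,
)
print(result)  # 10
```

The abstract's point is visible even in this sketch: raising w changes which nodes survive pruning, so solution quality need not improve monotonically with w.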

18.
Conventional Information Systems are limited in their ability to represent uncertain data. A consistent and useful methodology for representing and manipulating such data is required; one solution is proposed in this paper. Objects are modeled by selecting representative attributes to which values are assigned. Any attribute value can be one of the following: a regular precise value, a special value denoting “value unknown”, a special value denoting “attribute not applicable”, a range of values, or a set of values. If there are uncertain data, then the semantics of query evaluation are no longer clear and uncertainty is introduced. To handle the uncertainty, two sets of objects are retrieved in response to each query: the set known to satisfy the query with complete certainty, and the set of objects which possibly satisfy the query with some degree of uncertainty. Two methods of estimating this uncertainty are examined.
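A hedged sketch of the representation and the two-set query semantics (the encoding and the threshold query are assumptions for illustration, not the paper's formal model):

```python
# A minimal sketch: attribute values may be precise, UNKNOWN, NOT_APPLICABLE,
# a (lo, hi) range, or a set of candidates; a predicate then yields
# "yes" / "maybe" / "no", splitting answers into certain and possible sets.
UNKNOWN, NOT_APPLICABLE = "unknown", "n/a"

def satisfies_ge(value, threshold):
    if value == NOT_APPLICABLE:
        return "no"
    if value == UNKNOWN:
        return "maybe"
    if isinstance(value, tuple):        # range (lo, hi)
        lo, hi = value
        if lo >= threshold:
            return "yes"
        return "maybe" if hi >= threshold else "no"
    if isinstance(value, set):          # set of possible values
        hits = {v for v in value if v >= threshold}
        if hits == value:
            return "yes"
        return "maybe" if hits else "no"
    return "yes" if value >= threshold else "no"

people = {"ann": 34, "bob": UNKNOWN, "eve": (25, 33), "joe": {28, 31}}
certain = {k for k, v in people.items() if satisfies_ge(v, 30) == "yes"}
possible = {k for k, v in people.items() if satisfies_ge(v, 30) == "maybe"}
print(certain, possible)   # {'ann'} {'bob', 'eve', 'joe'}
```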

19.
The problem of managing and querying inconsistent databases has been deeply investigated in the last few years. As the problem of consistent query answering is hard in the general case, most of the techniques proposed so far have exponential complexity. Polynomial techniques have been proposed only for restricted forms of constraints (such as functional dependencies) and queries. In this paper, a technique for computing “approximate” consistent answers in polynomial time is proposed, which works in the presence of a wide class of constraints (namely, full constraints) and Datalog queries. The proposed approach is based on a repairing strategy where update operations assigning an undefined truth value to the “reliability” of tuples are allowed, along with updates inserting or deleting tuples. The result of a repair can be viewed as a three-valued database which satisfies the specified constraints. In this regard, a new semantics (namely, partial semantics) is introduced for constraint satisfaction in the context of three-valued databases, which aims at capturing the intuitive meaning of constraints under three-valued logic. It is shown that, in order to compute “approximate” consistent query answers, it suffices to evaluate queries by taking into account a unique repair (called the deterministic repair), which in some sense “summarizes” all the possible repairs. The answers so obtained are “approximate” in the sense that they are safe (true and false atoms in the answers are, respectively, true and false under the classical two-valued semantics), but not complete.
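A hedged miniature of the three-valued evaluation that makes the answers safe (the database encoding and Kleene conjunction below are illustrative assumptions; the repairing strategy itself is not reproduced):

```python
# A minimal sketch: tuples carry a three-valued "reliability" T/F/U, and a
# ground conjunctive query is evaluated under Kleene's three-valued
# conjunction, so T and F answers are safe w.r.t. two-valued semantics.
T, F, U = "T", "F", "U"

def and3(a, b):
    if F in (a, b):
        return F
    if U in (a, b):
        return U
    return T

db = {("emp", "ann"): T, ("emp", "bob"): U, ("mgr", "ann"): T}

def truth(atom):
    return db.get(atom, F)      # closed world: absent atoms are false

query = [("emp", "ann"), ("mgr", "ann")]    # emp(ann) AND mgr(ann)
value = T
for atom in query:
    value = and3(value, truth(atom))
print(value)   # T: safely true; a U result would mean "possibly true"
```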

20.
In supervised prediction problems, the response attribute depends on certain explanatory attributes. Some real problems require the response attribute to represent ordinal values that should increase with some of the explanatory attributes; these are called classification problems with monotonicity constraints. In this paper, we aim at formalizing the approach to nested generalized exemplar learning with monotonicity constraints, proposing the monotonic nested generalized exemplar learning (MoNGEL) method. It accomplishes learning by storing objects in R^n, hybridizing instance-based learning and rule learning into a combined model. An experimental analysis is carried out over a wide range of monotonic data sets. The results obtained have been verified by non-parametric statistical tests and show that MoNGEL outperforms well-known techniques for monotonic classification, such as the ordinal learning model, the ordinal stochastic dominance learner and k-nearest neighbor, considering accuracy, mean absolute error and simplicity of the constructed models.
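A hedged sketch of the monotonicity constraint in instance-based form (an illustration of the problem setting, not the MoNGEL method): the prediction is forced to be non-decreasing under componentwise dominance of the inputs:

```python
# A minimal sketch of a monotonicity-constrained instance-based predictor:
# the predicted ordinal label never decreases when every input increases.
def dominates(a, b):
    """a >= b componentwise."""
    return all(x >= y for x, y in zip(a, b))

def predict_monotone(x, training):
    """Smallest label consistent with every training object x dominates."""
    lower = [label for obj, label in training if dominates(x, obj)]
    return max(lower, default=min(l for _, l in training))

training = [((1, 1), 1), ((2, 3), 2), ((4, 4), 3)]
print(predict_monotone((3, 3), training))   # 2: dominates (1,1) and (2,3)
print(predict_monotone((5, 5), training))   # 3: dominates all three
```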
