Similar Literature
20 similar documents retrieved (search time 15 ms)
1.
This paper surveys complexity, degree of uncomputability, and expressive power results for logic programming. Some major decision problem complexity results and other results for logic programming are also covered. It also proves several new results filling in previous gaps in the literature. The paper considers seven logic programming semantics: the van Emden-Kowalski semantics for definite (Horn) logic programs; the perfect model semantics for stratified and for locally stratified logic programs; and the two- and three-valued program completion semantics, the well-founded semantics, and the stable semantics, all for normal logic programs, under skeptical inference. The main results concern expressibility and query complexity/uncomputability in five contexts: for propositional logic programs, for first order logic programs with infinite Herbrand universes on their Herbrand universes (a closed domain assumption), for first order logic programs with infinite Herbrand universes on those universes extended with infinitely many new elements (an open domain assumption), and for logic programs without function or constant symbols evaluated over varying extensional databases (DATALOG-type results, data complexity results only) under both closed and open domain assumptions. Several of the open domain assumption results are new to this paper. Other results surveyed are (1) results about the family of all stable models of a program and (2) decision questions about when a logic program has nice properties with respect to a semantics (e.g., a unique stable model). One decision result, for well-founded semantics, is new to this paper. Work supported in part by NSF grant IRI-8905166.

2.
We give a complexity analysis of a variety of languages across the spectrum of the CLP scheme. By varying the logic and memory management, the role of the constraints and the role of the logic can be measured. The analysis clarifies the relation between linear/integer programming and constraint logic programming. We also determine how the power of constraints can easily lead to undecidable queries in Datalog languages with constraints. This work is motivated in large part by the problems of efficient implementation of CLP languages and the concomitant need for low-level constraint languages. Research partially supported by PSC-CUNY Grant 669287. Research partially supported by NSF Grant IRI-8902511.

3.
In general, the set of stable models of a recursive logic program can be quite complex. For example, it follows from results of Marek, Nerode, and Remmel [Ann. Pure and Appl. Logic (1992)] that there exist finite predicate logic programs and recursive propositional logic programs which have stable models but no hyperarithmetic stable models. In this paper, we define several conditions which ensure that a recursive logic program P has a stable model of low complexity, e.g., a recursive stable model, a polynomial time stable model, or a stable model which lies in a low level of the polynomial time hierarchy.
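As a rough illustration of the stable model semantics discussed above (not taken from the paper), the following Python sketch checks candidate interpretations of a finite propositional normal program against the Gelfond-Lifschitz reduct; the rule encoding and names are assumptions of this sketch, and the brute-force enumeration is for illustration only.

from itertools import chain, combinations

def least_model(definite_rules):
    """Least model of a definite (negation-free) propositional program, given as (head, body) pairs."""
    model, changed = set(), True
    while changed:
        changed = False
        for head, pos in definite_rules:
            if pos <= model and head not in model:
                model.add(head)
                changed = True
    return model

def is_stable(rules, candidate):
    """candidate is a stable model iff it equals the least model of the reduct w.r.t. candidate."""
    reduct = [(h, pos) for h, pos, neg in rules if not (neg & candidate)]
    return least_model(reduct) == candidate

def stable_models(rules):
    """Enumerate all atom subsets (exponential; reflects the hardness of the problem)."""
    atoms = sorted(set(chain.from_iterable({h} | p | n for h, p, n in rules)))
    subsets = chain.from_iterable(combinations(atoms, k) for k in range(len(atoms) + 1))
    return [set(s) for s in subsets if is_stable(rules, set(s))]

# p :- not q.   q :- not p.   (two stable models, {p} and {q})
print(stable_models([("p", set(), {"q"}), ("q", set(), {"p"})]))

The exhaustive search mirrors the fact that deciding properties of the family of stable models is intractable in general, which is the kind of complexity question the paper studies for recursive programs.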

4.
Support functions and samples of convex bodies in R^n are studied with regard to conditions for their validity or consistency. Necessary and sufficient conditions for a function to be a support function are reviewed in a general setting. An apparently little-known classical result of this kind for the planar case, due to Rademacher and based on a determinantal inequality, is presented, and a generalization to arbitrary dimensions is developed. These conditions are global in the sense that they involve values of the support function at widely separated points. The corresponding discrete problem of determining the validity of a set of samples of a support function is treated. Conditions similar to the continuous inequality results are given for the consistency of a set of discrete support observations. These conditions are given in terms of a series of local inequality tests involving only neighboring support samples. Our results generalize existing planar conditions to arbitrary dimensions by providing a generalization of the notion of nearest neighbor for plane vectors, which utilizes a simple positive cone condition on the respective support sample normals. This work was partially supported by the Center for Intelligent Control Systems under the U.S. Army Research Office Grant DAAL03-92-G-0115, the Office of Naval Research under Grant N00014-91-J-1004, and the National Science Foundation under Grant MIP-9015281. Partially supported by the National Science Foundation under grant IRI-9209577 and by the U.S. Army Research Office under grant DAAL03-92-G-0320.
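As a hedged illustration of the discrete consistency problem (a global check, not the paper's local neighbor tests): samples (u_i, h_i) are consistent with some convex body exactly when, for every i, some point x_i satisfies u_j·x_i <= h_j for all j and u_i·x_i = h_i, which is a small linear feasibility problem per sample. The sketch below uses SciPy; the function name and data are illustrative.

import numpy as np
from scipy.optimize import linprog

def support_samples_consistent(U, h):
    """U: (m, n) array of sample directions; h: length-m array of sampled support values."""
    U, h = np.asarray(U, dtype=float), np.asarray(h, dtype=float)
    m, n = U.shape
    for i in range(m):
        res = linprog(c=np.zeros(n),
                      A_ub=U, b_ub=h,                    # u_j . x <= h_j for all j
                      A_eq=U[i:i + 1], b_eq=h[i:i + 1],  # u_i . x  = h_i
                      bounds=[(None, None)] * n)
        if not res.success:                              # infeasible: sample i cannot be attained
            return False
    return True

# Square [-1, 1]^2: the support value is 1 in each of the four axis directions.
U = [(1, 0), (-1, 0), (0, 1), (0, -1)]
print(support_samples_consistent(U, [1, 1, 1, 1]))   # True
print(support_samples_consistent(U, [1, -2, 1, 1]))  # False: the width h(e1) + h(-e1) cannot be negative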

5.
The view update problem is considered in the context of deductive databases where the update of an intensional predicate is accomplished by modifying appropriately the underlying relations in the extensional database. Two classes of disjunctive databases are considered. The first class contains those disjunctive databases which allow only definite rules in the intensional database and disjunctive facts in the extensional database. The second class contains stratified disjunctive databases so that in addition to the first class, negation is allowed in the bodies of the rules, but the database must be stratified. Algorithms are given both for the insertion of an intensional predicate into and the deletion of an intensional predicate from the database. The algorithms use SLD resolution and the concept of minimal models of the extensional database. The algorithms are proved to be correct and best according to the criterion of causing minimal change to the database, where we give first priority to minimizing deletions. Research supported by the National Science Foundation under grant numbers IRI-8916059, IRI-8921591, IRI-9200898, and IRI-9210220.

6.
A logic for reasoning with inconsistency
Most known computational approaches to reasoning have problems when facing inconsistency, so they assume that a given logical system is consistent. Unfortunately, the latter is difficult to verify and very often is not true. It may happen that addition of data to a large system makes it inconsistent, and hence destroys a vast amount of meaningful information. We present a logic, called APC (annotated predicate calculus; cf. annotated logic programs of [4, 5]), that treats any set of clauses, either consistent or not, in a uniform way. In this logic, consequences of a contradiction are not nearly as damaging as in the standard predicate calculus, and meaningful information can still be extracted from an inconsistent set of formulae. APC has a resolution-based sound and complete proof procedure. We also introduce a novel notion of epistemic entailment and show its importance for investigating inconsistency in predicate calculus as well as its application to nonmonotonic reasoning. Most importantly, our claim that our logic is an adequate model of human perception of inconsistency is backed by rigorous arguments. A preliminary report on this research appeared in LICS'89. Work of M. Kifer was supported in part by the NSF grants DCR-8603676, IRI-8903507. Work of E. L. Lozinskii was supported in part by Israel National Council for Research and Development under the grants 2454-3-87, 2545-2-87, 2545-3-89 and by Israel Academy of Science, grant 224-88.

7.
8.
In this paper, we deal with the problem of verifying local stratifiability of logic programs and databases, as presented by Przymusinski. The notion of dependency graphs is generalized from representing the priority relation between predicate symbols to representing the priority between atoms. Necessary and sufficient conditions for the local stratifiability of logic programs are presented, and algorithms for performing the verification are developed. Finally, we prove that a database DB containing clauses with disjunctive consequents can easily be converted into a logic program P such that DB is locally stratified iff P is locally stratified. Yi-Dong Shen, Dr.: Department of Computer Science, Chongqing University, Chongqing, 630044, P.R. China (present address: c/o Ping Ran, Department of Heat Power Engineering, Chongqing University). Research interests: Artificial Intelligence, Deductive Databases, Logic Programming, Non-Monotonic Reasoning, Parallel Processing.
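For readers unfamiliar with stratification, the following sketch (mine, and at the predicate level only, whereas the paper works at the level of atoms) implements the classical test: a program is stratified iff predicates can be assigned strata so that positive dependencies stay on the same stratum or below and negative dependencies go strictly below. The rule encoding and names are assumptions.

def stratify(rules):
    """Return a dict pred -> stratum if the program is stratified, else None.
    Rules are triples (head_pred, positive_body_preds, negative_body_preds)."""
    preds = {h for h, _, _ in rules} | {q for _, p, n in rules for q in p | n}
    stratum = {p: 0 for p in preds}
    changed = True
    while changed:
        changed = False
        for head, pos, neg in rules:
            need = max([stratum[q] for q in pos] + [stratum[q] + 1 for q in neg] + [0])
            if need > stratum[head]:
                if need >= len(preds):        # strata of a stratified program never reach |preds|,
                    return None               # so this signals a cycle through negation
                stratum[head] = need
                changed = True
    return stratum

# p :- q, not r.   r :- s.   (stratified: p on stratum 1, the rest on stratum 0)
print(stratify([("p", {"q"}, {"r"}), ("r", {"s"}, set())]))
# p :- not p.                (not stratified)
print(stratify([("p", set(), {"p"})]))

Local stratifiability replaces predicates by ground atoms in the same construction, which is what makes its verification the non-trivial problem studied in the paper.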

9.
To solve a problem one may need to combine the knowledge of several different experts. It can happen that some of the claims of one or more experts may be in conflict with the claims of other experts. There may be several such points of conflict and any claim may be involved in several different such points of conflict. In that case, the user of the knowledge of experts may prefer a certain claim to another in one conflict-point without necessarily preferring that statement in another conflict-point. Our work constructs a framework within which the consequences of a set of such preferences (expressed as priorities among sets of statements) can be computed. We give four types of semantics for priorities, three of which are shown to be equivalent to one another. The fourth type of semantics for priorities is shown to be more cautious than the other three. In terms of these semantics for priorities, we give a function for combining knowledge from different sources such that the combined knowledge is conflict-free and satisfies all the priorities. Jack Minker and Shekhar Pradhan were supported in part by the National Science Foundation grant IRI-89-16059 and Air Force Office of Scientific Research grant 91-0350. V.S. Subrahmanian was supported in part by Army Research Office grant DAAL-03-92-G-0225, Air Force Office of Scientific Research Grant F49620-93-1-0065, and NSF grant IRI-9109755.

10.
Near-Horn Prolog and beyond
Near-Horn Prolog is an extension of Prolog designed to handle disjunction and classical negation. The emphasis here is on minimal change from standard Prolog in regard to notation, derivation form, and speed of inner-loop computation. The procedure is optimized for input programs that are near-Horn, i.e., programs in which almost all clauses are definite clauses. This paper goes beyond the near-Horn focus to report on the completeness of one version of nH-Prolog, along with soundness of the procedure. Completeness is important here not only for the usual reasons of guaranteed success on small problems and insight into the behavior of the procedure, but also because we anticipate the introduction of negation-as-failure, which requires conviction that a proof will be found if a proof exists. This research was supported in part by the U.S. Army Research Office under grants DAAG29-84-K-0072 and DAAL03-88-K-0082 and by NSF Grant IRI-8805696.

11.
Results of Schlipf (J Comput Syst Sci 51:64–86, 1995) and Fitting (Theor Comput Sci 278:25–51, 2001) show that the well-founded semantics of a finite predicate logic program can be quite complex. In this paper, we show that there is a close connection between the construction of the perfect kernel of a $\Pi^0_1$ class via the iteration of the Cantor–Bendixson derivative through the ordinals and the construction of the well-founded semantics for finite predicate logic programs via Van Gelder's alternating fixpoint construction. This connection allows us to transfer known complexity results for the perfect kernel of $\Pi^0_1$ classes to give new complexity results for various questions about the well-founded semantics ${\mathit{wfs}}(P)$ of a finite predicate logic program P.
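As a small, hedged sketch of Van Gelder's alternating fixpoint mentioned above (propositional case only; the encoding and names are mine): gamma(J) is the least model of the Gelfond-Lifschitz reduct with respect to J, the least fixpoint of gamma applied twice collects the true atoms, and one further application of gamma separates the undefined atoms from the false ones.

def least_model(definite_rules):
    """Least model of a definite propositional program given as (head, body) pairs."""
    model, changed = set(), True
    while changed:
        changed = False
        for head, body in definite_rules:
            if body <= model and head not in model:
                model.add(head)
                changed = True
    return model

def gamma(rules, J):
    """Least model of the Gelfond-Lifschitz reduct of `rules` (triples head, pos, neg) w.r.t. J."""
    return least_model([(h, pos) for h, pos, neg in rules if not (neg & J)])

def well_founded(rules):
    atoms = set().union(*({h} | p | n for h, p, n in rules))
    true = set()
    while True:                                   # iterate gamma(gamma(.)) up to its least fixpoint
        nxt = gamma(rules, gamma(rules, true))
        if nxt == true:
            break
        true = nxt
    not_false = gamma(rules, true)
    return true, not_false - true, atoms - not_false   # (true, undefined, false)

# a :- not b.   b :- not a.   c :- not d.   (a, b undefined; c true; d false)
rules = [("a", set(), {"b"}), ("b", set(), {"a"}), ("c", set(), {"d"})]
print(well_founded(rules))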

12.
We continue our investigations and study automated theorem proving for reasoning about the perception of reasoning agents and their consensus reaching. Using our earlier techniques and those of logic programming, we develop processing techniques for consensus programs. Work partially supported by Polish Government grant KBN 2 2051 91 02. Work partially supported by U.S. National Science Foundation grant IRI-9012902.

13.
In this paper we consider the problem of using disk blocks efficiently in searching graphs that are too large to fit in internal memory. Our model allows a vertex to be represented any number of times on the disk in order to take advantage of redundancy. We give matching upper and lower bounds for complete d-ary trees and d-dimensional grid graphs, as well as for classes of general graphs that, intuitively speaking, have a close to uniform number of neighbors around each vertex. We also show that, for the special case of grid graphs blocked with isothetic hypercubes, there is a provably better speed-up if even a small amount of redundancy is permitted. Support was provided in part by an IBM Graduate Fellowship, by NSF Research Grants CCR-9007851 and IRI-9116451, and by Army Research Office Grant DAAL03-91-G-0035. Support was provided in part by NSF Grants CCR-9003299, CCR-9300079, and IRI-9116843, and by NSF/DARPA Grant CCR-8908092. Support was provided in part by a National Science Foundation Presidential Young Investigator Award CCR-9047466 with matching funds from IBM, by NSF Research Grant CCR-9007851, and by Army Research Office Grant DAAL03-91-G-0035.

14.
We show that the algorithm directly induced by the viability definition in Ref. [4] does not terminate in general. As a consequence, RUE-resolution in strong form is not complete. Moreover, we show that ground query processing for covered pure logic programs can be reduced to computing viability. Since the problem of ground query processing is strictly recursively enumerable even under the above restrictions, it follows that the notion of viability is undecidable. Finally, we present a modified viability check that solves the non-termination problem for ground terms. Work supported in part by NSF grants IRI-9015251 and IRI-9109755 and by Army Research Office grant DAAL-03-92-G-0225.

15.
This article addresses the problem of indexing and retrieving first-order predicate calculus terms in the context of automated deduction programs. The four retrieval operations of concern are to find variants, generalizations, instances, and terms that unify with a given term. Discrimination-tree indexing is reviewed, and several variations are presented. The path-indexing method is also reviewed. Experiments were conducted on large sets of terms to determine how the properties of the terms affect the performance of the two indexing methods. Results of the experiments are presented. This work was supported by the Applied Mathematical Sciences subprogram of the Office of Energy Research, U.S. Department of Energy, under Contract W-31-109-Eng-38.
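As a hedged, much simplified sketch of discrimination-tree indexing (the term encoding, flattening scheme, and restriction to generalization retrieval are assumptions of this example, not the article's implementation): terms are flattened in preorder with variables collapsed to '*', stored in a trie, and retrieval returns candidates that must still be confirmed by a real matching step, since the filter ignores repeated variables.

def flatten(term):
    """Preorder flattening: compound terms are tuples (functor, args...),
    variables are uppercase strings and collapse to '*', constants keep arity 0."""
    if isinstance(term, tuple):
        out = [(term[0], len(term) - 1)]
        for arg in term[1:]:
            out.extend(flatten(arg))
        return out
    if isinstance(term, str) and term[:1].isupper():
        return ["*"]
    return [(term, 0)]

def skip_subterm(symbols, i):
    """Index just past the flattened subterm starting at position i."""
    need = 1
    while need:
        sym = symbols[i]
        need += (sym[1] if isinstance(sym, tuple) else 0) - 1
        i += 1
    return i

class DiscriminationTree:
    def __init__(self):
        self.root = {}

    def insert(self, term):
        node = self.root
        for sym in flatten(term):
            node = node.setdefault(sym, {})
        node.setdefault(None, []).append(term)        # stored terms live under the None key

    def generalizations(self, query):
        """Candidate terms that may generalize `query` (an imperfect filter)."""
        symbols, results = flatten(query), []

        def walk(node, i):
            if i == len(symbols):
                results.extend(node.get(None, []))
                return
            if symbols[i] == "*":                     # query variable: only an indexed variable matches
                if "*" in node:
                    walk(node["*"], i + 1)
                return
            if symbols[i] in node:                    # exact symbol match
                walk(node[symbols[i]], i + 1)
            if "*" in node:                           # indexed variable swallows a whole query subterm
                walk(node["*"], skip_subterm(symbols, i))

        walk(self.root, 0)
        return results

index = DiscriminationTree()
index.insert(("f", "X", ("g", "a")))                  # f(X, g(a))
index.insert(("f", ("g", "b"), "Y"))                  # f(g(b), Y)
print(index.generalizations(("f", "c", ("g", "a"))))  # -> [('f', 'X', ('g', 'a'))]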

16.
Przymusinski extended the notion of stratified logic programs, developed by Apt, Blair and Walker, and by van Gelder, to stratified databases that allow both negative premises and disjunctive consequents. However, he did not provide a fixpoint theory for this class of databases. On the other hand, although a fixpoint semantics has been developed by Minker and Rajasekar for non-Horn logic programs, it is tantamount to traditional minimal model semantics, which is not sufficient to capture the intended meaning of negation in the premises of clauses in stratified databases. In this paper, a fixpoint approach to stratified databases is developed which corresponds to the perfect model semantics. Moreover, algorithms are proposed for computing the set of perfect models of a stratified database.

17.
We define a class of function-free rule-based production system (PS) programs that exhibit non-deterministic and/or causal behavior. We develop a fixpoint semantics and an equivalent declarative semantics for these programs. The criterion to recognize the class of non-deterministic causal (NDC) PS programs is based upon extending and relaxing the concept of stratification to partition the rules of the program. Unlike strict stratification, this relaxed stratification criterion allows a more flexible partitioning of the rules and admits programs whose execution is non-deterministic or causal or both. The fixpoint semantics is based upon a monotonic fixpoint operator which guarantees that the execution of the program will terminate. Each fixpoint corresponds to a minimal database of answers for the NDC PS program. Since the execution of the program is non-deterministic, several fixpoints may be obtained. To obtain a declarative meaning for the PS program, we associate a normal logic program with each NDC PS program. We use the generalized disjunctive well-founded semantics to provide a meaning to this normal logic program. Through these semantics, a well-founded state is associated with the program, and a set of possible extensions, each of which is a minimal model of the well-founded state, is obtained. We show that the fixpoint semantics for NDC PS programs is sound and complete with respect to the declarative semantics of the corresponding normal logic program. This research is partially sponsored by the National Science Foundation under grant IRI-9008208 and by the Institute for Advanced Computer Studies.

18.
The Gröbner basis method is a powerful tool in automated geometry theorem proving. Normally, one works in the ring of coordinates of the points in a particular configuration. Tim Havel has suggested using instead the ring of interpoint squared distances, because it is the invariant subring under the group of Euclidean isometries. One difficulty with this approach is that it is not always clear how to express some invariants in terms of squared distances. To that end, we present a new straightening algorithm for Euclidean invariants. We also prove the first and second fundamental theorems of vector invariants for the group of Euclidean isometries (that the invariant subring is a finitely generated algebra over the reals, and that it can be expressed as a polynomial ring modulo a finitely generated ideal, respectively). Another difficulty is that the ring of interpoint squared distances must be represented as the quotient of a polynomial ring by an ideal. Unfortunately, no canonical Gröbner basis for this ideal is known. We present a candidate for such a basis and prove that it is a basis in some cases. This work was supported by the U.S. Army Research Office through the ACSyAm branch of the Mathematical Sciences Institute of Cornell University, Contract DAAL03-91-C-0027.
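A standard identity (not specific to this paper) underlying such rewritings is the polarization formula: writing $d_{kl}^2 = \|p_k - p_l\|^2$ for the interpoint squared distances, one has $(p_i - p_0)\cdot(p_j - p_0) = \frac{1}{2}\,(d_{0i}^2 + d_{0j}^2 - d_{ij}^2)$, so inner products of difference vectors, the basic building blocks of Euclidean vector invariants, translate directly into the ring of squared distances.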

19.
This article is the twenty-fourth of a series of articles discussing various open research problems in automated reasoning. The problem proposed for research asks one to find an appropriate theory for demodulating across argument and across literal boundaries. Because demodulation has proved so useful—in most cases, even crucial—to automated reasoning, extending this concept to permit canonicalization to be applied at the predicate and at the clause and subclause levels merits exploration. For evaluating a proposed solution to this research problem, we suggest problems from mathematics, logic, program verification, database inquiry, and the world of puzzles. This work was supported by the Applied Mathematical Sciences subprogram of the Office of Energy Research, U.S. Department of Energy, under Contract W-31-109-Eng-38.
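For readers unfamiliar with demodulation, the sketch below (my illustration, not from the article) applies demodulators as left-to-right rewrite rules inside a term until no rule applies; the research problem above asks how to lift exactly this kind of canonicalization from the argument level to literals and whole clauses. The term encoding, rule set, and names are assumptions, and termination is simply taken for granted for the demodulators shown.

def match(pattern, term, subst):
    """One-way matching of pattern against term, extending subst or returning None.
    Variables are uppercase strings; compound terms are tuples (functor, args...)."""
    if isinstance(pattern, str) and pattern[:1].isupper():
        if pattern in subst:
            return subst if subst[pattern] == term else None
        return {**subst, pattern: term}
    if isinstance(pattern, tuple) and isinstance(term, tuple) \
            and pattern[0] == term[0] and len(pattern) == len(term):
        for p_arg, t_arg in zip(pattern[1:], term[1:]):
            subst = match(p_arg, t_arg, subst)
            if subst is None:
                return None
        return subst
    return subst if pattern == term else None

def substitute(term, subst):
    if isinstance(term, str) and term in subst:
        return subst[term]
    if isinstance(term, tuple):
        return (term[0],) + tuple(substitute(a, subst) for a in term[1:])
    return term

def demodulate(term, demodulators):
    """Rewrite term and its subterms with the demodulators (lhs, rhs) until a fixpoint."""
    if isinstance(term, tuple):
        term = (term[0],) + tuple(demodulate(a, demodulators) for a in term[1:])
    for lhs, rhs in demodulators:
        s = match(lhs, term, {})
        if s is not None:
            return demodulate(substitute(rhs, s), demodulators)
    return term

# Hypothetical demodulators f(X, e) = X and i(e) = e applied to f(f(a, e), i(e)).
demods = [(("f", "X", "e"), "X"), (("i", "e"), "e")]
print(demodulate(("f", ("f", "a", "e"), ("i", "e")), demods))   # prints 'a'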

20.
This paper presents a parallel execution model for exploiting AND-parallelism in Horn Clause logic programs. The model is based upon the generator-consumer approach, and can be implemented efficiently with small run-time overhead. Other related models that have been proposed to minimize the run-time overhead are unable to exploit the full parallelism inherent in the generator-consumer approach. Furthermore, our model performs backtracking more intelligently than these models. We also present two implementation schemes to realize our model: one has a coordinator to control the activities of processes solving different literals in the same clause; and the other achieves synchronization by letting processes pass messages to each other in a distributed fashion. Trade-offs between these two schemes are then discussed. This work was supported by Army Research Office grant #DAAG29-84-K-0060 to the Artificial Intelligence Laboratory at the University of Texas at Austin.
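As a very small, hedged sketch of the generator-consumer idea (not the paper's execution model or its backtracking machinery): for a conjunction p(X), q(X), one worker generates candidate bindings for the shared variable and streams them to a second worker that checks the consumer literal. The fact tables, names, and use of Python threads are assumptions of this illustration.

import threading, queue

P_FACTS = {1, 2, 3, 4}           # hypothetical extension of p/1
Q_FACTS = {2, 4, 5}              # hypothetical extension of q/1
DONE = object()                  # sentinel marking the end of the binding stream

def generator(bindings):
    """Producer: solves the generator literal p(X) and streams bindings for X."""
    for x in P_FACTS:
        bindings.put(x)
    bindings.put(DONE)

def consumer(bindings, answers):
    """Consumer: checks the literal q(X) for each binding streamed by the generator."""
    while True:
        x = bindings.get()
        if x is DONE:
            return
        if x in Q_FACTS:
            answers.append(x)

channel, answers = queue.Queue(), []
threads = [threading.Thread(target=generator, args=(channel,)),
           threading.Thread(target=consumer, args=(channel, answers))]
for t in threads: t.start()
for t in threads: t.join()
print(sorted(answers))           # answers to p(X), q(X): [2, 4]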
