1.
This paper deals with four solvers for combinatorial problems: the commercial state-of-the-art solver ILOG oplstudio, and the research answer set programming (ASP) systems dlv, smodels and cmodels. The first goal of this research is to evaluate the relative performance of such systems when used in a purely declarative way, using a reproducible and extensible experimental methodology. In particular, we consider a third-party problem library, i.e., the CSPLib, and uniform rules for modelling and instance selection. The second goal is to analyze the marginal effects of popular reformulation techniques on the various solving technologies. In particular, we consider structural symmetry breaking, the adoption of global constraints, and the addition of auxiliary predicates. Finally, we evaluate, on a subset of the problems, the impact of numbers and arithmetic constraints on the different solving technologies. Results show that no single solver wins on all problems, and that reformulation is almost always beneficial: symmetry breaking can be a good choice, but the complexity of the symmetry-breaking constraints must be chosen carefully, also taking the particular solver used into account. Global constraints often, but not always, help opl, and the addition of auxiliary predicates is usually worthwhile, especially when dealing with ASP solvers. Moreover, interesting synergies among the various modelling techniques exist.
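To make the idea of symmetry breaking concrete, here is a minimal, hypothetical sketch (not taken from the paper, which works with opl and ASP models): a brute-force toy graph-colouring CSP in Python whose colours are interchangeable, with and without a value-symmetry-breaking constraint.

```python
from itertools import product

# Toy CSP: colour a triangle's 3 nodes with 3 colours so adjacent nodes differ.
edges = [(0, 1), (1, 2), (0, 2)]
n_nodes, n_colors = 3, 3

def is_solution(coloring, break_symmetry=False):
    if any(coloring[u] == coloring[v] for u, v in edges):
        return False
    if break_symmetry:
        # Value-symmetry breaking: colours are interchangeable, so force
        # their first occurrences to appear in increasing order.
        seen = []
        for c in coloring:
            if c not in seen:
                seen.append(c)
        return seen == sorted(seen)
    return True

full = [c for c in product(range(n_colors), repeat=n_nodes) if is_solution(c)]
broken = [c for c in product(range(n_colors), repeat=n_nodes)
          if is_solution(c, break_symmetry=True)]
print(len(full), len(broken))  # 6 1
```

On this triangle instance the constraint cuts the six mutually symmetric solutions down to a single canonical one; in a real solver the same constraint prunes the corresponding subtrees of the search.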
2.
The high computational complexity of advanced reasoning tasks, such as reasoning about knowledge and planning, calls for efficient and reliable algorithms for reasoning problems harder than NP. In this paper we propose Evaluate, an algorithm for evaluating quantified Boolean formulae (QBFs). Algorithms for the evaluation of QBFs are suitable for the experimental analysis of problems belonging to a wide range of complexity classes, a property not easily found in other formalisms. Evaluate is a generalization of the Davis–Putnam procedure for SAT and is guaranteed to work in polynomial space. Before presenting the algorithm, we discuss several abstract properties of QBFs that we singled out to make the algorithm more efficient. We also discuss the various options we investigated for heuristics and data structures, and report the main results of the experimental analysis. In particular, Evaluate is orders of magnitude more efficient than a nested backtracking procedure that resorts to a Davis–Putnam algorithm for handling the innermost set of quantifiers. Moreover, experiments show that randomly generated QBFs exhibit regular patterns such as phase transition and easy-hard-easy distribution.
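For intuition, here is a naive recursive QBF evaluator in Python. It shares the polynomial-space recursion scheme described above, but omits the Davis–Putnam-style simplifications and heuristics that make Evaluate itself efficient, so treat it as a baseline sketch rather than the paper's algorithm.

```python
from typing import Dict, List, Tuple

def evaluate_qbf(prefix: List[Tuple[str, int]], clauses: List[List[int]],
                 assignment: Dict[int, bool] = None) -> bool:
    """Recursively evaluate a closed prenex-CNF QBF in polynomial space.

    prefix: quantifier blocks as (quantifier, variable) pairs, e.g.
            [('A', 1), ('E', 2)] for "forall x1 exists x2".
    clauses: CNF matrix; positive/negative integers denote literals.
    Assumes the prefix quantifies every variable in the matrix.
    """
    if assignment is None:
        assignment = {}
    if not prefix:
        # Matrix evaluation: every clause must contain a true literal.
        return all(
            any((lit > 0) == assignment[abs(lit)] for lit in clause)
            for clause in clauses
        )
    quantifier, var = prefix[0]
    results = []
    for value in (True, False):
        assignment[var] = value
        results.append(evaluate_qbf(prefix[1:], clauses, assignment))
    del assignment[var]
    # 'A' requires both branches to succeed, 'E' at least one.
    return all(results) if quantifier == 'A' else any(results)

# forall x1 exists x2 . (x1 or x2) and (not x1 or not x2)
print(evaluate_qbf([('A', 1), ('E', 2)], [[1, 2], [-1, -2]]))  # True
```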
3.
Default logic as a query language
Research in nonmonotonic reasoning has focused largely on the idea of representing knowledge about the world via rules that are generally true but can be defeated. Although relational databases are nowadays the main tool for storing very large sets of data, the approach of using nonmonotonic AI formalisms as relational database query languages has been investigated to a much smaller extent. In this work, we propose a novel application of Reiter's default logic by introducing a default query language (DQL) for finite relational databases, which is based on default rules. The main result of this paper is that DQL is as expressive as SO∃∀, the existential-universal fragment of second-order logic. This result is not only of theoretical importance: we exhibit queries, useful in practice, that can be expressed in DQL but not in other query languages based on nonmonotonic logics, such as DATALOG with negation under the stable model semantics. In particular, we show that DQL is well suited for diagnostic reasoning.
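As a worked example of the diagnostic flavour of default rules (a standard illustration from the diagnosis literature, not a verbatim DQL query from the paper), consider Reiter's default

$$\frac{\mathit{comp}(x) \;:\; \neg\,\mathit{ab}(x)}{\neg\,\mathit{ab}(x)}$$

which reads: if $x$ is a component and it is consistent to assume that $x$ is not abnormal, conclude that $x$ is not abnormal. The extensions of such a default theory correspond to maximal sets of components assumed to behave correctly, which is exactly the kind of query a default-based language is suited to express.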
4.
Many formalisms for reasoning about knowing commit an agent to be logically omniscient. Logical omniscience is an unrealistic principle on which to build a real-world agent, since it commits the agent to knowing infinitely many things. A number of formalizations of knowledge have been developed that do not ascribe logical omniscience to agents. With few exceptions, these approaches are modifications of the possible-worlds semantics. In this paper we use a combination of several general techniques for building non-omniscient reasoners. First, we provide for the explicit representation of notions such as problems, solutions, and problem-solving activities, notions which are usually left implicit in discussions of autonomous agents. A second technique is to take the notion of resource explicitly into account when we formalize reasoning principles. We use the notion of resource to describe interesting principles of reasoning that are used for ascribing knowledge to agents. For us, resources are abstract objects. We make extensive use of ordering and inaccessibility relations on resources, but we do not find it necessary to define a metric. Using principles about resources without using a metric is one of the strengths of our approach. We describe the architecture of a reasoner, built from a finite number of components, that solves a puzzle involving reasoning about knowing by explicitly using the notion of resource. Our approach allows the use of axioms about belief ordinarily used in problem solving, such as axiom K of modal logic, without being forced to attribute logical omniscience to any agent. In particular, we address the issue of how we can use resource-unbounded (e.g., logically omniscient) reasoning to attribute knowledge to others without introducing contradictions. We do this by showing how omniscient reasoning can be introduced as a conservative extension over resource-bounded reasoning.
5.
Multivalued logics have a long tradition in the philosophy and logic literature, originating in the work of Łukasiewicz in the 1920s. More recently, many AI researchers have been interested in this topic for both semantic and computational reasons. Multivalued logics have indeed been frequently used both for their semantic properties and as tools for designing tractable reasoning systems. We focus here on the computational aspects of multivalued logics. The main result of this paper is a detailed picture of the impact that the semantic definition, the syntactic form and the assumptions on the relative sizes of the inputs have on the complexity of entailment checking. In particular, we show new polynomial cases and generalize polynomial cases already known in the literature for various popular multivalued logics. Such polynomial cases are obtained by means of two simple algorithms, sharing a common method.
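As a minimal sketch of what "multivalued" means computationally, here are the connectives of Kleene's strong three-valued logic in Python, one of the popular logics such complexity analyses cover (the encoding of truth values as 0, 1/2, 1 is an illustrative choice, not the paper's).

```python
# Truth values: 0 = false, 0.5 = unknown (U), 1 = true.
U = 0.5

def k_not(a):        return 1 - a
def k_and(a, b):     return min(a, b)
def k_or(a, b):      return max(a, b)
def k_implies(a, b): return max(1 - a, b)  # material implication, not-a or b

# Example: "p and (not p)" is not outright false when p is unknown,
# which is why entailment checking must track the middle value.
for p in (0, U, 1):
    print(p, k_and(p, k_not(p)))   # 0 -> 0, 0.5 -> 0.5, 1 -> 0
```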
6.
The efficiency of systems for constraint programming (CP) is currently highly affected by the actual formulation of the input problem. Modelers therefore have to make several choices in order to write efficient specifications and handle instances of realistic size, and this, of course, represents a major obstacle to reaching full declarativeness. Several structural properties of problem specifications have been investigated in order to provide techniques that reformulate a constraint program into one that the solver at hand can evaluate more efficiently. In this paper we consider two such properties, symmetries and functional dependencies among variables, and show that, by characterizing problem specifications as logical formulae, the task of deciding whether such properties hold, and consequently that of performing the relevant reformulations, can be practically mechanized by means of automated theorem proving (ATP) technology. In particular, we report results on using ATP technology for checking the existence of symmetries, checking whether a given constraint is symmetry-breaking, and checking the existence of functional dependencies in a specification. The output of the reasoning phase is a transformed constraint program, consisting of a reformulated specification and, possibly, a search strategy. We demonstrate our techniques on problems such as graph coloring, Sailco inventory, and protein folding.
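The proof obligation an ATP would discharge here can be illustrated by brute force on a tiny instance. The following Python sketch is only a stand-in for the paper's ATP-based check: it verifies exhaustively that swapping two colours is a solution-preserving symmetry of a small graph-colouring specification (all names are illustrative).

```python
from itertools import product

# Tiny specification: properly 2-colour the path 0 - 1 - 2.
edges = [(0, 1), (1, 2)]
n_nodes, n_colors = 3, 2

def ok(coloring):
    return all(coloring[u] != coloring[v] for u, v in edges)

def swap01(coloring):
    # Candidate symmetry: exchange colours 0 and 1 everywhere.
    return tuple({0: 1, 1: 0}.get(c, c) for c in coloring)

# The symmetry property: the swap maps solutions to solutions and
# non-solutions to non-solutions, for every assignment.
assert all(ok(swap01(c)) == ok(c)
           for c in product(range(n_colors), repeat=n_nodes))
print("colour swap 0<->1 is a symmetry of the specification")
```

On specifications of realistic size this exhaustive check is infeasible, which is precisely why the paper phrases the property as a logical formula and hands it to an ATP.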
7.
Propositional greatest lower bounds (GLBs) are logically defined approximations of a knowledge base. They were defined in the context of knowledge compilation, a technique developed to address the high computational cost of logical inference. A GLB allows for polynomial-time complete on-line reasoning, although soundness is not guaranteed. In this paper we propose new algorithms for the generation of a GLB. Furthermore, we give a precise characterization of the computational complexity of the problem of generating such lower bounds, thus addressing in a formal way the question "how many queries are needed to amortize the overhead of compilation?"
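To see why compilation pays off on-line, here is a hedged Python sketch assuming a Horn GLB has already been compiled off-line from some knowledge base (the GLB itself and all names are hypothetical): query answering reduces to forward chaining, which runs in polynomial time.

```python
def horn_entails(horn_kb, query):
    """Forward chaining over a propositional Horn theory.

    horn_kb: list of (body, head) rules; facts have an empty body.
    Polynomial time, which is what makes on-line reasoning cheap
    once the KB has been compiled into a Horn bound.
    """
    derived = set()
    changed = True
    while changed:
        changed = False
        for body, head in horn_kb:
            if head not in derived and all(b in derived for b in body):
                derived.add(head)
                changed = True
    return query in derived

# Hypothetical Horn GLB, compiled off-line from some KB:
glb = [([], "a"), (["a"], "b"), (["a", "b"], "c")]
print(horn_entails(glb, "c"))  # True  (may be unsound w.r.t. the KB)
print(horn_entails(glb, "d"))  # False (complete: the KB does not entail d)
```

The asymmetry in the comments mirrors the abstract: since the GLB is a lower bound, a negative answer is definitive for the original KB, while a positive one may be a false positive.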
8.
Local search is an emerging paradigm for combinatorial search which has recently been shown to be very effective for a large number of combinatorial problems. It is based on the idea of navigating the search space by iteratively stepping from one solution to one of its neighbors, which are obtained by applying a simple local change to it. In this paper we present LOCAL++, an object-oriented framework to be used as a general tool for the development and implementation of local search algorithms in C++. The framework comprises a hierarchy of abstract template classes, one for each local search technique taken into account (i.e. hill-climbing, simulated annealing and tabu search). Each class specifies and implements the invariant part of the algorithm built according to the technique, and is meant to be specialized by a concrete class once a given search problem is considered, so as to implement the problem-dependent part of the algorithm. LOCAL++ also comprises a set of abstract classes for creating new techniques by combining different search techniques and different neighborhood relations. The architecture of LOCAL++ provides a principled modularization for the solution of combinatorial search problems, and helps the designer derive a neat conceptual scheme of the application, thus facilitating the development and debugging phases. LOCAL++ proved to be flexible enough for implementing algorithms that solve various scheduling problems.
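LOCAL++ itself is a C++ template library; the following Python sketch only mirrors the split it describes between the invariant, technique-specific part (here, a hill-climbing run loop) and the problem-dependent hooks supplied by a concrete subclass. All class and method names are illustrative, not the framework's API.

```python
import random
from abc import ABC, abstractmethod

class LocalSearch(ABC):
    """Invariant part of a local search technique; subclasses
    supply the problem-dependent part via the abstract hooks."""

    def __init__(self, max_steps=1000):
        self.max_steps = max_steps

    @abstractmethod
    def initial_solution(self): ...
    @abstractmethod
    def random_neighbor(self, solution): ...
    @abstractmethod
    def cost(self, solution): ...

    def accept(self, current, neighbor):
        # Hill-climbing acceptance; simulated annealing or tabu search
        # would override this with their own criteria.
        return self.cost(neighbor) <= self.cost(current)

    def run(self):
        current = self.initial_solution()
        for _ in range(self.max_steps):
            neighbor = self.random_neighbor(current)
            if self.accept(current, neighbor):
                current = neighbor
        return current

class MaxOnes(LocalSearch):
    """Problem-dependent part: maximize the ones in a 20-bit string."""
    def initial_solution(self):
        return [random.randint(0, 1) for _ in range(20)]
    def random_neighbor(self, s):
        t = list(s); i = random.randrange(len(t)); t[i] ^= 1; return t
    def cost(self, s):
        return s.count(0)

print(MaxOnes().run().count(1))  # typically at or near 20
```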
9.
Some computationally hard problems, e.g., deduction in logical knowledge bases, are such that part of an instance is known well before the rest of it, and remains the same for several subsequent instances of the problem. In these cases, it is useful to preprocess this known part off-line so as to simplify the remaining on-line problem. In this paper we investigate such a technique in the context of intractable, i.e., NP-hard, problems. Recent results in the literature show that not all NP-hard problems behave in the same way: for some of them, preprocessing yields polynomial-time on-line simplified problems (we call them compilable), while for others compilability implies consequences that are considered unlikely. Our primary goal is to provide a sound methodology that can be used to either prove or disprove that a problem is compilable. To this end, we define new models of computation, complexity classes, and reductions. We find complete problems for such classes, where "completeness" means they are the least likely to be compilable. We also investigate preprocessing that does not yield polynomial-time on-line algorithms, but generically "decreases" complexity. This leads us to define "hierarchies of compilability" that are the analog of the polynomial hierarchy. A detailed comparison of our framework to the idea of "parameterized tractability" shows the differences between the two approaches.
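A minimal sketch of the off-line/on-line split the paper formalizes, using a deliberately easy stand-in problem (the paper's interest is in NP-hard ones): the fixed part of the instance is a graph, compiled once into its transitive closure, after which each on-line reachability query costs a single table lookup. All names are illustrative.

```python
from itertools import product

def compile_reachability(nodes, edges):
    """Off-line phase: Floyd-Warshall-style transitive closure.
    Expensive, but paid once for the fixed part of the instance."""
    reach = {(u, v): (u == v or (u, v) in edges)
             for u, v in product(nodes, nodes)}
    for k, i, j in product(nodes, repeat=3):  # k varies slowest
        reach[i, j] = reach[i, j] or (reach[i, k] and reach[k, j])
    return reach

nodes = range(4)
table = compile_reachability(nodes, {(0, 1), (1, 2)})  # off-line, done once
print(table[0, 2], table[2, 0])  # True False -- each on-line query is O(1)
```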