Similar Literature
20 similar documents found.
1.
We study the declarative formalization of reasoning strategies by presenting declarative formalizations of (1) the SNLP algorithm for nonlinear planning and (2) a particular algorithm for blocks-world nonlinear planning proposed in this paper. We propose formal models of a heuristic forward-chaining planner, which can take advantage of declarative formalizations of action-selection strategies, and of a reasoning-strategy-based planner, which can use declarative formalizations of reasoning strategies. The effectiveness of these systems is studied from formal and empirical points of view. Empirical results show how the use of declarative formalizations of reasoning strategies can reduce the amount of search required to solve planning problems relative to state-of-the-art planning systems.
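To make the idea concrete, here is a minimal Python sketch (not the paper's logical formalization) of a forward-chaining planner whose action selection is delegated to a pluggable, declaratively stated strategy. The tiny blocks-world actions and the prefer_goal_achievers strategy are hypothetical examples.

    from collections import namedtuple

    # A STRIPS-like action: name, preconditions, add list, delete list.
    Action = namedtuple("Action", "name pre add delete")

    def applicable(state, action):
        return action.pre <= state

    def apply_action(state, action):
        return (state - action.delete) | action.add

    def forward_plan(init, goal, actions, strategy, depth=10):
        """Depth-first forward search; `strategy` declaratively ranks the
        applicable actions in each state (smaller rank = tried first)."""
        def search(state, plan):
            if goal <= state:
                return plan
            if len(plan) >= depth:
                return None
            candidates = [a for a in actions if applicable(state, a)]
            for a in sorted(candidates, key=lambda a: strategy(state, goal, a)):
                result = search(apply_action(state, a), plan + [a.name])
                if result is not None:
                    return result
            return None
        return search(frozenset(init), [])

    # Hypothetical strategy: prefer actions that add a goal literal.
    def prefer_goal_achievers(state, goal, action):
        return 0 if action.add & goal else 1

    # Tiny blocks-world instance: get block a onto block b.
    actions = [
        Action("stack-b-on-a",
               frozenset({"clear(a)", "clear(b)", "ontable(b)"}),
               frozenset({"on(b,a)"}), frozenset({"clear(a)", "ontable(b)"})),
        Action("stack-a-on-b",
               frozenset({"clear(a)", "clear(b)", "ontable(a)"}),
               frozenset({"on(a,b)"}), frozenset({"clear(b)", "ontable(a)"})),
    ]
    init = {"clear(a)", "clear(b)", "ontable(a)", "ontable(b)"}
    goal = frozenset({"on(a,b)"})
    print(forward_plan(init, goal, actions, prefer_goal_achievers))  # ['stack-a-on-b']

With the goal-achiever strategy the correct action is tried first and no backtracking is needed; swapping in a different strategy function changes the search behavior without touching the planner itself.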

2.
Ontologies, Frames, and Logical Theories in NKI
眭跃飞, 高颖, 曹存根. 《软件学报》 (Journal of Software), 2005, 16(12): 2045-2053
NKI (the National Knowledge Infrastructure) is a large-scale knowledge base that represents the concepts of its ontologies with frames and uses Horn logic programs for automated reasoning. This paper gives formal representations of the ontologies, frames, and logical theories in NKI, together with the transformations between these representations, and proves that if the ontologies, frames, and logical theories are viewed as three categories, then these transformations are functors between the three categories. This result guarantees that, in NKI, reasoning based on Horn logic programs is sound with respect to the knowledge base represented by ontologies and frames.
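As a loose illustration of compiling frame-represented concepts into Horn clauses (a hypothetical sketch, not NKI's actual representation or its categorical treatment), a frame's superclass link and slot fillers might be translated as follows:

    def frame_to_horn(frame):
        """Compile a frame (concept, superclass, slot fillers) into Horn
        clauses, written here as plain strings 'head :- body.'."""
        c = frame["concept"]
        clauses = []
        if "isa" in frame:                      # superclass link -> subsumption rule
            clauses.append(f"{frame['isa']}(X) :- {c}(X).")
        for slot, filler in frame.get("slots", {}).items():
            clauses.append(f"{slot}(X, {filler}) :- {c}(X).")
        return clauses

    bird = {"concept": "bird",
            "isa": "animal",
            "slots": {"covered_by": "feathers", "reproduces_by": "eggs"}}
    for clause in frame_to_horn(bird):
        print(clause)
    # Given the fact bird(tweety), a Horn-clause reasoner can then derive,
    # e.g., animal(tweety) and covered_by(tweety, feathers).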

3.
UML lacks precise semantics, which makes it difficult to formally analyze the systems it models or to check them for consistency. To allow UML to describe system models more precisely, researchers have proposed a number of formalization methods. This paper compares and analyzes approaches that formalize UML statecharts with Petri nets, the temporal logic XYZ/E, and dynamic description logic, and points out their respective advantages, disadvantages, and application domains.

4.
Geometric problems defined by constraints can be represented by geometric constraint graphs whose nodes are geometric elements and whose arcs represent geometric constraints. Reduction and decomposition are techniques commonly used to analyze geometric constraint graphs in geometric constraint solving. In this paper we first introduce the concept of the deficit of a constraint graph. Then we give a new formalization of the decomposition algorithm due to Owen. This new formalization is based on preserving the deficit rather than on computing triconnected components of the graph, and is simpler. Finally we apply tree decompositions to prove that the class of problems solved by the formalizations studied here and by other formalizations reported in the literature is the same.
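As a rough sketch of the kind of degree-of-freedom counting behind the deficit of a 2D geometric constraint graph (the paper's precise definition may differ in details such as the offset for rigid motions), the free degrees of freedom of the geometric elements are tallied against those removed by the constraints:

    # Degrees of freedom of 2D geometric elements and degrees of freedom
    # removed by common constraints; the counts for points and lines are
    # standard, but the paper's exact definition of "deficit" may differ.
    DOF = {"point": 2, "line": 2}
    CONSTRAINT_DOF = {"distance": 1, "angle": 1, "incidence": 1,
                      "perpendicular": 1, "parallel": 1}

    def deficit(nodes, edges, rigid_dof=3):
        """nodes: {name: kind}; edges: [(n1, n2, constraint_kind), ...]."""
        free = sum(DOF[kind] for kind in nodes.values())
        removed = sum(CONSTRAINT_DOF[kind] for _, _, kind in edges)
        return free - removed - rigid_dof     # 3 = rigid motions of the plane

    # A triangle: three points constrained by three pairwise distances.
    nodes = {"p1": "point", "p2": "point", "p3": "point"}
    edges = [("p1", "p2", "distance"),
             ("p2", "p3", "distance"),
             ("p1", "p3", "distance")]
    print(deficit(nodes, edges))   # 0: structurally well constrained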

5.
The aim of this study is to look at the syntactic calculus of Bar-Hillel and Lambek, including its semantic interpretation, from the point of view of constructive type theory. The syntactic calculus is given a formalization that makes it possible to implement it in a type-theoretical proof editor. Such an implementation combines formal syntax and formal semantics, and makes the type-theoretical tools of automatic and interactive reasoning available in grammar. In the formalization, the use of the dependent types of constructive type theory is essential. Dependent types are already needed in the semantics of ordinary Lambek calculus, but they also suggest some natural extensions of the calculus, which are applied to the treatment of morphosyntactic dependencies and to an analysis of selectional restrictions. Finally, directed dependent function types are introduced, corresponding to the types of constructive type theory. Two alternative formalizations are given: one using syntax trees, like Montague grammar, and one dispensing with them, like the theory called minimalistic by Morrill. The syntax-tree approach is presented as the main alternative, because it makes it possible to embed the calculus in a more extensive Montague-style grammar.
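Setting aside the dependently typed and semantic side of the paper, the directed types of the syntactic calculus can be illustrated with a minimal, hypothetical Python sketch in which A/B seeks a B to its right and B\A seeks a B to its left:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Basic:
        name: str                  # e.g. np, s

    @dataclass(frozen=True)
    class Right:                   # A/B: yields A when a B follows
        result: object
        arg: object

    @dataclass(frozen=True)
    class Left:                    # B\A: yields A when a B precedes
        arg: object
        result: object

    def reduce_once(cats):
        """Apply one forward or backward application step, if any."""
        for i in range(len(cats) - 1):
            a, b = cats[i], cats[i + 1]
            if isinstance(a, Right) and a.arg == b:
                return cats[:i] + [a.result] + cats[i + 2:]
            if isinstance(b, Left) and b.arg == a:
                return cats[:i] + [b.result] + cats[i + 2:]
        return None

    def derives(cats, target):
        """Greedy reduction; sufficient for this toy example."""
        while len(cats) > 1:
            cats = reduce_once(cats)
            if cats is None:
                return False
        return cats[0] == target

    NP, S = Basic("np"), Basic("s")
    lexicon = {"John": NP,
               "sleeps": Left(NP, S),             # np\s
               "loves": Right(Left(NP, S), NP)}   # (np\s)/np
    print(derives([lexicon[w] for w in ["John", "loves", "John"]], S))  # True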

6.
We investigate two formalizations of Optimality Theory, a successful paradigm in linguistics. We first give an order-theoretic counterpart of the data and process involved in candidate evaluation: basically, we represent each constraint as a function that assigns every candidate a degree of violation. In the second formalization, we define (after Samek-Lodovici and Prince) constraints as operations that select the best candidates out of a set of candidates. We prove that these two formalizations are equivalent; accordingly, there is no loss of generality in using violation marks, and dispensing with them is only apparent. Importantly, we show that the second formalization is equivalent to a class of operations over sets of formulas in a given logical language. As a result, we prove that Optimality Theory can be characterized by certain cumulative logics, so applying Optimality Theory is shown to be reasoning by the rules of cumulative logics.
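A small, hypothetical Python sketch of the two formalizations: constraints as violation-counting functions compared lexicographically under a ranking, and constraints as operations that keep only the best candidates. On any finite candidate set the two select the same winners; the toy constraints below are made up for illustration, not real phonology.

    # Formalization 1: each constraint assigns every candidate a number of
    # violation marks; ranked constraints are compared lexicographically.
    def eval_by_marks(candidates, ranked_constraints):
        def profile(cand):
            return tuple(c(cand) for c in ranked_constraints)
        best = min(profile(cand) for cand in candidates)
        return {cand for cand in candidates if profile(cand) == best}

    # Formalization 2: each constraint acts on a candidate set, keeping only
    # the candidates with the fewest violations; the ranking is a pipeline.
    def eval_by_selection(candidates, ranked_constraints):
        survivors = set(candidates)
        for c in ranked_constraints:
            fewest = min(c(cand) for cand in survivors)
            survivors = {cand for cand in survivors if c(cand) == fewest}
        return survivors

    # Toy candidates and made-up constraints.
    candidates = {"ba", "bab", "ab", "a"}
    no_coda = lambda cand: cand.count("b") if cand.endswith("b") else 0
    onset = lambda cand: 0 if cand.startswith("b") else 1
    ranking = [no_coda, onset]            # no_coda outranks onset

    print(eval_by_marks(candidates, ranking))       # {'ba'}
    print(eval_by_selection(candidates, ranking))   # {'ba'}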

7.
Cognitive situation awareness has recently caught the attention of the information fusion community. Some approaches have developed formalizations that are both ontology-based and underpinned by Situation Theory. While the semantics of Situation Theory is very attractive from the cognitive point of view, the languages used to express knowledge and to reason with it suffer from a number of limitations concerning both expressiveness and reasoning capabilities. In this paper we propose a more general formal foundation, denoted S-DTT (Situation-based Dependent Type Theory), that is expressed in the language of the Extended Calculus of Constructions (ECC), a theory widely used in mathematical formalization and in software validation. Situation awareness relies on small blocks of knowledge called situation fragment types, whose composition leads to a very expressive and unifying theory. The semantic part is provided by an ontology that is rooted in the S-DTT theory and on which higher-order reasoning can be performed. The basis of the theory is summarized and its expressive power is illustrated with numerous examples. A scenario in the healthcare context for patient safety issues is detailed, and a comparison with well-known approaches is discussed.

8.
This article presents formalizations in higher-order logic of two proofs of Arrow’s impossibility theorem due to Geanakoplos. The Gibbard-Satterthwaite theorem is derived as a corollary. Lacunae found in the literature are discussed.

9.
石黎, 林仙. 《微计算机信息》 (Microcomputer Information), 2006, 22(35): 210-212
This paper explains the principles and forms of context reasoning, discusses the formalization of context reasoning, and gives an example of using MCS to formally represent and solve a problem.

10.
11.
Context: The Business Process Model and Notation (BPMN) standard informally defines a precise execution semantics. It defines how process instances should be updated in a model during execution. Existing formalizations of the standard are incomplete and rely on mappings to other languages. Objective: This paper provides a BPMN 2.0 semantics formalization that is more complete and intuitive than existing formalizations. Method: The formalization consists of in-place graph transformation rules that are documented visually using BPMN syntax. In-place transformations update models directly and do not require mappings to other languages. We have used a mature tool and test suite to develop a reference implementation of all rules. Results: Our formalization is a promising complement to the standard, in particular because all rules have been extensively verified and because conceptual validation is facilitated (the informal semantics also describes in-place updates). Conclusion: Since our formalization has already revealed problems with the standard, and since BPMN is still evolving, the maintainers of the standard can benefit from our results. Moreover, tool vendors can use our formalization and reference implementation for verifying conformance to the standard.
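As a very small, hypothetical analogue of an in-place update (the paper's rules are visual graph-transformation rules over the full BPMN metamodel, not Python), a token marking of a toy process model can be rewritten directly, without mapping the model to another language:

    # A toy process model whose token marking is updated in place, in the
    # spirit of in-place transformation rules.
    process = {
        "nodes": {"start": "startEvent", "approve": "task", "end": "endEvent"},
        "flows": [("start", "approve"), ("approve", "end")],
        "tokens": {"start": 1, "approve": 0, "end": 0},
    }

    def fire_sequence_flow(model, source, target):
        """Rule sketch: if the source node holds a token and a sequence flow
        source -> target exists, move one token along that flow (in place)."""
        tokens = model["tokens"]
        if (source, target) in model["flows"] and tokens.get(source, 0) > 0:
            tokens[source] -= 1
            tokens[target] = tokens.get(target, 0) + 1
            return True
        return False          # rule not applicable; the model is unchanged

    fire_sequence_flow(process, "start", "approve")
    fire_sequence_flow(process, "approve", "end")
    print(process["tokens"])   # {'start': 0, 'approve': 0, 'end': 1}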

12.
There are two approaches to formalizing the syntax of typed object languages in a proof assistant or programming language. The extrinsic approach is to first define a type that encodes untyped object expressions and then make a separate definition of typing judgements over the untyped terms. The intrinsic approach is to make a single definition that captures well-typed object expressions, so ill-typed expressions cannot even be expressed. Intrinsic encodings are attractive and naturally enforce the requirement that metalanguage operations on object expressions, such as substitution, respect object types. The price is that the metalanguage types of intrinsic encodings and operations involve non-trivial dependency, adding significant complexity. This paper describes intrinsic-style formalizations of both simply-typed and polymorphic languages, and basic syntactic operations thereon, in the Coq proof assistant. The Coq types encoding object-level variables (de Bruijn indices) and terms are indexed by both type and typing environment. One key construction is the boot-strapping of definitions and lemmas about the action of substitutions in terms of similar ones for a simpler notion of renamings. In the simply-typed case, this yields definitions that are free of any use of type equality coercions. In the polymorphic case, some substitution operations do still require type coercions, which we at least partially tame by uniform use of heterogeneous equality.
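The contrast can be illustrated outside Coq. The extrinsic approach, sketched below in Python for the simply typed lambda calculus with de Bruijn indices, first defines untyped terms and then a separate typing judgment over them; the intrinsic approach would instead index the type of terms by an object type and a typing environment, which requires the dependent (or at least GADT-style) types of a language like Coq and is not expressible in Python's type system. This is an illustrative sketch, not the paper's development.

    from dataclasses import dataclass

    # Untyped terms with de Bruijn indices (the extrinsic approach defines
    # these first, so ill-typed terms are representable).
    @dataclass(frozen=True)
    class Var:
        index: int

    @dataclass(frozen=True)
    class Lam:
        arg_type: object
        body: object

    @dataclass(frozen=True)
    class App:
        fn: object
        arg: object

    def type_of(term, env=()):
        """Separate typing judgment; env is a tuple of types, index 0 being
        the innermost binder.  Returns the type, or None if ill-typed."""
        if isinstance(term, Var):
            return env[term.index] if term.index < len(env) else None
        if isinstance(term, Lam):
            body_ty = type_of(term.body, (term.arg_type,) + env)
            return ("->", term.arg_type, body_ty) if body_ty else None
        if isinstance(term, App):
            fn_ty, arg_ty = type_of(term.fn, env), type_of(term.arg, env)
            if fn_ty and fn_ty[0] == "->" and fn_ty[1] == arg_ty:
                return fn_ty[2]
        return None

    identity = Lam("bool", Var(0))            # \x:bool. x
    print(type_of(identity))                  # ('->', 'bool', 'bool')
    print(type_of(App(identity, identity)))   # None: argument type mismatch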

13.
Formal verification methods have gained increased importance due to their ability to guarantee system correctness and improve reliability. Nevertheless, the question of how proofs are to be formalized in theorem provers is far from trivial, yet very important, since one needs to spend much more time on verification if the formalization is not cleverly chosen. In this paper, we develop and compare two different possibilities for expressing coinductive proofs in the theorem prover Isabelle/HOL. Coinduction is a proof method that allows for the verification of properties even of non-terminating state-transition systems. Since coinduction is not as widely used as other proof techniques such as induction, far fewer “recipes” are available for formalizing the corresponding proofs, and fewer proof strategies for coinduction are implemented in theorem provers. In this paper, we investigate formalizations for coinductive proofs of properties of state-transition sequences. In particular, we compare two different possibilities for their formalization and show their equivalence. The first of these two formalizations captures the mathematical intuition, while the second can be used more easily in a theorem prover. We have formally verified the equivalence of these criteria in Isabelle/HOL, thus establishing a coalgebraic verification framework. To demonstrate that our verification framework is suitable for the verification of compiler optimizations, we have introduced three different, rather simple transformations that capture typical problems in the verification of optimizing compilers, even for non-terminating source programs.

14.
We present a decision procedure that combines reasoning about datatypes and codatatypes. The dual of the acyclicity rule for datatypes is a uniqueness rule that identifies observationally equal codatatype values, including cyclic values. The procedure decides universal problems and is composable via the Nelson–Oppen method. It has been implemented in CVC4, a state-of-the-art SMT solver. An evaluation based on problems generated from formalizations developed with Isabelle demonstrates the potential of the procedure.
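As a rough illustration of the uniqueness (observational equality) rule for codatatype values, including cyclic ones (this is not CVC4's actual procedure), two values can be identified by searching for a bisimulation: assume the pair equal and check that constructors and arguments agree under that assumption.

    class Node:
        """A possibly cyclic value: a constructor label plus argument nodes."""
        def __init__(self, label, *args):
            self.label = label
            self.args = list(args)

    def obs_equal(a, b, assumed=frozenset()):
        """Coinductive equality: two values are equal iff some bisimulation
        relates them, so pairs under consideration are assumed equal."""
        if (id(a), id(b)) in assumed:
            return True                      # coinduction hypothesis
        if a.label != b.label or len(a.args) != len(b.args):
            return False
        assumed = assumed | {(id(a), id(b))}
        return all(obs_equal(x, y, assumed) for x, y in zip(a.args, b.args))

    # Two separately built cyclic streams 1,1,1,...: the uniqueness rule
    # identifies them, whereas the acyclicity rule for (inductive) datatypes
    # would reject such cyclic values outright.
    xs = Node("Cons", Node("1"))
    xs.args.append(xs)
    ys = Node("Cons", Node("1"))
    ys.args.append(ys)
    print(obs_equal(xs, ys))            # True
    print(obs_equal(xs, Node("Nil")))   # False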

15.
The inductive assertion method is generalized to permit formal, machine-verifiable proofs of correctness for multiprocess programs. Individual processes are represented by ordinary flowcharts, and no special synchronization mechanisms are assumed, so the method can be applied to a large class of multiprocess programs. A correctness proof can be designed together with the program by a hierarchical process of stepwise refinement, making the method practical for larger programs. The resulting proofs tend to be natural formalizations of the informal proofs that are now used.
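A simplified, hypothetical rendering of the underlying idea (a whole-state invariant checked by exhaustive exploration of interleavings, rather than the paper's flowchart-and-assertion formulation): attach an assertion to the control points of two interleaved processes and check mechanically that it holds initially and after every atomic transition of either process.

    # Two processes, each performing two atomic increments of a shared
    # counter.  A state is (pc1, pc2, counter), where each pc counts the
    # increments that process has completed.  The assertion attached to
    # every control point is counter == pc1 + pc2; at the final control
    # point it implies counter == 4.
    def transitions(state):
        pc1, pc2, counter = state
        if pc1 < 2:
            yield (pc1 + 1, pc2, counter + 1)   # atomic step of process 1
        if pc2 < 2:
            yield (pc1, pc2 + 1, counter + 1)   # atomic step of process 2

    def assertion(state):
        pc1, pc2, counter = state
        return counter == pc1 + pc2

    def check(initial):
        """Check the assertion holds initially and after every atomic
        transition, over all interleavings (brute-force exploration)."""
        if not assertion(initial):
            return False
        seen, frontier = set(), [initial]
        while frontier:
            state = frontier.pop()
            if state in seen:
                continue
            seen.add(state)
            for nxt in transitions(state):
                if not assertion(nxt):
                    return False
                frontier.append(nxt)
        return True

    print(check((0, 0, 0)))   # True: the assertion holds under all interleavings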

16.
To enhance the expressive power and the declarative ability of a deductive database, various CWA (Closed World Assumption) formalizations, including the naive CWA, the generalized CWA, and the careful CWA, are extended to multi-valued logics. The basic idea is to embed logic formulas into a polynomial ring. The extensions can be applied in a uniform manner to any finitely multi-valued logic, and are therefore also of computational significance.
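The classical two-valued case gives a feel for the "embed logic formulas into a polynomial ring" idea: over GF(2) the connectives become polynomials (x∧y → xy, ¬x → 1+x, x∨y → x+y+xy). The sketch below only sanity-checks this two-valued embedding against Boolean semantics; it does not attempt the paper's extension to finitely multi-valued logics.

    from itertools import product

    # Embed propositional formulas into polynomials over GF(2): truth values
    # live in {0, 1} and all arithmetic is taken modulo 2.
    def poly(formula, valuation):
        """Evaluate the polynomial image of a formula under a 0/1 valuation."""
        op = formula[0]
        if op == "var":
            return valuation[formula[1]]
        if op == "not":
            return (1 + poly(formula[1], valuation)) % 2
        if op == "and":
            return (poly(formula[1], valuation) * poly(formula[2], valuation)) % 2
        if op == "or":
            a, b = poly(formula[1], valuation), poly(formula[2], valuation)
            return (a + b + a * b) % 2
        raise ValueError(op)

    def boolean(formula, valuation):
        """Ordinary two-valued semantics, for comparison."""
        op = formula[0]
        if op == "var":
            return bool(valuation[formula[1]])
        if op == "not":
            return not boolean(formula[1], valuation)
        if op == "and":
            return boolean(formula[1], valuation) and boolean(formula[2], valuation)
        if op == "or":
            return boolean(formula[1], valuation) or boolean(formula[2], valuation)
        raise ValueError(op)

    f = ("or", ("var", "p"), ("and", ("not", ("var", "p")), ("var", "q")))
    for p, q in product((0, 1), repeat=2):
        v = {"p": p, "q": q}
        assert bool(poly(f, v)) == boolean(f, v)
    print("polynomial embedding agrees with Boolean semantics")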

17.
This article studies the problem of minimality and identifiability for switched autoregressive exogenous (SARX) systems. We propose formal definitions of the concepts of identifiability and minimality for SARX models. Based on these formalizations, we derive conditions for minimality and identifiability of SARX systems. In particular, we show that polynomially parameterized SARX systems are generically identifiable.
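To make the model class concrete, the following simulation sketch (with a known switching signal; the paper's identifiability analysis concerns what can be recovered without such side information) runs a first-order SARX system y_t = a_σ(t) y_{t-1} + b_σ(t) u_{t-1} + e_t and recovers each submodel by per-mode least squares. All parameter values are made up.

    import numpy as np

    rng = np.random.default_rng(0)

    # Two first-order ARX submodels (modes); theta[k] = (a_k, b_k).
    theta = np.array([[0.5, 1.0],
                      [-0.3, 2.0]])

    T = 200
    u = rng.standard_normal(T)                 # exogenous input
    sigma = rng.integers(0, 2, size=T)         # switching signal (known here)
    y = np.zeros(T)
    for t in range(1, T):
        a, b = theta[sigma[t]]
        y[t] = a * y[t - 1] + b * u[t - 1] + 0.01 * rng.standard_normal()

    # With the switching signal known, each submodel is an ordinary ARX
    # model and its parameters are recovered by per-mode least squares.
    for k in range(2):
        idx = np.where(sigma[1:] == k)[0] + 1
        regressors = np.column_stack([y[idx - 1], u[idx - 1]])
        est, *_ = np.linalg.lstsq(regressors, y[idx], rcond=None)
        print(f"mode {k}: true {theta[k]}, estimated {np.round(est, 2)}")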

18.
Commitments are being used to specify interactions among autonomous agents in multiagent systems. Various formalizations of commitments have shown their strength in representing and reasoning about multiagent interactions. These formalizations mostly study commitment lifecycles, emphasizing fulfillment of a single commitment. However, when multiple commitments coexist, fulfillment of one commitment may affect the lifecycle of other commitments. Since agents generally participate in more than one commitment at a time, it is important for an agent to determine whether it can honor its commitments. These commitments may be the existing commitments of the agent as well as any prospective commitments the agent plans to participate in. To address this, we develop the concept of commitment feasibility, i.e., whether it is possible for an agent to fulfill a set of commitments all together. To achieve this, we generalize the fulfillment of a single commitment to the feasibility of a set of commitments. We then develop a solid method to determine commitment feasibility. Our method is based on transforming feasibility into a constraint satisfaction problem and using constraint satisfaction techniques to reach a conclusion. We show the soundness and completeness of our method and illustrate its applicability on realistic cases.
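As a rough, hypothetical rendering of the feasibility idea (the paper's transformation works over commitment lifecycles and is considerably richer), each commitment can be encoded as a variable ranging over the time slots in which the agent could discharge it, with deadline and one-commitment-per-slot constraints, and feasibility decided by backtracking search:

    def feasible(commitments, horizon):
        """commitments: {name: deadline (last admissible slot)}.  The agent
        can discharge at most one commitment per time slot 0..horizon-1 and
        each commitment no later than its deadline.  Returns a schedule
        witnessing feasibility, or None if the set is infeasible."""
        names = sorted(commitments, key=lambda n: commitments[n])  # earliest deadline first
        def assign(i, used, schedule):
            if i == len(names):
                return schedule
            name = names[i]
            for slot in range(min(commitments[name] + 1, horizon)):
                if slot not in used:
                    result = assign(i + 1, used | {slot}, {**schedule, name: slot})
                    if result is not None:
                        return result
            return None               # backtrack
        return assign(0, frozenset(), {})

    print(feasible({"deliver": 1, "pay": 1, "notify": 2}, horizon=3))
    # {'deliver': 0, 'pay': 1, 'notify': 2} -- the three are feasible together
    print(feasible({"deliver": 0, "pay": 0}, horizon=3))
    # None -- both commitments cannot be discharged by slot 0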

19.
20.
Although many techniques for merging conflicting propositional knowledge bases have already been proposed, most existing work is based on the idea that inconsistency results from the presence of incorrect pieces of information, which should be identified and removed. In contrast, we take the view in this paper that conflicts are often caused by statements that are inaccurate rather than completely false, suggesting that consistency be restored by interpreting certain statements in a flexible way rather than ignoring them completely. In accordance with this view, we propose a novel approach to merging which exploits extra-logical background information about the semantic relatedness of atomic propositions. Several merging operators are presented, based on different formalizations of this background knowledge, ranging from purely qualitative approaches, related to possibilistic logic, to quantitative approaches with a probabilistic flavor. Both syntactic and semantic characterizations are provided for each merging operator, and the computational complexity is analyzed.
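A small, hypothetical sketch of the flexible-interpretation idea: rather than discarding an inaccurate statement, allow its atoms to be reinterpreted as semantically related atoms at a cost given by a relatedness table, and choose the cheapest reinterpretation that makes the merged base consistent. This is only a toy with a brute-force consistency check over literals, not one of the paper's operators.

    from itertools import product

    # Statements are literals (atom, polarity).  'related' gives the cost of
    # reading one atom as another; keeping a literal as stated costs 0.
    related = {("meeting_mon", "meeting_tue"): 1,
               ("meeting_tue", "meeting_mon"): 1}

    def reinterpretations(literal):
        atom, pol = literal
        yield (atom, pol), 0
        for (a, b), cost in related.items():
            if a == atom:
                yield (b, pol), cost

    def consistent(literals):
        """A set of literals is consistent iff no atom gets both polarities."""
        by_atom = {}
        for atom, pol in literals:
            if by_atom.setdefault(atom, pol) != pol:
                return False
        return True

    def merge(statements):
        """Return the cheapest consistent flexible reading of all statements."""
        options = [list(reinterpretations(lit)) for lit in statements]
        best, best_cost = None, None
        for choice in product(*options):
            literals = [lit for lit, _ in choice]
            cost = sum(c for _, c in choice)
            if consistent(literals) and (best is None or cost < best_cost):
                best, best_cost = literals, cost
        return best, best_cost

    # Source 1: meeting on Monday.  Source 2: no meeting on Monday, but a
    # meeting on Tuesday.
    statements = [("meeting_mon", True), ("meeting_mon", False), ("meeting_tue", True)]
    print(merge(statements))
    # ([('meeting_tue', True), ('meeting_mon', False), ('meeting_tue', True)], 1):
    # source 1 is read flexibly as "meeting on Tuesday" instead of being dropped.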
