Similar Documents
 20 similar documents found (search time: 256 ms)
1.
Logic programming provides a model for rule-based reasoning in expert systems. The advantage of this formal model is that it makes available many results from the semantics and proof theory of first-order predicate logic. A disadvantage is that in expert systems one often wants to use, instead of the usual two truth values, an entire continuum of “uncertainties” in between. That is, instead of the usual “qualitative” deduction, a form of “quantitative” deduction is required. We present an approach to generalizing the Tarskian semantics of Horn clause rules to justify a form of quantitative deduction. Each clause receives a numerical attenuation factor. Herbrand interpretations, which are subsets of the Herbrand base, are generalized to subsets which are fuzzy in the sense of Zadeh. We show that, as a result, the fixpoint method in the semantics of Horn clause rules can be developed in much the same way for the quantitative case. As for proof theory, the interesting phenomenon is that a proof should be viewed as a two-person game. The value of the game turns out to be the truth value of the atomic formula to be proved, evaluated in the minimal fixpoint of the rule set. The analog of the PROLOG interpreter for quantitative deduction becomes a search of the game tree (= proof tree) using the alpha-beta heuristic well known in game theory.
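As a rough illustration of the quantitative deduction described above, the following sketch (not the paper's implementation) computes the minimal fixpoint of a small set of ground, attenuated Horn rules. It assumes, as one common choice, that a rule contributes its attenuation factor times the minimum of its body values and that an atom takes the maximum over its rules; the rules and factors below are hypothetical.

```python
# Minimal sketch: bottom-up fixpoint evaluation of ground Horn rules
# with numerical attenuation factors (illustrative assumption: a rule
# contributes f * min(body values); an atom takes the max over rules).

def quantitative_fixpoint(rules, atoms, eps=1e-9, max_iters=1000):
    """rules: list of (head, body_atoms, factor); facts have an empty body."""
    value = {a: 0.0 for a in atoms}
    for _ in range(max_iters):
        changed = False
        for head, body, f in rules:
            contrib = f * min((value[b] for b in body), default=1.0)
            if contrib > value[head] + eps:
                value[head] = contrib
                changed = True
        if not changed:
            break
    return value

rules = [
    ("wet_grass", [], 0.9),                       # attenuated fact
    ("speeding", [], 0.6),                        # attenuated fact
    ("slippery", ["wet_grass"], 0.8),
    ("accident", ["slippery", "speeding"], 0.7),
]
atoms = {"wet_grass", "speeding", "slippery", "accident"}
print(quantitative_fixpoint(rules, atoms))
# accident settles at 0.7 * min(0.72, 0.6) = 0.42 in the minimal fixpoint
```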

2.
To address the difficulties of traditional inductive and deductive learning in intelligent communication networks, an inverse-deduction learning algorithm is proposed. Starting from the learning task and the logic program, the rules of the inverse-deduction learning algorithm, Horn state functors, and stimulus information are studied, and a learning and reasoning system for intelligent communication networks is implemented with the algorithm. Test results show that the algorithm induces logic programs efficiently.

3.
Several artificial intelligence architectures and systems based on “deep” models of a domain have been proposed, in particular for the diagnostic task. These systems have several advantages over traditional knowledge-based systems, but their main limitation is computational complexity. One way to face this problem is to rely on a knowledge compilation phase, which produces knowledge that can be used more effectively than the original model. We show how a specific knowledge compilation approach can focus reasoning in abductive diagnosis and, in particular, can improve the performance of AID, an abductive diagnosis system. The approach aims at focusing the overall diagnostic cycle in two interdependent ways: avoiding the generation of candidate solutions that would be discarded a posteriori, and integrating the generation of candidate solutions with discrimination among different candidates. Knowledge compilation is used off-line to produce operational (i.e., easily evaluated) conditions that embed the abductive reasoning strategy and are used in addition to the original model, with the goal of ruling out parts of the search space or focusing on parts of it. The conditions make it possible to solve most cases in less time while computing the same solutions, yet they preserve the full power of the model-based system for dealing with multiple faults and explaining the solutions. Experimental results showing the advantages of the approach are presented.

4.
5.
余泉  李承乾  申宇铭  王驹 《软件学报》2015,26(8):1937-1945
Abductive reasoning is an important form of reasoning alongside induction and deduction, with wide applications in artificial intelligence and other fields. Informally, abduction is the process of inferring causes from observations (effects). Departing from previous approaches, this paper uses prime implicants and prime implicates to prove that finding the minimal explanations of an abduction problem in propositional logic and in the propositional modal logic S5 can be reduced to finding the minimal hitting sets of a corresponding collection of sets, which yields a new method for solving abduction problems.
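The combinatorial core of the reduction described above is the minimal-hitting-set problem. The sketch below is a brute-force illustration of that problem only, not the paper's method; the collection of sets is hypothetical.

```python
# Minimal sketch: enumerate all minimal hitting sets of a collection of sets.
from itertools import combinations

def minimal_hitting_sets(collection):
    universe = sorted(set().union(*collection))
    hitting = []
    for size in range(len(universe) + 1):
        for candidate in combinations(universe, size):
            cand = set(candidate)
            if all(cand & s for s in collection):
                # keep only candidates that do not contain a smaller hitting set
                if not any(h <= cand for h in hitting):
                    hitting.append(cand)
    return hitting

print(minimal_hitting_sets([{"a", "b"}, {"b", "c"}, {"a", "c"}]))
# -> [{'a', 'b'}, {'a', 'c'}, {'b', 'c'}]
```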

6.
An expert system applies the deduction rules in its knowledge base to a set of initial data to reach a conclusion. When the initial data are insufficient, the expert system may ask the user for additional information. This paper analyzes the effectiveness and efficiency of question-asking strategies in expert systems with Horn clause knowledge bases. An effective strategy reaches a conclusion after asking as few questions as possible. An efficient strategy can be computed quickly. We prove that effective strategies are, unfortunately, not efficient. However, we present a somewhat less effective but very efficient strategy. It employs an algorithm which simultaneously performs deduction and question selection in log-linear time. Supported in part by NSF grant DMS-8513970.
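The following sketch illustrates only the general idea of interleaving deduction with question asking over propositional Horn clauses; it is not the paper's log-linear algorithm, and the rule base, "askable" atoms, and simulated answers are hypothetical.

```python
# Minimal sketch: backward chaining that asks about "askable" atoms it
# cannot derive, so deduction and question selection are interleaved.

def prove(goal, rules, askables, answers, asked):
    """rules: dict head -> list of bodies (each body is a list of atoms)."""
    for body in rules.get(goal, []):
        if all(prove(b, rules, askables, answers, asked) for b in body):
            return True
    if goal in askables and goal not in asked:
        asked.add(goal)
        return answers.get(goal, False)   # stands in for asking the user
    return False

rules = {"flu": [["fever", "cough"]], "fever": [["high_temp"]]}
askables = {"high_temp", "cough"}
answers = {"high_temp": True, "cough": True}   # simulated user replies
print(prove("flu", rules, askables, answers, set()))   # -> True
```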

7.
The problem of deciding injectivity of functions is addressed. The functions under consideration are compositions of more basic functions for which information about injectivity properties is available. We present an algorithm which will often be able to prove that such a composite function is injective. This algorithm constructs a set of propositional Horn clause axioms from the function specification and the available information about the basic functions. The existence of a proof of injectivity is then reduced to the problem of propositional Horn clause deduction. Dowling and Gallier have designed several very fast algorithms for this problem, the efficiency of which our algorithm inherits. The proof of correctness of the algorithm amounts to showing soundness and completeness of the generated Horn clause axioms.
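For the propositional Horn clause deduction step, a counter-based forward-chaining procedure loosely in the spirit of Dowling and Gallier's linear-time algorithm can be sketched as follows; the clause set is a hypothetical stand-in for the injectivity axioms such an algorithm would generate.

```python
# Minimal sketch: forward chaining over propositional Horn clauses using
# per-clause counters of still-unproved body atoms.
from collections import deque

def horn_deduce(clauses, goal):
    """clauses: list of (head, body); body is a list of atoms, facts have []."""
    remaining = [len(body) for _, body in clauses]
    uses = {}                               # atom -> clauses whose body mentions it
    for i, (_, body) in enumerate(clauses):
        for atom in body:
            uses.setdefault(atom, []).append(i)
    derived = set()
    queue = deque(h for (h, body), r in zip(clauses, remaining) if r == 0)
    while queue:
        atom = queue.popleft()
        if atom in derived:
            continue
        derived.add(atom)
        for i in uses.get(atom, []):
            remaining[i] -= 1
            if remaining[i] == 0:
                queue.append(clauses[i][0])
    return goal in derived

clauses = [
    ("inj_f", []),                          # f known to be injective
    ("inj_g", []),                          # g known to be injective
    ("inj_comp", ["inj_f", "inj_g"]),       # g o f injective if both are
]
print(horn_deduce(clauses, "inj_comp"))     # -> True
```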

8.
This paper presents a generalization of Shapiro-style algorithmic debugging for generalized Horn clause intuitionistic logic. This logic offers hypothetical reasoning, and negation is defined not by failure but by inconsistency. We extend Shapiro's notions of intended interpretation, symptoms, and errors, and give formal results paralleling those known for definite clauses. We also show how a corresponding diagnosis module for RISC, a logic programming system for generalized Horn clause intuitionistic logic, can be defined by meta-interpretation. In contrast to Shapiro's PROLOG modules, ours works independently of the specific computation rule, which in RISC may be specified by the user.

9.
This article introduces abductive case‐based reasoning (CBR) and attempts to show that abductive CBR and deductive CBR can be integrated in clinical process and problem solving. It then provides a unified formalization for the integration of abduction, abductive CBR, deduction, and deductive CBR. The article also investigates abductive case retrieval and deductive case retrieval using similarity relations, fuzzy similarity relations, and similarity metrics. The proposed approach demonstrates that the integration of deductive CBR and abductive CBR is of practical significance in problem solving such as system diagnosis and analysis, and will facilitate research on abductive CBR and deductive CBR. © 2005 Wiley Periodicals, Inc. Int J Int Syst 20: 957–983, 2005.
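A minimal sketch of similarity-based case retrieval of the kind the article formalizes is shown below; the attributes, weights, case base, and matching metric are hypothetical and illustrate only one simple choice of similarity measure.

```python
# Minimal sketch: retrieve the most similar stored case under a weighted,
# attribute-matching similarity metric (hypothetical example data).

def similarity(case_features, query, weights):
    score = sum(w for attr, w in weights.items()
                if case_features.get(attr) == query.get(attr))
    return score / sum(weights.values())

def retrieve(case_base, query, weights):
    return max(case_base, key=lambda c: similarity(c["features"], query, weights))

case_base = [
    {"diagnosis": "flu",     "features": {"fever": True, "rash": False}},
    {"diagnosis": "measles", "features": {"fever": True, "rash": True}},
]
weights = {"fever": 1.0, "rash": 2.0}
print(retrieve(case_base, {"fever": True, "rash": True}, weights)["diagnosis"])
# -> "measles"
```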

10.
A predicate/transition net model for a subset of Horn clause logic programs is presented. The syntax, transformation procedure, semantics, and deduction process for the net model are discussed. A possible parallel implementation for the net model, based on the concepts of communicating processes and relations, is described. The proposed net model offers a syntactical variant of Horn clause logic and differs from other existing schemes for logic programs in two respects: the representation formalism and the deduction method. The net model provides an approach to separating logic from control and to improving execution efficiency through parallel processing of logic programs. The abstract nature of the net model also lends itself to different implementation strategies.

11.
This paper extends the logical inference of Horn clauses in Petri net models to cover a large class of non-Horn clauses. Based on four-valued logic and the conflict transition concept, we show how the Petri net model for this class of non-Horn clauses can be constructed. The clause inference is solved by the T-invariant method or the fixpoint of markings. Both forward and backward inference can be used in our model. It is further shown that these techniques are efficient for the common classes of monotonic reasoning. © 1998 John Wiley & Sons, Inc.

12.
In this paper we provide an overview of a number of fundamental reasoning formalisms in artificial intelligence which can and have been used in modelling legal reasoning. We describe deduction, induction and analogical reasoning formalisms, and show how they can be used separately to model legal reasoning. We argue that these formalisms can be used together to model legal reasoning more accurately, and describe a number of attempts to integrate the approaches.

13.
Formal logical tools are able to provide some amount of reasoning support for information analysis, but are unable to represent uncertainty. Bayesian network tools represent probabilistic and causal information, but in the worst case scale as poorly as some formal logical systems and require specialized expertise to use effectively. We describe a framework for systems that incorporate the advantages of both Bayesian and logical systems. We define a formalism for the conversion of automatically generated natural deduction proof trees into Bayesian networks. We then demonstrate that the merging of such networks with domain-specific causal models forms a consistent Bayesian network with correct values for the formulas derived in the proof. In particular, we show that hard evidential updates, in which the premises of a proof are found to be true, force the conclusions of the proof to be true with probability one, regardless of any dependencies and prior probability values assumed for the causal model. We provide several examples that demonstrate the generality of the natural deduction system by using inference schemes not supportable directly in Horn clause logic. We compare our approach to others, including some that use non-standard logics.
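The hard-evidence property claimed above can be illustrated on a tiny hand-built fragment (hypothetical numbers, not the authors' framework): a proof step "A, B entail C" is encoded as a node C that is a deterministic AND of its premises, so conditioning on the premises being true drives the conclusion to probability one regardless of the priors.

```python
# Minimal sketch: P(conclusion | premises observed true) for a node that is a
# deterministic AND of its premise parents, computed by brute-force enumeration.
from itertools import product

def p_conclusion_given_premises(p_a, p_b):
    num = den = 0.0
    for a, b in product([0, 1], repeat=2):
        c = a and b                                  # deterministic CPT for C
        joint = (p_a if a else 1 - p_a) * (p_b if b else 1 - p_b)
        if a == 1 and b == 1:                        # hard evidence: A = B = true
            den += joint
            num += joint if c == 1 else 0.0
    return num / den

print(p_conclusion_given_premises(0.3, 0.05))        # -> 1.0, whatever the priors
```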

14.
The formalism of nonmonotonic reasoning has been integrated into logic programming to define semantics for logic programs with negation. Because a Petri net provides a uniform model for both the logic of knowledge and the control of inference, the class of high-level Petri nets called predicate/transition nets (PrT-nets) has been employed to study production-rule-based expert systems and Horn clause logic programs. We show that a PrT-net can implement the nonmonotonicity associated with a logic program with negation as well as the monotonicity of a Horn clause logic program. In particular, we define a semantics for a normal logic program and implement it with a PrT-net. We demonstrate that in the presence of inconsistency in a normal logic program, the semantics still works well by deducing meaningful answers. The variations and potential applications of the PrT-net are also addressed.

15.
To discover the user access patterns implicit in Web usage records, and based on an in-depth analysis of the abstract relations between events in a log ontology, ALC propagation rules suited to reasoning about the part-whole relations between atomic and composite events are proposed to extend the existing inference patterns, and on this basis an ILP method for mining log ontologies is presented. The method combines the complementary strengths of description logics and Horn rules in knowledge representation and reasoning, builds its knowledge base with the AL-log hybrid system, and learns user access patterns from the log ontology using constrained SLD refutation together with the extended ALC propagation rules, serving site business intelligence and personalization. Finally, an example validating the method is given, and the experimental results show its feasibility and effectiveness.

16.
A Fuzzy Proof Theory
Based on first-order predicate logic, in this paper we present a new approach to generalizing the syntax of ordinary Horn clause rules in order to establish a fuzzy proof theory. First of all, each Horn clause rule is associated with a numerical implication strength f; we thereby obtain f-Horn clause rules. Secondly, Herbrand interpretations can be generalized to fuzzy subsets of the Herbrand base in the sense of Zadeh. As a result, the proof theory for Horn clause rules can be developed in much the same way for f-Horn clause rules.

17.
18.
19.
When Horn clause theories are combined with integrity constraints to produce potentially refutable theories, Seki and Takeuchi have shown how crucial literals can be used to discriminate two mutually incompatible theories. A literal is crucial with respect to two theories if only one of the two theories supports the derivation of that literal. In other words, actually determining the truth value of the crucial literal will refute one of the two incompatible theories. This paper presents an integration of the idea of crucial literal with Theorist, a logic-based system for hypothetical reasoning. Theorist is a goal-directed nonmonotonic reasoning system that classifies logical formulas as possible hypotheses, facts, and observations. As Theorist uses full clausal logic, it does not require Seki and Takeuchi's notion of integrity constraint to define refutable theories. In attempting to deduce observation sentences, Theorist identifies instances of possible hypotheses as nomological explanations: consistent sets of hypothesis instances required to deduce observations. As multiple and mutually incompatible explanations are possible, the notion of crucial literal provides the basis for proposing experiments that distinguish competing explanations. We attempt to make three contributions. First, we adapt Seki and Takeuchi's method for Theorist. To do so, we incrementally use crucial literals as experiments, whose results are used to reduce the total number of explanations generated for a given set of observations. Next, we specify an extension which incrementally constructs a table of all possible crucial literals for any pair of theories. This extension is more efficient and provides the user with greater opportunity to conduct experiments to eliminate falsifiable theories. A prototype is implemented in CProlog, and several examples of diagnosis are considered to show its empirical efficiency. Finally, we point out that assumption-based truth maintenance systems (ATMS), as used in the multiple fault diagnosis system of de Kleer and Williams, are interesting special cases of this more general method of distinguishing explanatory theories.
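A minimal sketch of the crucial-literal idea alone follows (Theorist itself is a full hypothetical-reasoning system and is not reproduced here): compute the Horn closure of two hypothetical candidate theories and take the literals derivable from exactly one of them as candidate experiments. The diagnosis example is invented for illustration.

```python
# Minimal sketch: literals derivable from exactly one of two candidate
# theories are crucial and can serve as discriminating experiments.

def closure(rules):
    """Naive Horn closure; rules are (head, body) pairs, facts have empty body."""
    known = set()
    changed = True
    while changed:
        changed = False
        for head, body in rules:
            if head not in known and all(b in known for b in body):
                known.add(head)
                changed = True
    return known

theory1 = [("battery_dead", []), ("no_lights", ["battery_dead"]),
           ("no_start", ["battery_dead"])]
theory2 = [("starter_broken", []), ("no_start", ["starter_broken"])]

crucial = closure(theory1) ^ closure(theory2)          # symmetric difference
print(crucial - {"battery_dead", "starter_broken"})    # -> {'no_lights'}
```

Here both theories explain the observation "no_start", but only the first predicts "no_lights", so checking the lights discriminates them.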

20.
Solving Abduction Problems Based on Circumscription
陈保平  孙吉贵 《软件学报》1997,8(4):316-320
Abduction is an important research direction in artificial intelligence with wide applications in many fields, but in many situations computing abductive explanations is very difficult. This paper presents a method for computing abductive explanations based on circumscription: for theories satisfying the completeness axioms and the positive-cause assumption, the consistent explanations in the circumscription can be shown to be exactly the abductive explanations, and a concrete solution algorithm is given for Horn clause sets.
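As an illustration of the problem being solved (not of the paper's circumscription-based construction), the sketch below enumerates abductive explanations for a propositional Horn clause set by backward chaining to a hypothetical set of abducible atoms.

```python
# Minimal sketch: collect sets of abducible atoms whose assumption lets the
# observation be derived from a propositional Horn clause set.

def explanations(goal, rules, abducibles):
    if goal in abducibles:
        yield frozenset([goal])
    for head, body in rules:
        if head == goal:
            partial = [frozenset()]
            for atom in body:
                partial = [e | e2 for e in partial
                           for e2 in explanations(atom, rules, abducibles)]
            yield from partial

rules = [("wet_grass", ["rain"]), ("wet_grass", ["sprinkler_on"]),
         ("rain", ["clouds"])]
abducibles = {"clouds", "sprinkler_on"}
print(set(explanations("wet_grass", rules, abducibles)))
# -> {frozenset({'clouds'}), frozenset({'sprinkler_on'})}
```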
