Similar Documents
Found 20 similar documents (search time: 0 ms)
1.
Over the last decade, substantial advances have been made in the use of causal pathophysiological knowledge in artificial-intelligence-based programs for medical diagnosis. Various forms of causal representation have been used, including probabilistic models, quantitative models, qualitative models, and models that describe causal relations at multiple levels of detail. This paper briefly analyses these methods using three representative systems. Outstanding problems and possible directions for further exploitation of causal reasoning in medical decision-support systems are also discussed.

2.
3.
Up to now, many methods have been proposed for knowledge representation and reasoning in causal networks, but few of them address the coactions of nodes. In practice, ignoring these coactions may reduce the accuracy of reasoning and even lead to incorrect conclusions. In this paper, based on multilayer causal networks, definitions of coaction nodes are given to construct a new causal network, called a Coaction Causal Network, which serves to build a neural-network model for diagnosis followed by fuzzy reasoning. Activation rules are then given, and neural computing methods are used to carry out the diagnostic reasoning. These methods are proved in theory, and a method for computing the number of solutions of the diagnostic reasoning is given. Finally, experiments and conclusions are presented.

4.
This article describes LIBRA/Dx, a competition-based parallel activation model for diagnostic reasoning. Within a causal network, the model uses a neurally inspired processing paradigm to generate the most plausible explanation for a set of observed manifestations. The model was built using LIBRA, a domain-independent parallel activation network generator that can be used to build network models with processing paradigms tailored to the specifics of an application domain. The underlying theory postulates that by simultaneously satisfying multiple constraints that may exist locally among domain concepts in a causal network (e.g., among disorders, syndromes, manifestations, etc.), it is possible to construct a plausible global explanation for a set of observed signs and symptoms. The proposed processing paradigm, which uses an associative network of concepts to represent domain knowledge, lends itself to the kind of interactive processing that is necessary to capture the generative capacity of human diagnostic ability in novel situations. LIBRA/Dx offers a new approach to modeling a higher cognitive process, diagnostic reasoning, specifically in terms of the time-course of processing and the nature of knowledge representation. It further contributes to our current understanding of phenomena of human cognition that have eluded successful explication in conventional computational formalisms.

5.
This paper proposes a fuzzy dependence index for constructing probabilistic models that account for dependence relations in reasoning problems. Considering the dependency of events is important when constructing a joint probability distribution, and we consider that some vagueness is included in this dependency: because the causal relationships among events are uncertain, it is difficult to express dependency as a definite value. In this paper, we classify the dependence relations and apply fuzzy probability to the calculation of the dependence index. The fuzzy dependence index is then defined to capture dependency together with its fuzziness. Using the fuzzy dependence index, we calculate the joint probability of multiple events to construct the probabilistic model. This work was presented in part at the 13th International Symposium on Artificial Life and Robotics, Oita, Japan, January 31–February 2, 2008.

6.
The complexity of technical systems requires increasingly advanced fault diagnosis methods to ensure safety and reliability during operation. Particularly in domains where maintenance constitutes an extensive portion of the entire operation cost, efficient and effective failure identification holds the potential to provide large economic value. Abduction offers an intuitive concept for diagnostic reasoning relying on the notion of logical entailment. Nevertheless, abductive reasoning is an intractable problem, and computing solutions for instances of reasonable size and complexity continues to pose a challenge. In this paper, we investigate algorithm selection as a mechanism to predict the "best" performing technique for a specific abduction scenario within the framework of model-based diagnosis. Based on a set of structural attributes extracted from the system models, our meta-approach trains a machine learning classifier that forecasts the most runtime-efficient abduction technique given a new diagnosis problem. To assess the predictor's selection capabilities and the suitability of the meta-approach in general, we conducted an empirical analysis featuring seven abductive reasoning approaches. The results obtained indicate that applying algorithm selection is competitive in comparison to always choosing a single abductive reasoning method.
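As a rough illustration of such a meta-approach, the sketch below picks the historically fastest solver for the training instance most similar to a new problem. The structural attributes, solver names, and runtimes are invented for illustration; the paper's feature set and learned classifier are of course richer than this nearest-neighbour stand-in.

```python
# Minimal sketch of algorithm selection for abductive diagnosis.
# Feature vectors, solver names, and runtimes are invented; a real
# meta-approach extracts attributes from actual system models.

def nearest_neighbour_select(training, query):
    """Pick the solver that was fastest on the most similar training instance."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(training, key=lambda inst: dist(inst["features"], query))
    runtimes = best["runtimes"]
    return min(runtimes, key=runtimes.get)

# Each training instance: structural attributes of a diagnosis problem
# (e.g. number of components, causal-edge density) plus measured runtimes.
training = [
    {"features": (10, 0.2), "runtimes": {"atms": 0.4, "consequence": 1.9}},
    {"features": (200, 0.8), "runtimes": {"atms": 12.0, "consequence": 3.1}},
]

# A new problem close to the large, dense training instance:
print(nearest_neighbour_select(training, (180, 0.7)))  # "consequence"
```

A learned classifier would replace the nearest-neighbour lookup, but the interface (structural features in, predicted best solver out) stays the same.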

7.
8.
Some of the most popular approaches to model-based diagnosis reason over a model of the behaviour of the system to be diagnosed, considering a set of observations about the system and explaining them in terms of a set of initial causes. This process has been widely modeled via logical formalisms that essentially capture declarative aspects. In this paper, a new approach is proposed in which the diagnostic process is captured within a framework based on the formalism of Petri nets. We introduce a particular net model, called the Behavioral Petri Net (BPN), and show how the diagnostic process can be formalized in terms of reachability in a BPN and implemented by exploiting classical Petri net analysis techniques such as reachability-graph analysis and P-invariant computation. Advantages of the proposed method, such as suitability for parallel processing and the exploitation of linear-algebra techniques, are then pointed out.
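The reachability analysis that such an approach relies on can be illustrated on an ordinary place/transition net; the toy below (not the BPN model itself, whose definition is in the paper) enumerates all reachable markings by breadth-first search.

```python
# Reachability-graph construction for a small place/transition net.
# The net and its places are a toy fault -> alarm example, invented here.

from collections import deque

def fire(marking, pre, post):
    """Fire a transition if enabled; return the successor marking or None."""
    if all(marking[p] >= n for p, n in pre.items()):
        m = dict(marking)
        for p, n in pre.items():
            m[p] -= n
        for p, n in post.items():
            m[p] = m.get(p, 0) + n
        return m
    return None

def reachable(initial, transitions):
    """Breadth-first exploration of all markings reachable from `initial`."""
    seen = {tuple(sorted(initial.items()))}
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        for pre, post in transitions:
            nxt = fire(m, pre, post)
            if nxt is not None:
                key = tuple(sorted(nxt.items()))
                if key not in seen:
                    seen.add(key)
                    queue.append(nxt)
    return seen

# One transition: consume a "fault" token, produce an "alarm" token.
transitions = [({"fault": 1}, {"alarm": 1})]
marks = reachable({"fault": 1, "alarm": 0}, transitions)
print(len(marks))  # 2 markings: before and after firing the transition
```

Deciding whether a marking that explains the observations is reachable is exactly the question the BPN formalization reduces diagnosis to.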

9.
This paper reports on a formative evaluation of the diagnostic capabilities of the Heart Failure Program, which uses a probability network and a heuristic hypothesis generator. Using 242 cardiac cases collected from discharge summaries at a tertiary care hospital, we compared the diagnoses of the program to diagnoses collected from cardiologists using the same information as was available to the program. With some adjustments to the knowledge base, the Heart Failure Program produces appropriate diagnoses about 90% of the time on this training set. The main reasons for the inappropriate diagnoses of the remaining 10% include inadequate reasoning with temporal relations between cause and effect, severity relations, and independence of acute and chronic diseases.

10.
Requirements Engineering - In early-phase requirements engineering, modeling stakeholder goals and intentions helps stakeholders understand the problem context and evaluate tradeoffs, by exploring...

11.
Xiaohai  Dominik  Bernhard   《Neurocomputing》2008,71(7-9):1248-1256
We propose a method to quantify the complexity of conditional probability measures by a Hilbert space seminorm of the logarithm of its density. The concept of reproducing kernel Hilbert spaces (RKHSs) is a flexible tool to define such a seminorm by choosing an appropriate kernel. We present several examples with artificial data sets where our kernel-based complexity measure is consistent with our intuitive understanding of complexity of densities.

The intention behind the complexity measure is to provide a new approach to inferring causal directions. The idea is that the factorization of the joint probability measure P(effect,cause) into P(effect|cause)P(cause) leads typically to “simpler” and “smoother” terms than the factorization into P(cause|effect)P(effect). Since the conventional constraint-based approach of causal discovery is not able to determine the causal direction between only two variables, our inference principle can in particular be useful when combined with other existing methods.

We provide several simple examples with real-world data where the true causal directions indeed lead to simpler (conditional) densities.
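A crude, hedged illustration of the inference principle, not the RKHS seminorm of the paper: bin one variable and measure the spread of the other within each bin as a stand-in for the complexity of the conditional. For y = x² + noise, the true direction yields the simpler conditional.

```python
# Toy stand-in for a conditional-complexity comparison; the paper's
# kernel-based seminorm is replaced here by within-bin variance.

import random, statistics

random.seed(0)

def conditional_spread(xs, ys, bins=10):
    """Mean within-bin variance of ys after binning on xs:
    a crude proxy for the 'complexity' of P(y | x)."""
    lo, hi = min(xs), max(xs)
    width = (hi - lo) / bins or 1.0
    groups = {}
    for x, y in zip(xs, ys):
        groups.setdefault(min(int((x - lo) / width), bins - 1), []).append(y)
    return statistics.mean(
        statistics.pvariance(g) for g in groups.values() if len(g) > 1
    )

# cause x uniform, effect y = x**2 + small noise
x = [random.uniform(-1, 1) for _ in range(2000)]
y = [xi ** 2 + random.gauss(0, 0.05) for xi in x]

forward = conditional_spread(x, y)   # proxy complexity of P(y | x)
backward = conditional_spread(y, x)  # proxy complexity of P(x | y)
print(forward < backward)  # the causal direction gives the simpler conditional
```

The backward conditional is "complex" because each y-bin mixes the two branches x ≈ ±√y, which is exactly the kind of asymmetry the complexity measure is meant to detect.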


12.
13.
We introduce some ideas from the theory of approximate reasoning and from possibility theory based on fuzzy sets. We show how these ideas can form the basis for building classification models that enable one to use imprecise information in their construction.

14.
In this paper, the concept of causal relevance (CR) is introduced in the context of the fuzzy inductive reasoning (FIR) modelling and simulation methodology. The idea behind CR is to quantify how much influence each system variable has, from the spatial and temporal points of view, on the prediction of the output. This paper introduces the FIR inference engine and describes how it can be improved by means of the CR concept, helping to reduce uncertainty during the forecasting stage. The FIR inference engine is based on the k-nearest-neighbour classification rule, commonly used in the field of pattern recognition, and uses a Euclidean distance measure to compute the distance between neighbours. In this paper, a weighted Euclidean distance measure is proposed that is able to find better-quality neighbours by using the CR concept. Applications from different fields are studied in the light of the prediction process, and a comparison is performed between the accuracy of the predictions obtained with the classical inference engine and with the CR option. The results obtained from this research show that FIR predictions are more accurate and precise when the CR option is used, especially for systems where classical FIR forecasting performs rather poorly.
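A minimal sketch of a weighted Euclidean k-nearest-neighbour rule, with the weight vector standing in for the causal-relevance values; the data, weights, and labels below are invented for illustration, and FIR's qualitative encoding is omitted.

```python
# Weighted Euclidean k-NN: relevance weights scale each variable's
# contribution to the distance. Data and weights are invented.

import math
from collections import Counter

def weighted_knn(train, weights, query, k=3):
    """Majority vote among the k nearest neighbours under a weighted
    Euclidean distance; weights play the role of causal relevance."""
    def dist(a, b):
        return math.sqrt(sum(w * (x - y) ** 2 for w, x, y in zip(weights, a, b)))
    neighbours = sorted(train, key=lambda item: dist(item[0], query))[:k]
    return Counter(label for _, label in neighbours).most_common(1)[0][0]

# The second feature is pure noise; zero relevance weight suppresses it.
train = [((0.0, 9.0), "low"), ((0.1, 1.0), "low"),
         ((1.0, 5.0), "high"), ((0.9, 0.0), "high")]
print(weighted_knn(train, weights=(1.0, 0.0), query=(0.95, 9.0)))  # "high"
```

With uniform weights (1.0, 1.0) the noisy second feature dominates the distance and the same query is voted "low", which is the kind of degradation the CR weighting is meant to avoid.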

15.
SysML is a variant of UML for systems design. Several formalisations of SysML (and UML) are available. Our work is distinctive in two ways: a semantics for refinement and for a representative collection of elements from the UML4SysML profile (blocks, state machines, activities, and interactions) used in combination. We provide a means to analyse and refine design models specified using SysML. This facilitates the discovery of problems earlier in the system development lifecycle, reducing time, and costs of production. Here, we describe our semantics, which is defined using a state-rich process algebra and implemented in a tool for automatic generation of formal models. We also show how the semantics can be used for refinement-based analysis and development. Our case study is a leadership-election protocol, a critical component of an industrial application. Our major contribution is a framework for reasoning using refinement about systems specified by collections of SysML diagrams.

16.
Discusses an automatic feature recognizer that decomposes the total volume to be machined into volumetric features that satisfy stringent conditions for manufacturability, and correspond to operations typically performed in 3-axis machining centers. Unlike most of the previous research, the approach is based on general techniques for dealing with features with intersecting volumes. Feature interactions are represented explicitly in the recognizer's output, to facilitate spatial reasoning in subsequent planning stages. A generate-and-test strategy is used. OPS-5 production rules generate hints or clues for the existence of features, and post them on a blackboard. The clues are assessed, and those judged promising are processed to ensure that they correspond to actual features, and to gather information for process planning. Computational geometry techniques are used to produce the largest volumetric feature compatible with the available data. The feature's accessibility, and its interactions with others are analyzed. The validity tests ensure that the proposed features are accessible, do not intrude into the desired part, and satisfy other machinability conditions. The process continues until it produces a complete decomposition of the volume to be machined into fully-specified features.

17.
To jointly account for both instantaneous and time-lagged causal effects between latent variables in latent-variable-model-based causal discovery, this paper constructs a temporal latent-variable model based on dynamic Bayesian networks and proposes a corresponding causal discovery algorithm. Factor analysis is used to estimate the factor loading matrix of the measurement model, a structural vector autoregression model is applied to estimate the autoregressive matrices, and the non-Gaussianity of the data is exploited to learn, in turn, the instantaneous-effect matrix and the lagged-effect matrices between the latent variables, yielding the causal network structure of the temporal latent-variable model. Experimental results verify the effectiveness of the algorithm.

18.
This study proposes an optimization model for the optimal treatment of bacterial infections. Using an influence diagram as the knowledge and decision model, we can conduct two kinds of reasoning simultaneously: diagnostic reasoning and treatment planning. The inputs to the reasoning system are the conditional probability distributions of the network model, the costs of the candidate antibiotic treatments, the expected effects of the treatments, and extra constraints on belief propagation. Since the prevalence of the pathogens and infections is determined by many site-by-site factors, which conventional approaches to approximate reasoning do not handle well, we introduce fuzzy information. The outputs of the reasoning model are the likelihood of a bacterial infection, the most likely pathogen(s), the suggested optimal treatment, the gain in life expectancy for the patient associated with the optimal treatment, the probability of coverage associated with the antibiotic treatment, and a cost-effect analysis of the prescribed treatment.
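The treatment-planning half can be caricatured as expected-utility maximisation over candidate antibiotics. All probabilities, coverage values, and costs below are invented, and the fuzzy machinery of the paper is omitted; this only shows the shape of the decision.

```python
# Toy expected-utility choice over candidate antibiotic treatments.
# Pathogen posteriors, coverage probabilities, and costs are invented.

def best_treatment(p_pathogen, coverage, cost, benefit=10.0):
    """Choose the treatment maximising expected benefit minus cost.
    coverage[t][g] = probability treatment t covers pathogen g."""
    def expected_utility(t):
        p_cover = sum(p_pathogen[g] * coverage[t][g] for g in p_pathogen)
        return benefit * p_cover - cost[t]
    return max(coverage, key=expected_utility)

p_pathogen = {"e_coli": 0.7, "s_aureus": 0.3}          # posterior after diagnosis
coverage = {"narrow": {"e_coli": 0.9, "s_aureus": 0.1},
            "broad":  {"e_coli": 0.9, "s_aureus": 0.9}}
cost = {"narrow": 1.0, "broad": 4.0}
print(best_treatment(p_pathogen, coverage, cost))  # "narrow"
```

Lowering the cost of the broad-spectrum option (say to 2.0) flips the decision, which mirrors the cost-effect analysis the system reports alongside its recommendation.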

19.
《Artificial Intelligence》1987,32(2):245-257
Stochastic simulation is a method of computing probabilities by recording the fraction of time that events occur in a random series of scenarios generated from some causal model. This paper presents an efficient, concurrent method of conducting the simulation which guarantees that all generated scenarios will be consistent with the observed data. It is shown that the simulation can be performed by purely local computations, involving products of parameters given with the initial specification of the model. Thus, the method proposed renders stochastic simulation a powerful technique of coherent inferencing, especially suited for tasks involving complex, nondecomposable models where "ballpark" estimates of probabilities will suffice.
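A minimal sketch of the idea on a three-node model A → C ← B with C observed: each sampling step resamples one unobserved variable from a distribution proportional to a purely local product of model parameters, so every generated scenario is automatically consistent with the evidence. The parameter values are invented for illustration.

```python
# Gibbs-style stochastic simulation on a tiny causal model A -> C <- B
# with evidence C = 1. Each update uses only local parameter products.

import random

random.seed(1)

p_a, p_b = 0.3, 0.5                       # priors P(A=1), P(B=1)
p_c = {(0, 0): 0.1, (0, 1): 0.6,          # P(C=1 | A, B)
       (1, 0): 0.7, (1, 1): 0.99}

def simulate(samples=20000):
    a, b = 1, 1
    count_a = 0
    for _ in range(samples):
        # resample A given B and evidence C=1: weight P(A) * P(C=1 | A, B)
        w1 = p_a * p_c[(1, b)]
        w0 = (1 - p_a) * p_c[(0, b)]
        a = 1 if random.random() < w1 / (w1 + w0) else 0
        # resample B symmetrically
        w1 = p_b * p_c[(a, 1)]
        w0 = (1 - p_b) * p_c[(a, 0)]
        b = 1 if random.random() < w1 / (w1 + w0) else 0
        count_a += a
    return count_a / samples

estimate = simulate()
print(round(estimate, 2))
```

For these invented parameters the exact posterior P(A=1 | C=1) works out to 0.2535 / 0.4985 ≈ 0.51, and the simulated fraction should land close to it: a "ballpark" estimate obtained without ever summing over the full joint.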

20.
As software applications become highly interconnected in dynamically provisioned platforms, they form the so-called systems-of-systems. Therefore, a key issue that arises in such environments is whether specific requirements are violated, when these applications interact in new unforeseen ways as new resources or system components are dynamically provisioned. Such environments require the continuous use of frameworks for assessing compliance against specific mission critical system requirements. Such frameworks should be able to (a) handle large requirements models, (b) assess system compliance repeatedly and frequently using events from possibly high velocity and high frequency data streams, and (c) use models that can reflect the vagueness that inherently exists in big data event collection and in modeling dependencies between components of complex and dynamically re-configured systems. In this paper, we introduce a framework for run time reasoning over medium and large-scale fuzzy goal models, and we propose a process which allows for the parallel evaluation of such models. The approach has been evaluated for time and space performance on large goal models, exhibiting that in a simulation environment, the parallel reasoning process offers significant performance improvement over a sequential one.
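One way to picture the evaluation of a fuzzy goal model: satisfaction degrees in [0, 1] propagate through AND/OR decompositions via min/max, and independent subgoals can be evaluated concurrently. The goal graph, operators, and leaf values below are invented for illustration and are far simpler than the paper's models.

```python
# Fuzzy AND/OR goal-graph evaluation with concurrent subgoal evaluation.
# Goal names and satisfaction degrees are invented.

from concurrent.futures import ThreadPoolExecutor

goals = {
    "mission":    ("and", ["secure", "responsive"]),
    "secure":     ("or",  ["encrypt", "isolate"]),
    "responsive": ("and", ["cache", "scale"]),
}
leaves = {"encrypt": 0.9, "isolate": 0.4, "cache": 0.7, "scale": 0.8}

def satisfaction(goal):
    """Fuzzy satisfaction: min over AND-children, max over OR-children."""
    if goal in leaves:
        return leaves[goal]
    op, children = goals[goal]
    with ThreadPoolExecutor() as pool:     # evaluate independent subgoals in parallel
        values = list(pool.map(satisfaction, children))
    return min(values) if op == "and" else max(values)

print(satisfaction("mission"))  # min(max(0.9, 0.4), min(0.7, 0.8)) = 0.7
```

On a graph this small the thread pool is pure overhead; the point is only that sibling subtrees have no shared state, which is what makes the parallel evaluation process of the paper possible at scale.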

