Similar literature
20 similar documents retrieved.
1.
A major bottleneck in developing knowledge-based systems is the acquisition of knowledge. Machine learning is an area concerned with the automation of this process of knowledge acquisition. Neural networks generally represent their knowledge at the lower level, while knowledge-based systems use higher-level knowledge representations. The method we propose here provides a technique that automatically allows us to extract conjunctive rules from the lower-level representation used by neural networks. The strength of neural networks in dealing with noise has enabled us to produce correct rules in a noisy domain. Thus we propose a method that uses neural networks as the basis for the automation of knowledge acquisition and can be applied to noisy, real-world domains. © 1993 John Wiley & Sons, Inc.

2.
Machine learning is an area concerned with the automation of the process of knowledge acquisition. Neural networks generally represent their knowledge at the lower level, while knowledge-based systems use higher-level knowledge representations. The method we propose here provides a technique which automatically allows us to extract production rules from the lower-level representation used by a single-layered neural network trained by Hebb's rule. Even though a single-layered neural network cannot model complex, nonlinear domains, its strength in dealing with noise has enabled us to produce correct rules in a noisy domain.
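
As a rough illustration of the kind of extraction the abstract describes, the sketch below trains a single-layer network with Hebb's rule on noisy ±1 attribute vectors and reads a conjunctive production rule off the largest weights. The data, attribute names, and the keep-two-conditions heuristic are all invented for illustration; this is not the authors' algorithm.

```python
# Minimal sketch (not the paper's algorithm): train a single-layer network with
# Hebb's rule on +/-1 attribute vectors, then read a conjunctive production rule
# off the weights by keeping only the strongest connections.
import numpy as np

def hebb_train(X, y):
    """Hebbian weights for one output unit: w_i = sum_k x_ki * y_k."""
    return X.T @ y

def extract_rule(weights, attribute_names, keep=2):
    """Turn the 'keep' largest-magnitude weights into a conjunctive rule body."""
    idx = np.argsort(-np.abs(weights))[:keep]
    conditions = []
    for i in idx:
        value = "true" if weights[i] > 0 else "false"
        conditions.append(f"{attribute_names[i]} = {value}")
    return "IF " + " AND ".join(conditions) + " THEN class = positive"

# Toy, noisy data: attributes a1, a2 predict the class, a3 is random noise.
rng = np.random.default_rng(0)
X = rng.choice([-1, 1], size=(200, 3))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)   # underlying concept
noise = rng.random(200) < 0.1                # 10% label noise
y[noise] = -y[noise]

w = hebb_train(X, y)
print(extract_rule(w, ["a1", "a2", "a3"]))   # e.g. IF a1 = true AND a2 = true THEN ...
```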

3.
The application of expert systems to various problem domains in business has grown steadily since their introduction. Regardless of the chosen method of development, the most commonly cited problems in developing these systems are the unavailability of both the experts and knowledge engineers and difficulties with the process of acquiring knowledge from domain experts. Within the field of artificial intelligence, this has been called the 'knowledge acquisition' problem and has been identified as the greatest bottleneck in the expert system development process. Simply stated, the problem is how to acquire the specific knowledge for a well-defined problem domain efficiently from one or more experts and represent it in the appropriate computer format. Given the 'paradox of expertise', the experts have often proceduralized their knowledge to the point that they have difficulty in explaining exactly what they know and how they know it. However, empirical research in the field of expert systems reveals that certain knowledge acquisition techniques are significantly more efficient than others in helping to extract certain types of knowledge within specific problem domains. In this paper we present a mapping between these empirical studies and a generic taxonomy of expert system problem domains. In so doing, certain knowledge acquisition techniques can be prescribed based on the problem domain characteristics. With the production and operations management (P/OM) field as the pilot area for the current study, we first examine the range of problem domains and suggest a mapping of P/OM tasks to a generic taxonomy of problem domains. We then describe the most prominent knowledge acquisition techniques. Based on the examination of the existing empirical knowledge acquisition research, we present how the empirical work can be used to provide guidance to developers of expert systems in the field of P/OM.

4.
Most existing expert systems are defined in structured task domains. However, many real-life decision tasks are novel, unstructured and consequential. To support these tasks, expert systems are needed which provide an integrated environment capable of capturing new knowledge by updating the existing knowledge base. This paper describes the incremental development process of an expert system, from the initial gathering of data up to the development of knowledge acquisition tools and knowledge integration methodologies. The expert system developed addresses managerial planning tasks of Greek small-to-medium sized enterprises (SMEs). The manager sets values for parameters specifying environmental and company characteristics. The expert system responds with suggestions on feasible tactics, objectives and strategies. To cope with the changes of planning situations and also to improve the integrity of the knowledge base as the manager gains experience, knowledge acquisition tools have been introduced. These knowledge acquisition tools, which are manipulated directly by the manager, provide the system with additional knowledge and validate the knowledge already embedded in the knowledge base.
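
The following toy sketch illustrates the kind of parameter-to-suggestion mapping described above. The parameters, rules, and suggestions are entirely hypothetical and are not taken from the system's knowledge base.

```python
# Minimal sketch (hypothetical rules, not the system's knowledge base): map
# manager-supplied environment/company parameters to suggested planning moves.
def suggest(params):
    suggestions = []
    if params["market_growth"] == "high" and params["cash_position"] == "strong":
        suggestions.append("objective: expand market share")
    if params["competition"] == "intense":
        suggestions.append("tactic: differentiate on service quality")
    if params["export_experience"] == "none" and params["market_growth"] == "low":
        suggestions.append("strategy: explore export markets gradually")
    return suggestions or ["no specific suggestion; review parameters"]

print(suggest({"market_growth": "high", "cash_position": "strong",
               "competition": "intense", "export_experience": "none"}))
```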

5.
Bayesian Networks have been proposed as an alternative to rule-based systems in domains with uncertainty. Applications in monitoring and control can benefit from this form of knowledge representation. Following the work of Chong and Walley, we explore the possibilities of Bayesian Networks in the Waste Water Treatment Plants (WWTP) monitoring and control domain. We show the advantages of modelling knowledge in such a domain by means of Bayesian networks, put forth new methods for knowledge acquisition, describe their applications to a real waste water treatment plant and comment on the results. We also show how a Bayesian Network learning environment was used in the process and which characteristics of data in the domain suggested new ways of representing knowledge in network form but with uncertainty representation formalisms other than probability. The results of applying a possibilistic extension of current learning methods are also shown and compared.
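
As a rough illustration of why Bayesian networks suit this monitoring setting, the sketch below builds a three-node network over hypothetical WWTP variables and answers a diagnostic query by brute-force enumeration. The variables and probabilities are invented; the paper's plant model and learning environment are not reproduced here.

```python
# Minimal sketch with hypothetical WWTP variables (not the plant model from the
# paper): a three-node Bayesian network HighLoad -> PoorAeration -> BadEffluent,
# queried by brute-force enumeration of the joint distribution.
from itertools import product

p_load = {True: 0.3, False: 0.7}                 # P(HighLoad)
p_aer = {True: {True: 0.6, False: 0.4},          # P(PoorAeration | HighLoad)
         False: {True: 0.1, False: 0.9}}
p_eff = {True: {True: 0.8, False: 0.2},          # P(BadEffluent | PoorAeration)
         False: {True: 0.05, False: 0.95}}

def joint(load, aer, eff):
    return p_load[load] * p_aer[load][aer] * p_eff[aer][eff]

# P(HighLoad | BadEffluent): sum the joint over the hidden variable and normalize.
num = sum(joint(True, aer, True) for aer in (True, False))
den = sum(joint(load, aer, True) for load, aer in product((True, False), repeat=2))
print(f"P(HighLoad | BadEffluent) = {num / den:.3f}")
```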

6.
In order to solve a complicated problem one must use knowledge from different domains. Therefore, if we want to automate the solution of these problems, we have to help the knowledge-based systems that correspond to these domains cooperate, that is, communicate facts and conclusions to each other in the process of decision making. One of the main obstacles to such cooperation is the fact that different intelligent systems use different methods of knowledge acquisition and different methods and formalisms for uncertainty representation. So we need an interface f, “translating” the values x, y, which represent uncertainty of the experts’ knowledge in one system, into the values f(x), f(y) appropriate for another one.

In the present report we formulate the problem of designing such an interface as a mathematical problem, and solve it. We show that the interface must be fractionally linear: f(x) = (ax + b)/(cx + d).
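
A minimal sketch of such an interface, assuming illustrative coefficients rather than ones fitted to any particular pair of systems, is shown below; it simply implements the fractionally linear form stated in the abstract.

```python
# Minimal sketch: an uncertainty interface of the fractionally linear form
# f(x) = (a*x + b) / (c*x + d) stated in the abstract; the coefficients below
# are illustrative only and would be chosen for the two systems being connected.
def make_interface(a, b, c, d):
    if a * d - b * c == 0:
        raise ValueError("degenerate map: a*d - b*c must be nonzero")
    return lambda x: (a * x + b) / (c * x + d)

# Example: map certainty values from [0, 1] in system A onto [0, 1] in system B
# with a nonlinear compression of high certainties.
f = make_interface(a=1.0, b=0.0, c=1.0, d=1.0)   # f(x) = x / (x + 1)
translate = lambda x: 2.0 * f(x)                 # rescaled so translate(1.0) == 1.0
for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"{x:.2f} -> {translate(x):.3f}")
```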

7.
This paper is in a form unconventional in modern journals but traditional for the discussion of foundational questions: a dialogue. It is a form that makes it possible to contrast two deeply held but incompatible views, each with its standard forms of defence, in order to seek common ground and make the differences more precise. In artificial intelligence, or at least in the major part of it still committed to symbolic representations, there is a long history of discussion of the origin and nature of the symbols we use in representations, symbols which normally look like words, English words in fact, but which most researchers deny are such words, since to concede that would put in question the abstract nature of the representation. In what follows, we examine our common ground and then diverge over five specific questions on the issue of representations. The discussion focuses on symbol use in representations of language, because there the similarity is most acute—between the representation and the represented—but the issues are general and apply to symbolic AI as such.

8.
Cyber–physical systems are becoming increasingly complex. In these advanced systems, the different engineering domains involved in the design process become more and more intertwined. Therefore, a traditional (sequential) design process becomes inefficient in finding good design options. Instead, an integrated approach is needed where parameters in multiple different engineering domains can be chosen, evaluated, and optimized to achieve a good overall solution. However, in such an approach, the combined design space becomes vast. As such, methods are needed to mitigate this problem. In this paper, we show a method for systematically capturing and updating domain knowledge in the context of a co-design process involving different engineering domains, i.e. control and embedded. We rely on ontologies to reason about the relationships between parameters in the different domains. This allows us to derive a stepwise design space exploration workflow where this domain knowledge is used to quickly reduce the design space to a subset of likely good candidates. We illustrate our approach by applying it to the design space exploration process for an advanced electric motor control system and its deployment on embedded hardware.
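
The sketch below gives a rough feel for stepwise design space reduction across a control and an embedded domain. The parameters, constraints, and numbers are hypothetical stand-ins for the relations an ontology would supply; this is not the paper's tooling.

```python
# Minimal sketch (hypothetical parameters, not the paper's tooling): stepwise
# pruning of a combined control/embedded design space. Cross-domain rules that
# an ontology would normally supply are written here as plain predicates.
from itertools import product

sampling_periods_ms = [1, 2, 5, 10]      # control domain
cpu_freqs_mhz = [50, 100, 200]           # embedded domain
task_wcet_ms_at_100mhz = 1.5             # assumed worst-case execution time

def schedulable(period_ms, freq_mhz):
    # The control task must fit within its sampling period on the chosen CPU.
    wcet = task_wcet_ms_at_100mhz * 100 / freq_mhz
    return wcet <= period_ms

def control_quality_ok(period_ms):
    # Faster sampling assumed necessary for acceptable control performance.
    return period_ms <= 5

candidates = list(product(sampling_periods_ms, cpu_freqs_mhz))
step1 = [c for c in candidates if control_quality_ok(c[0])]   # prune in one domain
step2 = [c for c in step1 if schedulable(*c)]                 # then across domains
print(f"{len(candidates)} -> {len(step1)} -> {len(step2)} candidates:", step2)
```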

9.
The main goal of this paper is to illustrate applications of some recent developments in the theory of logic programming to knowledge representation and reasoning in common sense domains. We are especially interested in better understanding the process of development of such representations together with their specifications. We build on the previous work of Gelfond and Przymusinska in which the authors suggest that, at least in some cases, a formal specification of the domain can be obtained from specifications of its parts by applying certain operators on specifications called specification constructors and that a better understanding of these operators can substantially facilitate the programming process by providing the programmer with useful heuristic guidance. We discuss some of these specification constructors and their realization theorems which allow us to transform specifications built by applying these constructors to declarative logic programs. Proofs of two such theorems, previously announced in a paper by Gelfond and Gabaldon, appear here for the first time. The method of specifying knowledge representation problems via specification constructors and of using these specifications for the development of their logic programming representations is illustrated by the design of a simple, but fairly powerful program representing simple hierarchical domains.

10.
Refinement-closed security properties allow the verification of systems for all possible implementations. Some systems, however, have refinements that do not represent possible implementations. In particular, real instantiations of abstract systems comprising security-critical components surrounded by maximally hostile unrefined components are often characterised only by compositions of refinements of the abstract system's components, rather than all refinements of the abstract system. In this case, refinement-closed security properties that examine multiple behaviours of a system at once can be falsely violated by the presence of inconsistent pairs of behaviour arising from different, incompatible refinements of the system's components. We show how to weaken a class of such properties, which includes both information flow and causation properties, to allow them to be applied to these sorts of abstract systems. The weakened properties ignore all pairs of inconsistent behaviour that would have violated the original property from which they are derived. We also show how to adapt existing automated tests for these properties to allow them to be used to test for their weakened counterparts instead. This enables greater flexibility in the application of these sorts of properties to compositions of nondeterministic components.

11.
Adjectives are common in natural language, and their usage and semantics have been studied broadly. In recent years, with the rapid growth of knowledge bases (KBs), many knowledge-based question answering (KBQA) systems have been developed to answer users’ natural language questions over KBs. A fundamental task of such systems is to transform natural language questions into structural queries, e.g., SPARQL queries. Thus, such systems require knowledge about how natural language expressions are represented in KBs, including adjectives. In this paper, we specifically address the problem of representing adjectives over KBs. We propose a novel approach, called Adj2SP, to represent adjectives as SPARQL query patterns. Adj2SP contains a statistic-based approach and a neural network-based approach, both of which can effectively reduce the search space for adjective representations and overcome the lexical gap between input adjectives and their target representations. Two adjective representation datasets are built for evaluation, with adjectives used in QALD and Yahoo! Answers, as well as their representations over DBpedia. Experimental results show that Adj2SP can generate representations of high quality and significantly outperform several alternative approaches in F1-score. Furthermore, we publish Lark, a lexicon for adjective representations over KBs. Current KBQA systems show an improvement of over 24% in F1-score by integrating Adj2SP.
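
As an illustration of what an adjective-to-SPARQL mapping can look like, the sketch below turns a few adjectives into filter patterns over DBpedia-style properties. The lexicon entries, properties, and thresholds are hypothetical and are not taken from Adj2SP or Lark.

```python
# Minimal sketch (hypothetical lexicon entries, not Lark/Adj2SP output): map an
# adjective to a SPARQL graph pattern over DBpedia-style properties.
ADJECTIVE_PATTERNS = {
    # adjective: (property IRI, comparison, illustrative threshold)
    "tall":  ("dbo:height",       ">", 2.0),
    "cheap": ("dbo:price",        "<", 100),
    "old":   ("dbo:foundingYear", "<", 1900),
}

def adjective_to_sparql(adjective, target_var="?x"):
    prop, op, threshold = ADJECTIVE_PATTERNS[adjective]
    value_var = "?v"
    return (
        f"SELECT {target_var} WHERE {{\n"
        f"  {target_var} {prop} {value_var} .\n"
        f"  FILTER({value_var} {op} {threshold})\n"
        f"}}"
    )

print(adjective_to_sparql("tall"))
```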

12.
Chalak K, White H. Neural Computation, 2012, 24(7): 1611-1668.
We study the connections between causal relations and conditional independence within the settable systems extension of the Pearl causal model (PCM). Our analysis clearly distinguishes between causal notions and probabilistic notions, and it does not formally rely on graphical representations. As a foundation, we provide definitions in terms of suitable functional dependence for direct causality and for indirect and total causality via and exclusive of a set of variables. Based on these foundations, we provide causal and stochastic conditions formally characterizing conditional dependence among random vectors of interest in structural systems by stating and proving the conditional Reichenbach principle of common cause, obtaining the classical Reichenbach principle as a corollary. We apply the conditional Reichenbach principle to show that the useful tools of d-separation and D-separation can be employed to establish conditional independence within suitably restricted settable systems analogous to Markovian PCMs.
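
For reference, the classical Reichenbach principle of the common cause, which the abstract says is obtained as a corollary, can be stated as follows (textbook form; the paper's conditional generalization is not reproduced here):

```latex
% Classical Reichenbach principle of the common cause (textbook form; the
% paper's conditional version generalizes this and is not reproduced here).
If $X \not\perp\!\!\!\perp Y$ (i.e., $X$ and $Y$ are probabilistically dependent),
then either $X$ causes $Y$, or $Y$ causes $X$, or there exists a common cause $Z$
such that
\[
  X \perp\!\!\!\perp Y \mid Z .
\]
```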

13.
OBJECTIVES: The goal of this article is to identify some of the major trends and findings in expertise research and their connections to human factors. BACKGROUND: Progress in the study of superior human performance has come from improved methods of measuring expertise and the development of better tools for revealing the mechanisms that support expert performance, such as protocol analysis and eye tracking. METHODS: We review some of the challenges of capturing superior human performance in the laboratory and the means by which the expert performance approach may overcome such challenges. We then discuss applications of the expert performance approach to a handful of domains that have long been of interest to human factors researchers. RESULTS: Experts depend heavily on domain-specific knowledge for superior performance, and such knowledge enables the expert to anticipate and prepare for future actions more efficiently. Training programs designed to focus learners' attention on task-related knowledge and skills critical to expert performance have shown promise in facilitating skill acquisition among nonexperts and in reducing errors by experts on representative tasks. CONCLUSIONS: Although significant challenges remain, there is encouraging progress in domains such as sports, aviation, and medicine in understanding some of the mechanisms underlying human expertise and in structuring training and tools to improve skilled performance. APPLICATIONS: Knowledge engineering techniques can capture expert knowledge and preserve it for organizations and for the development of expert systems. Understanding the mechanisms that underlie expert performance may provide insights into the structuring of better training programs for improving skill and in designing systems to support professional expertise.

14.
Researchers and educators continue to explore how to assist students in the acquisition of conceptual understanding of complex science topics. While hypermedia learning environments (HLEs) afford unique opportunities to display multiple representations of these often abstract topics, students who do not engage in self-regulated learning (SRL) with HLEs often fail to achieve conceptual understanding. There is a lack of research regarding how student characteristics, such as prior knowledge and students’ implicit theory of intelligence (ITI), interact with SRL to influence academic performance. In this study, structural equation modeling was used to investigate these issues. It was found that prior knowledge and ITI were related to SRL and performance, and that SRL acted as a benevolent moderator, enhancing the positive effects of prior knowledge upon learning, and diminishing the negative effects of having a maladaptive ITI.

15.
The formulation of a problem may be defined as a process of acquisition and organization of knowledge related to a given situation, on which a decision maker projects some action. The assistance in problem formulation that we may expect within decision support systems is difficult to design and to implement. This is mainly due to the frequent lack of attention to a sufficiently formalized conceptual framework which would consider the decision with a more cognitive-science-oriented approach. In the first part, we will present an instrumental model for the study of decision processes as an attempt to simulate the cognitive process of knowledge acquisition and organization carried out by a decision maker facing a problematic situation. Considering its epistemological foundations, this model can be named a “cognitivist model”. Within this model, the decision is defined as a cognitive construction which we call a “decisional construct”. It consists of the elaboration of one or several abstract representations of the problematic situation (formulation phase), and the design of operational models (solving phase). In the second part, we will present the COGITA project, which consists of the design and realization of an environment for the development of problem formulation assistance systems. The modelling and simulation of cognitive processes call for relevant techniques originating either in artificial intelligence or in connectionism. We will show the main characteristics, potentials, limits and complementarity of these techniques and why their integration is fundamental and necessary to the simulation of the cognitive process associated with formulation. COGITA is a hybrid system currently under development which integrates symbolic artificial intelligence techniques and connectionist models in a cooperative hybridization, the general architecture of which is presented.

16.
The importance of the efforts to bridge the gap between the connectionist and symbolic paradigms of artificial intelligence has been widely recognized. The merging of theory (background knowledge) and data learning (learning from examples) into neural-symbolic systems has indicated that such a learning system is more effective than purely symbolic or purely connectionist systems. Until recently, however, neural-symbolic systems were not able to fully represent, reason, and learn expressive languages other than classical propositional and fragments of first-order logic. In this article, we show that nonclassical logics, in particular propositional temporal logic and combinations of temporal and epistemic (modal) reasoning, can be effectively computed by artificial neural networks. We present the language of a connectionist temporal logic of knowledge (CTLK). We then present a temporal algorithm that translates CTLK theories into ensembles of neural networks and prove that the translation is correct. Finally, we apply CTLK to the muddy children puzzle, which has been widely used as a test-bed for distributed knowledge representation. We provide a complete solution to the puzzle with the use of simple neural networks, capable of reasoning about knowledge evolution in time and of knowledge acquisition through learning.
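
The sketch below illustrates only the core neural-symbolic idea that such translations build on: a propositional rule can be computed exactly by a threshold neuron. It is a generic illustration, not the CTLK translation algorithm.

```python
# Minimal sketch of the core neural-symbolic idea (not the CTLK algorithm itself):
# a single threshold neuron wired so that it fires exactly when the body of the
# rule "A AND B -> C" is satisfied.
def rule_neuron(inputs, weights, threshold):
    """Binary threshold unit: fires iff the weighted sum reaches the threshold."""
    activation = sum(w * x for w, x in zip(weights, inputs))
    return 1 if activation >= threshold else 0

# For a conjunction of n positive literals, use unit weights and threshold n.
A_and_B_implies_C = lambda a, b: rule_neuron([a, b], weights=[1, 1], threshold=2)

for a in (0, 1):
    for b in (0, 1):
        print(f"A={a}, B={b} -> C={A_and_B_implies_C(a, b)}")
```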

17.
A knowledge acquisition method based on causality diagrams
Wang Hongchun. Computer Simulation (计算机仿真), 2006, 23(3): 126-128.
Production rules and causality diagrams are two methods of knowledge representation. Given the shortcomings of production rules in expressing knowledge and in reasoning, it is necessary to find a method that represents knowledge and supports reasoning better; causality diagrams express knowledge intuitively and allow flexible, convenient reasoning. Based on the correspondence between fuzzy production rules and causality diagrams, and between composite fuzzy production rules and causality diagrams containing AND and OR gates, this paper presents a method and procedure for converting knowledge expressed as a set of fuzzy production rules into a more compact and intuitive causality diagram representation. This also yields a knowledge acquisition method for causality diagrams, and an example of the conversion is given.
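
A rough sketch of the kind of conversion described above is given below: composite fuzzy rule bodies become AND gates and each rule's certainty factor is attached to the resulting causal link. The data structure and example rules are invented for illustration and do not follow the paper's exact procedure.

```python
# Minimal sketch (not the paper's exact procedure): convert fuzzy production
# rules into a causality-diagram-like graph, using an AND gate for composite
# rule bodies and attaching the rule's certainty factor to the causal link.
def rules_to_causality_graph(rules):
    """rules: list of (body_conditions, head, certainty_factor)."""
    graph = {"nodes": set(), "gates": [], "links": []}
    for i, (body, head, cf) in enumerate(rules):
        graph["nodes"].update(body)
        graph["nodes"].add(head)
        if len(body) == 1:
            graph["links"].append((body[0], head, cf))
        else:
            gate = f"AND_{i}"                      # composite body -> AND gate
            graph["gates"].append((gate, tuple(body)))
            graph["links"].append((gate, head, cf))
    return graph

rules = [
    (["valve_stuck"], "low_flow", 0.9),
    (["low_flow", "pump_on"], "pressure_drop", 0.8),   # composite fuzzy rule
]
print(rules_to_causality_graph(rules))
```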

18.
A simple approach for point-based object capturing and rendering
Point-based object representations are a powerful alternative to traditional polygonal object representations. Capturing 3D geometry is a mission-critical content acquisition technique in application domains such as virtual reality, CAD/CAM, and physical asset management. We describe SPOC, a simple, point-based object capturing system we've developed to automatically capture and process 3D geometry for point-based image rendering. We also identify the problems involved and describe the technical solutions we've implemented. Finally, we address several algorithmic issues in capturing point models using a simple digital camera and turntable setup, and in processing and rendering point clouds.
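
As a rough illustration of point-based rendering, the sketch below orthographically splats a random point cloud into an image buffer with a z-buffer; it is not the SPOC pipeline.

```python
# Minimal sketch (not the SPOC pipeline): orthographically splat a point cloud
# into an image buffer, keeping the nearest point per pixel with a z-buffer.
import numpy as np

def splat(points, colors, width=64, height=64):
    """points: (N, 3) array with x, y in [0, 1] and z as depth; colors: (N, 3)."""
    image = np.zeros((height, width, 3))
    zbuf = np.full((height, width), np.inf)
    px = np.clip((points[:, 0] * (width - 1)).astype(int), 0, width - 1)
    py = np.clip((points[:, 1] * (height - 1)).astype(int), 0, height - 1)
    for x, y, z, c in zip(px, py, points[:, 2], colors):
        if z < zbuf[y, x]:                 # nearer point wins the pixel
            zbuf[y, x] = z
            image[y, x] = c
    return image

rng = np.random.default_rng(1)
pts = rng.random((5000, 3))
cols = rng.random((5000, 3))
img = splat(pts, cols)
print(img.shape, float(img.max()))
```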

19.
Model checkers were originally developed to support the formal verification of high-level design models of distributed system designs. Over the years, they have become unmatched in precision and performance in this domain. Research in model checking has meanwhile moved towards methods that allow us to reason also about implementation level artifacts (e.g., software code) directly, instead of hand-crafted representations of those artifacts. This does not mean that there is no longer a place for the use of high-level models, but it does mean that such models are used in a different way today. In the approach that we describe here, high-level models are used to represent the environment for which the code is to be verified, but not the application itself. The code of the application is now executed as is by the model checker, while using powerful forms of abstraction on-the-fly to build the abstract state space that guides the verification process. This model-driven code checking method allows us to verify implementation level code efficiently for high-level safety and liveness properties. In this paper, we give an overview of the methodology that supports this new paradigm of code verification.

20.
The metamodeling platform ADONIS was originally implemented for use in business process management. Its method independence and extensive customization functionalities also allow for its application in many other areas, such as strategic management, e-learning, object-oriented systems engineering, knowledge management, and numerous others. In computer science, the Unified Modeling Language (UML) is the dominant standard for describing systems and behaviours. In this article it is shown how the abstract and concrete syntax of UML statechart diagrams can be described using the metamodeling concepts of ADONIS.
