Similar Documents
20 similar documents found (search time: 15 ms)
1.
A rule-based approach for the automatic enforcement of consistency constraints is presented. In contrast to existing approaches that compile consistency checks into application programs, this approach centralizes consistency enforcement in a separate module called a knowledge-base management system. Exception handlers for constraint violations are represented as rule entities in the knowledge base. For this purpose, a new form of production rule called the activation pattern controlled rule is introduced: in contrast to classical forward-chaining schemes, activation pattern controlled rules are triggered by the intent to apply a specific operation, but not necessarily by the result of applying this operation. Techniques for implementing this approach are discussed, and experiments in speeding up system performance are described. Furthermore, an argument is made for more tolerant consistency enforcement strategies, and how they can be integrated into the rule-based approach to consistency enforcement is discussed.
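A minimal sketch of the idea in Python, with hypothetical names (`ActivationPatternRule`, `KnowledgeBaseManager`) not taken from the paper: rules are attached to operations and fire on the *intent* to apply one, so a violation handler can run before the operation executes.

```python
# Sketch only: rules trigger on the intent to apply an operation,
# not on the result of applying it.
class ActivationPatternRule:
    def __init__(self, op_name, guard, handler):
        self.op_name = op_name   # operation the rule is attached to
        self.guard = guard       # consistency predicate over (state, args)
        self.handler = handler   # exception handler for a violation

class KnowledgeBaseManager:
    def __init__(self):
        self.rules = []
        self.state = {}

    def add_rule(self, rule):
        self.rules.append(rule)

    def apply(self, op_name, operation, *args):
        # Check rules matched by the intended operation before executing it.
        for rule in self.rules:
            if rule.op_name == op_name and not rule.guard(self.state, args):
                return rule.handler(self.state, args)  # constraint violated
        return operation(self.state, *args)            # consistent: execute

kb = KnowledgeBaseManager()
kb.add_rule(ActivationPatternRule(
    "set_salary",
    guard=lambda s, a: a[1] >= 0,        # salaries must be non-negative
    handler=lambda s, a: "rejected: negative salary"))
ok = kb.apply("set_salary", lambda s, emp, v: s.update({emp: v}) or "ok",
              "alice", 50000)
bad = kb.apply("set_salary", lambda s, emp, v: s.update({emp: v}) or "ok",
               "bob", -1)
```

Centralizing the check in `apply` is what removes the need to compile consistency checks into each application program.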

2.
Tables appearing in natural language documents provide a compact method for presenting relational information in an immediate and intuitive manner, while simultaneously organizing and indexing that information. Despite their ubiquity and obvious utility, tables have not received the same level of formal characterization enjoyed by sentential text. Rather, they are modeled in terms of geometry, simple hierarchies of strings and database-like relational structures. Tables have been the focus of a large volume of research in the document image analysis field and, lately, have received particular attention from researchers interested in extracting information from non-trivial elements of web pages. This paper provides a framework for representing tables at both the semantic and structural levels. It presents a representation of the indexing structures present in tables and the relationship between these structures and the underlying categories. Matthew Hurst graduated from Edinburgh University in 1992 and completed an MPhil at Cambridge in Computer Speech and Language Processing. He then worked at The University of Edinburgh on a number of projects involving text and document analysis before enrolling in the PhD programme. While studying for his PhD, he completed a European Science and Technology Fellowship in Japan. After working for IBM Research, Tokyo, he moved to the United States of America to work for a number of companies with unique applications utilizing applied natural language processing and document analysis. He is currently the Director of Science and Innovation at Nielsen BuzzMetrics.

3.
4.
We develop a formal framework to give computer programs an abstract interpretation as information transformers. The quantitative relation between input and output information is then investigated. Our theory is based on information domains, a refinement of the classical domains used in denotational semantics, and on the theory of abstract interpretation of functional languages.

5.
6.
《Information Systems》2005,30(6):444-466
Multimedia metacomputing is a new approach to the management and processing of multimedia data in web-based information systems. It offers high flexibility and openness while shielding applications from system internals. Starting with the vision of a completely open and globally distributed multimedia information system, we consider the abstraction concepts required, especially transformation independence, and an appropriate semantic model. Thus, the major focus of this paper is on the abstract data and processing model called VirtualMedia, which provides a transformation-independence framework for multimedia processing. In particular, we describe how transformation requests are represented and processed, exploiting semantic equivalence relations on filter graphs and redundant materialization, finally yielding instantiatable plans for materializing the requested media object(s) at the client.

7.
8.
Imen Zghidi 《Constraints》2017,22(1):101-102
In most industrial contexts, decisions are made based on incomplete information, because decision makers cannot be certain of the future behavior of the factors that will affect the outcome of the options under consideration. Stochastic Constraint Satisfaction Problems provide a powerful modeling framework for problems in which one must make decisions under uncertainty. In these stochastic problems, the uncertainty is modeled with discrete random variables that capture uncontrollable factors such as customer demands, machine processing times, house prices, etc. Each of these random variables takes on one of a set of possible values, each with an associated probability; they model factors outside the control of the decision maker, who knows only their probability distribution functions, which can be forecast, for instance, from the past behavior of such factors. There are also controllable variables, named decision variables, which model the set of possible choices for the decisions to be made. Finally, such problems comprise chance constraints, which express the relationship between random and decision variables and must be satisfied within a satisfaction probability threshold, since finding decisions that will always satisfy the constraints in an uncertain environment is almost impossible. If the random variables' support set is infinite, the number of scenarios is infinite, and finding a solution in such cases is impossible in general. In this thesis, within the context of an infinite set of scenarios, we propose a novel notion of statistical consistency, which lifts the notion of consistency of deterministic constraints to infinite chance constraints.
The essence of this notion is the ability to make an inference, in the presence of infinitely many scenarios in an uncertain environment, based on a restricted finite subset of scenarios, with a certain confidence level and a threshold error. The confidence level is the probability that characterises the extent to which our inference, based on a subset of scenarios, is correct, whereas the threshold error is the error range we can tolerate while making such an inference. Statistical consistency acknowledges that making a perfect inference in an uncertain environment with an infinite number of scenarios is impossible; with its reliance on a limited number of scenarios, a confidence level, and a threshold error, it constitutes a valid and appropriate practical path for tackling infinite chance constraints. We design two novel approaches based on confidence intervals to enforce statistical consistency, as well as a third novel approach based on hypothesis testing. We analyze the methods theoretically as well as experimentally. Our empirical evaluation shows the weaknesses and strengths of each of the three methods in making a correct inference from a restricted subset of scenarios. Overall, while the first two methods make a correct inference in most cases, the third is superior, effective, and robust in all cases.
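The abstract does not spell out the thesis's confidence-interval procedures; the sketch below shows one plausible shape, deciding a chance constraint against a satisfaction threshold by sampling a finite subset of scenarios and applying a Hoeffding bound (the function names and the choice of bound are assumptions, not the thesis's method).

```python
import math
import random

def statistically_consistent(chance_constraint, sample_scenario, theta,
                             confidence=0.95, n_samples=10_000):
    """Decide whether P(constraint holds) >= theta, from n_samples sampled
    scenarios, using a Hoeffding-style confidence interval."""
    hits = sum(chance_constraint(sample_scenario()) for _ in range(n_samples))
    p_hat = hits / n_samples
    # Hoeffding: P(|p_hat - p| > eps) <= 2 * exp(-2 * n * eps**2)
    eps = math.sqrt(math.log(2 / (1 - confidence)) / (2 * n_samples))
    if p_hat - eps >= theta:
        return True       # satisfied, at the required confidence level
    if p_hat + eps < theta:
        return False      # violated, at the required confidence level
    return None           # inconclusive at this sample size (threshold error)

random.seed(0)
# Toy chance constraint: uncertain demand (uniform on [0,1)) stays below 0.9,
# with required satisfaction threshold theta = 0.5.
result = statistically_consistent(lambda d: d < 0.9, random.random, theta=0.5)
```

The `None` branch makes the tolerated error explicit: when the interval straddles the threshold, no inference is made at the requested confidence.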

9.
Virtual Reality (VR), by its nature and characteristics, is of specific interest to the AI community, particularly in the domains of Storytelling and Intelligent Characters. We argue that VR must be considered a particular narrative medium alongside Theatre, Literature or Cinema. This paper reviews relevant work in narrative theory from Plato onwards, including the work and theories of literary critics [1], cinema critics [2–4] and theatrical dramaturges [5], and analyses the specific characteristics of VR relevant to this theory. Less studied media such as Live Role Playing Games, improvisational drama and participatory drama are also considered. Finally, this paper argues for a participatory, process-oriented narrative, with particular attention to the specificities and particularities of stories and their possible representation, adapted to the narrative medium of Virtual Reality.

10.
Towards a general theory of topological maps

11.
Turi and Plotkin gave a precise mathematical formulation of a notion of structural operational semantics in their paper “Towards a mathematical operational semantics.” Starting from that definition and at the level of generality of that definition, we give a mathematical formulation of some of the basic constructions one makes with structural operational semantics. In particular, given a single-step operational semantics, as is the spirit of their work, one composes transitions and considers streams of transitions in order to study the dynamics induced by the operational semantics. In all their leading examples, it is obvious that one can do that and it is obvious how to do it. But if their definition is to be taken seriously, one needs to be able to make such constructions at the level of generality of their definition rather than case-by-case. So this paper does so for several of the basic constructions associated with structural operational semantics, in particular those required in order to speak of a stream of transitions and hence of dynamics.
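At a far lower level of generality than the paper's categorical setting, the basic construction it generalizes can be illustrated concretely: unfolding a single-step transition function into the stream of states it induces. This is a toy sketch of the idea, not the coalgebraic construction itself.

```python
from itertools import islice

def stream(step, state):
    """Unfold a single-step transition function into the stream of states
    it induces -- the dynamics of the operational semantics."""
    while state is not None:       # None marks a terminated computation
        yield state
        state = step(state)

# Toy single-step semantics: a countdown program.
def countdown_step(n):
    return n - 1 if n > 0 else None

trace = list(islice(stream(countdown_step, 3), 10))  # the induced dynamics
```

The paper's point is that composing `step` with itself and forming such streams must be definable once, at the generality of the Turi–Plotkin definition, rather than re-verified for each example.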

12.
13.
Parsimonious covering offers an alternative to rules for building diagnostic expert systems. Abductive paradigms, such as parsimonious covering, are a departure from the forward-chaining, rule-based approach, which is based on deduction. Parsimonious covering addresses weaknesses of rule-based systems where the diagnosis may contain multiple faults or disorders, or where the need to include all the necessary context for each rule's application in the antecedent clauses of each rule would make the representation of the knowledge base too large or overly complex.

In this paper, we compare the notions of deterministic covering and the probabilistic causal model with two fuzzy analogies: fuzzy subsethood and fuzzy similarity. Monotonic upper and lower bounds for fuzzy similarity are derived, and pruning opportunities are identified for search through the power set of disorders, given a measured, crisp manifestation set.
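As a toy illustration of the deterministic (crisp) case, a parsimonious cover can be computed as a minimum-cardinality set of disorders whose combined effects cover the observed manifestations. The causal table here is invented for the example; real systems search the power set of disorders with the pruning the paper discusses.

```python
from itertools import combinations

def parsimonious_covers(causes, manifestations):
    """Return all minimum-cardinality sets of disorders whose combined
    effects cover the observed manifestation set (minimality as parsimony)."""
    disorders = list(causes)
    for size in range(1, len(disorders) + 1):
        covers = [set(combo) for combo in combinations(disorders, size)
                  if manifestations <= set().union(*(causes[d] for d in combo))]
        if covers:
            return covers          # smallest covers are found first
    return []

# Hypothetical causal knowledge: disorder -> manifestations it can cause.
causes = {"flu": {"fever", "cough"},
          "cold": {"cough", "sneeze"},
          "strep": {"fever", "sore_throat"}}
observed = {"fever", "cough", "sneeze"}
covers = parsimonious_covers(causes, observed)  # the minimal explanations
```

Note how the diagnosis naturally contains multiple disorders at once, which is the weakness of single-fault rule chains that covering addresses.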


14.
A formalism for reasoning about actions is proposed that is based on a temporal logic. It allows a much wider range of actions to be described than with previous approaches such as the situation calculus. This formalism is then used to characterize the different types of events, processes, actions, and properties that can be described in simple English sentences. In addressing this problem, we consider actions that involve non-activity as well as actions that can only be defined in terms of the beliefs and intentions of the actors. Finally, a framework for planning in a dynamic world with external events and multiple agents is suggested.

15.
In this paper a theory of delegation is presented. There are at least three reasons for developing such a theory. First, one of the most relevant notions of “agent” is based on the notions of “task” and of “on behalf of”; to ground this notion, a theory of delegation among agents is needed. Second, the notion of autonomy should be based on different kinds and levels of delegation. Third, the entire theory of cooperation and collaboration requires the definition of the two complementary attitudes of goal delegation and adoption linking collaborating agents.

After motivating the necessity for a principled theory of delegation (and adoption), the paper presents a plan-based approach to this theory. We analyze several dimensions of delegation/adoption (based on the interaction between the agents, the specification of the task, the possibility of subdelegation, the delegation of control, and the help levels). The agent's autonomy and levels of agency are then deduced. We describe the modelling of the client from the contractor's point of view and vice versa, with their differences, and the notion of trust that directly derives from this modelling.

Finally, a series of possible conflicts between client and contractor are considered: in particular collaborative conflicts, which stem from the contractor's intention to help the client beyond its request or delegation and to exploit its own knowledge and intelligence (reasoning, problem solving, planning, and decision skills) for the client itself.


16.
The paper outlines the approach to the analysis of deontic conditionals taken in earlier work of Jones and Pörn, comparing it briefly with two main trends within dyadic deontic logic, and discussing problems associated with the augmentation principle and the factual detachment principle. A modification of the Jones and Pörn system is then presented, using a classical but not normal (in the sense of Chellas) deontic modality, to provide the basis for an alternative analysis of deontic conditionals. This new analysis validates neither the factual detachment nor the augmentation principles. However, influenced by the approach of Delgrande to default reasoning, it is shown how a restricted form of factual detachment might be accommodated within the revised system.

17.
The present paper defines a numerical sequence on the integers as being of random type if it has a serial correlation function tending to 0 at infinity, and if it is completely distributed. As well as the theoretical probabilistic content, techniques useful to statisticians and information scientists are given. A criterion is developed to verify whether a given real sequence includes a singular part. If not, techniques of harmonic analysis are used to extract from the sequence its component of random type with regard to its correlation function. Conversely, an algorithm is given to produce a sequence of random type, having given distribution functions and a correlation function that is stochastically as near as desired to a given one. The algorithm consists of deterministic mathematical techniques; the sequence can thus be computer generated.
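The serial correlation function the definition relies on can be estimated empirically for a finite sample. The sketch below (the function name and the normalization choice are ours, not the paper's) checks that a pseudo-random sequence has a lag-1 correlation near 0, as a random-type sequence should.

```python
import random

def serial_correlation(seq, lag):
    """Empirical serial (auto)correlation of a finite numerical sequence,
    normalized so that lag 0 gives exactly 1."""
    n = len(seq)
    mean = sum(seq) / n
    var = sum((x - mean) ** 2 for x in seq) / n
    cov = sum((seq[i] - mean) * (seq[i + lag] - mean)
              for i in range(n - lag)) / (n - lag)
    return cov / var

random.seed(1)
noise = [random.random() for _ in range(10_000)]
c0 = serial_correlation(noise, 0)   # exactly 1 by construction
c1 = serial_correlation(noise, 1)   # near 0 for a random-type sequence
```

For a sequence of random type the estimates at increasing lags should all hover near 0, which is the empirical face of "tending to 0 at infinity."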

18.

19.
In the n-valued Łukasiewicz propositional logic system, the concept of the random consistency degree of a theory is proposed, and it is pointed out that the random consistency degree of a theory depends on the choice of the probability distribution sequence. Finally, it is proved that in the n-valued random logic metric space, the random consistency degree of a theory preserves the basic properties it has in the classical logic metric space.

20.