Similar Documents
Found 20 similar documents (search time: 15 ms)
1.
Nelson and Oppen provided a methodology for modularly combining decision procedures for individual theories to construct a decision procedure for a combination of theories. In addition to providing a check for satisfiability, the individual decision procedures need to provide additional functionalities, including equality generation. In this paper, we propose a decision procedure for a conjunction of difference constraints over the rationals (where the atomic formulas are of the form x ≤ y + c or x < y + c). The procedure extends any negative cycle detection algorithm (such as the Bellman-Ford algorithm) to (1) generate equalities between all pairs of variables, (2) produce proofs, and (3) generate models that can be extended by other theories in a Nelson-Oppen framework. All the operations mentioned above can be performed with only a linear overhead over the cycle detection algorithm in the average case.
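The reduction this abstract describes can be sketched concretely: each constraint x - y ≤ c becomes a weighted edge, a negative cycle signals unsatisfiability, and zero-weight shortest paths in both directions between two variables signal an entailed equality. The sketch below is illustrative only (the function name `solve` and the all-pairs Floyd-Warshall formulation are our assumptions; the paper builds on single-source negative cycle detectors such as Bellman-Ford):

```python
INF = float("inf")

def solve(variables, constraints):
    """constraints: list of (x, y, c) meaning x - y <= c.
    Returns (satisfiable, implied_equalities)."""
    idx = {v: i for i, v in enumerate(variables)}
    n = len(variables)
    d = [[INF] * n for _ in range(n)]
    for i in range(n):
        d[i][i] = 0
    for x, y, c in constraints:
        i, j = idx[y], idx[x]              # edge y -> x with weight c
        d[i][j] = min(d[i][j], c)
    # All-pairs shortest paths; a Bellman-Ford pass from a virtual
    # source suffices for the satisfiability check alone.
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    if any(d[i][i] < 0 for i in range(n)):  # negative cycle found
        return False, []
    # x = y is entailed exactly when both x - y <= 0 and y - x <= 0.
    eqs = [(variables[i], variables[j])
           for i in range(n) for j in range(i + 1, n)
           if d[i][j] == 0 and d[j][i] == 0]
    return True, eqs
```

For example, the constraints x - y ≤ 0, y - x ≤ 0 and z - x ≤ -1 are satisfiable and entail x = y, while a - b ≤ -1 together with b - a ≤ 0 forms a negative cycle.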

2.
Verification problems can often be encoded as first-order validity or satisfiability problems. The availability of efficient automated theorem provers is a crucial pre-requisite for automating various verification tasks, as well as for their cooperation with specialized decision procedures for selected theories, such as Presburger Arithmetic. In this paper, we investigate how automated provers based on a form of equational reasoning, called paramodulation, can be used in verification tools. More precisely, given a theory T axiomatizing some data structure, we devise a procedure to answer the following questions. Is the satisfiability problem of T decidable by paramodulation? Can a procedure based on paramodulation for T be efficiently combined with other specialized procedures by using the Nelson-Oppen schema? Finally, if paramodulation decides the satisfiability problem of two theories, does it decide satisfiability in their union? The procedure capable of answering all the questions above is based on Schematic Saturation, an inference system capable of over-approximating the inferences of paramodulation when solving satisfiability problems in a given theory T. Clause schemas derived by Schematic Saturation describe all clauses derived by paramodulation, so that the answers to the questions above are obtained by checking that only finitely many different clause schemas are derived, or that certain clause schemas are not derived.

3.
When rewriting is used to generate convergent and complete rewrite systems in order to answer the validity problem for some theories, all rewriting theories rely on the same set of notions, properties, and methods. Rewriting techniques have been used mainly to answer the validity problem of equational theories, that is, to compute congruences. Recently, however, they have been extended so as to be applied to other algebraic structures such as preorders and orders. In this paper, we investigate an abstract form of rewriting, following the paradigm of logical-system independence. To this end, we provide a few simple conditions (or axioms) under which rewriting (and hence the set of classical properties and methods) can be modeled, understood, studied, proven, and generalized. This enables us to extend rewriting techniques to algebraic structures other than congruences and preorders, such as congruences closed under monotonicity and modus ponens. We introduce convergent rewrite systems that enable one to describe deduction procedures for their corresponding theory, and we propose a Knuth-Bendix-style completion procedure in this abstract framework.
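The classical use of a convergent rewrite system as a decision procedure, which this abstract generalizes, can be sketched in a few lines: apply rules until none fires, and convergence guarantees the resulting normal form is unique. The sketch is illustrative (the name `normalize` and the string-based rules are our simplification; real systems rewrite first-order terms with pattern matching):

```python
def normalize(term, rules, max_steps=1000):
    """Rewrite `term` with (lhs -> rhs) string rules until no rule
    applies; for a convergent system the normal form is unique."""
    for _ in range(max_steps):
        for lhs, rhs in rules:
            if lhs in term:
                term = term.replace(lhs, rhs, 1)  # one leftmost step
                break
        else:
            return term  # no rule applies: normal form reached
    raise RuntimeError("rewriting did not terminate within max_steps")
```

With the single rule "ba" -> "ab" (a terminating, confluent system that sorts letters), "bab" normalizes to "abb" regardless of the order in which overlapping redexes are chosen.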

4.
Global viewing of heterogeneous data sources (Cited by: 10; self-citations: 0, by others: 10)
The problem of defining global views of heterogeneous data sources to support querying and cooperation activities is becoming more and more important due to the availability of multiple data sources within complex organizations and in global information systems. Global views are defined to provide a unified representation of the information in the different sources by analyzing the conceptual schemas associated with them and resolving possible semantic heterogeneity. We propose an affinity-based unification method for global view construction. In the method: (1) the concept of affinity is introduced to assess the level of semantic relationship between elements in different schemas by taking into account semantic heterogeneity; (2) schema elements are classified by affinity levels using clustering procedures so that their different representations can be analyzed for unification; (3) global views are constructed starting from selected clusters by unifying the representations of their elements. Experiences of applying the proposed unification method and the associated tool environment ARTEMIS to databases of the Italian Public Administration information systems are described.

5.
The Semantic Web is the next step of the current Web, where information will become more machine-understandable to support effective data discovery and integration. Hierarchical schemas, either in the form of tree-like structures (e.g., DTDs, XML schemas), or in the form of hierarchies on a category/subcategory basis (e.g., thematic hierarchies of portal catalogs), play an important role in this task. They are used to enrich semantically the available information. Up to now, hierarchical schemas have been treated rather as sets of individual elements, acting as semantic guides for browsing or querying data. Under that view, queries like “find the part of a portal catalog which is not present in another catalog” can be answered only in a procedural way, specifying which nodes to select and how to get them. For this reason, we argue that hierarchical schemas should be treated as full-fledged objects so as to allow for their manipulation. This work proposes models and operators to manipulate the structural information of hierarchies, considering them as first-class citizens. First, we explore the algebraic properties of trees representing hierarchies, and define a lattice algebraic structure on them. Then, turning this structure into a Boolean algebra, we present the operators S-union, S-intersection and S-difference to support structural manipulation of hierarchies. These operators have certain algebraic properties to provide clear semantics and assist the transformation, simplification and optimization of sequences of operations using laws similar to those of set theory. Also, we identify the conditions under which this framework is applicable. Finally, we demonstrate an application of our framework for manipulating hierarchical schemas on tree-like hierarchies encoded as RDF/S files.
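A hierarchy can be encoded as a prefix-closed set of root-to-node paths, under which structural union, intersection and difference reduce to set operations plus a closure step that restores missing ancestors. The sketch below only approximates the paper's S-operators, and all names (`closure`, `s_union`, etc.) are our illustrative choices:

```python
def closure(paths):
    """Add all ancestors so the result is a well-formed hierarchy
    (a prefix-closed set of root-to-node path tuples)."""
    out = set()
    for p in paths:
        for i in range(1, len(p) + 1):
            out.add(p[:i])
    return out

def s_union(h1, h2):
    return closure(h1 | h2)

def s_intersection(h1, h2):
    return closure(h1 & h2)

def s_difference(h1, h2):
    # nodes of h1 absent from h2; ancestors are restored so the
    # result remains a tree
    return closure(h1 - h2)
```

Under this encoding, "the part of a portal catalog not present in another catalog" becomes a single `s_difference` call rather than a hand-written node-selection procedure.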

6.
Standards for XML and Web services security (Cited by: 1; self-citations: 0, by others: 1)
Naedele, M. Computer, 2003, 36(4): 96-98
XML schemas convey the data syntax and semantics for various application domains, such as business-to-business transactions, medical records, and production status reports. However, these schemas seldom address security issues, which can lead to a worst-case scenario of systems and protocols with no security at all. At best, they confine security to transport-level mechanisms such as secure sockets layer (SSL). On the other hand, the omission of security provisions from domain schemas opens the way for generic security specifications based on XML document and grammar extensions. These specifications are orthogonal to domain schemas but integrate with them to support a variety of security objectives, such as confidentiality, integrity, and access control. In 2002, several specifications progressed toward providing a comprehensive standards framework for secure XML-based applications. The paper presents some of the most important specifications, the issues they address, and their dependencies.

7.
Heating, Ventilation and Air-Conditioning (HVAC) systems account for more than 15% of the total energy consumption in the US. In order to improve the energy efficiency of HVAC systems, researchers have developed hundreds of algorithms to automatically analyze their performance. However, the complex information, such as configurations of HVAC systems, layouts and materials of building elements and dynamic data from the control systems, required by these algorithms inhibits the process of deploying them in real-world facilities. To address this challenge, we envision a framework that automatically integrates the required information items and provides them to the performance analysis algorithms for HVAC systems. This paper presents an approach to identify and document the information requirements from the publications that describe these algorithms. We extend the Information Delivery Manual (IDM) approach so that the identified information requirements can be mapped to multiple information sources that use various formats and schemas. This paper presents the extensions to the IDM approach and the results of using it to identify information requirements for performance analysis algorithms of HVAC systems.

8.
If there exist efficient procedures (canonizers) for reducing terms of two first-order theories to canonical form, can one use them to construct such a procedure for terms of the disjoint union of the two theories? We prove this is possible whenever the original theories are convex. As an application, we prove that algorithms for solving equations in the two theories (solvers) cannot be combined in a similar fashion. These results are relevant to the widely used Shostak’s method for combining decision procedures for theories. They provide the first rigorous answers to the questions about the possibility of directly combining canonizers and solvers.

9.
10.
Robotic manipulation systems that operate in unstructured environments must be responsive to feedback from sensors that are disparate in both location and modality. This paper describes a distributed framework for assimilating the disparate feedback provided by force and vision sensors, including active vision sensors, for robotic manipulation systems. The main components of the expectation-based framework include object schemas and port-based agents. Object schemas represent the manipulation task internally in terms of geometric models with attached sensor mappings. Object schemas are dynamically updated by sensor feedback, and thus provide an ability to perform three-dimensional spatial reasoning during task execution. Because object schemas possess knowledge of sensor mappings, they are able to both select appropriate sensors and guide active sensors based on task characteristics. Port-based agents are the executors of reference inputs provided by object schemas and are defined in terms of encapsulated control strategies. Experimental results demonstrate the capabilities of the framework in two ways: the performance of manipulation tasks with active camera-lens systems, and the assimilation of force and vision sensory feedback.

11.
This article ties together previously scattered research on discontinuous grammars—logic grammars in which non-explicit sequences of symbols can be alluded to in the rules and sometimes repositioned by them. After an introduction, we define them formally, present their background, and provide intuitive insight into their use. Next, we examine several motivating arguments, from both formal and natural language processing viewpoints, and we discuss the static discontinuity family of these grammars, in which (a) the nonexplicit strings are not allowed to move and (b) linguistic constraints specifically designed to suit, in particular, Government and Binding theory can be defined modularly and statically in terms of node domination in parse trees and are enforced dynamically. Finally, we discuss implementation issues, related work, and extensions.

12.
XML and XML Schema are used in the geospatial domain for the definition of standards that enhance the interoperability between producers and consumers of spatial data. The size and complexity of these geospatial standards and their associated schemas have been growing with time reaching levels of complexity that make it difficult to build systems based on them in a timely and cost-effective manner. The problem of producing XML processing code based on large schemas has been traditionally solved by using XML data binding generators. Unfortunately, this solution is not always effective when code is generated for resource-constrained devices, such as mobile phones. Large and complex schemas often result in the production of code with a large size and a complicated structure that might not fit the device limitations. In this article we present the instance-based XML data binding approach to produce more compact application-specific XML processing code for geospatial applications targeted to mobile devices. The approach tries to reduce the size and complexity of the generated code by using information about how schemas are used by individual applications. Our experimental results suggest a significant simplification of XML Schema sets to the real needs of client applications accompanied by a substantial reduction of size of the generated code.

13.
14.
Since the mid-twentieth century, the concept of the Turing machine has dominated thought about effective procedures. This paper presents an alternative to Turing's analysis; it unifies, refines, and extends my earlier work on this topic. I show that Turing machines cannot live up to their billing as paragons of effective procedure; at best, they may be said to provide us with mere procedure schemas. I argue that the concept of an effective procedure crucially depends upon distinguishing procedures as definite courses of action(-types) from the particular courses of action(-tokens) that actually instantiate them and the causal processes and/or interpretations that ultimately make them effective. On my analysis, effectiveness is not just a matter of logical form; 'content' matters. The analysis I provide has the advantage of applying to ordinary, everyday procedures such as recipes and methods, as well as the more refined procedures of mathematics and computer science. It also has the virtue of making better sense of the physical possibilities for hypercomputation than the received view and its extensions, e.g. Turing's o-machines, accelerating machines.

15.
Formal approaches to the design of interactive systems rely on reasoning about properties of the system at a very high level of abstraction. Specifications to support such an approach typically provide little scope for reasoning about presentations and the representation of information in the presentation. In contrast, psychological theories such as distributed cognition place a strong emphasis on the role of representations, and their perception by the user, in the cognitive process. However, the post-hoc techniques for the observation and analysis of existing systems which have developed out of the theory do not help us in addressing such issues at the design stage. In this paper we show how a formalisation can be used to investigate the representational aspects of an interface. Our goal is to provide a framework to help identify and resolve potential problems with the representation of information, and to support understanding of representational issues in design. We present a model for linking properties at the abstract and perceptual levels, and illustrate its use in a case study of a flight deck instrument. There is a widespread consensus that proper tool support is a prerequisite for the adoption of formal techniques, but the use of such tools can have a profound effect on the process itself. In order to explore this issue, we apply a higher-order logic theorem prover to the analysis. Received May 1999 / Accepted in revised form July 2000

16.
Advances in e-learning technologies parallel a general increase in sophistication among computer users. The use of just one theory or model, such as the technology acceptance model, is no longer sufficient to study the intended use of e-learning systems. Rather, a combination of theories must be integrated in order to fully capture the complexity of e-learners, who are both system users and learners. The current research presents an integrated theoretical framework to study users’ acceptance of streaming media for e-learning. Three streams of research provide the basis for this integrated framework: the technology acceptance model, flow theory and media richness theory. Students enrolled in an online section of an information systems course used one of three different combinations of text, streamed audio and streamed video. Regression analysis was used to test the hypotheses in this field experiment. Perceived ease of use was a predictor of perceived usefulness; both the perceived usefulness and the attitude of the user were predictors of intention to use. Richer content-presentation types were positively correlated with higher concentration levels but showed mixed results when correlated with perceived usefulness. Results from this study have practical implications for those interested in integrating streaming media into e-learning.

17.
Conclusion: In this survey we have not considered any of the issues related to the design of problem-oriented machine proving systems and the use of heuristic methods. This does not mean that these issues are ignored by the scientific community. On the contrary, the combination of mathematical-logic rules with heuristics in machine proving procedures is a necessary condition for achieving useful results. An important example of such a combination is the mixed unification considered in subsec. 2.4. A logical conclusion of this approach should involve replacing the general unification algorithm with special-purpose equation solvers, assuming that this is allowed by the relevant theory. (Such an approach has already been sketched in logic programming [56].) There are examples of such heuristics, but their development is not always accompanied by a sufficiently complete mathematical argument, which unfortunately is characteristic of many propositions in the heuristic framework. Translated from Kibernetika, No. 3, pp. 34–41, May–June, 1986.
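The general unification algorithm that the survey proposes replacing with special-purpose equation solvers is, in its syntactic form, short enough to sketch. The term encoding below (tuples for compound terms, capitalized strings for variables, lowercase strings for constants) is our illustrative choice:

```python
def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    """Chase variable bindings to the current representative."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    """Occurs check: does variable v appear inside term t?"""
    t = walk(t, subst)
    if t == v:
        return True
    return isinstance(t, tuple) and any(occurs(v, a, subst) for a in t[1:])

def unify(t1, t2):
    """Return a most general unifier as a dict, or None on failure."""
    subst = {}
    stack = [(t1, t2)]
    while stack:
        a, b = stack.pop()
        a, b = walk(a, subst), walk(b, subst)
        if a == b:
            continue
        if is_var(a):
            if occurs(a, b, subst):
                return None           # occurs-check failure
            subst[a] = b
        elif is_var(b):
            stack.append((b, a))      # flip so the variable is first
        elif isinstance(a, tuple) and isinstance(b, tuple) \
                and a[0] == b[0] and len(a) == len(b):
            stack.extend(zip(a[1:], b[1:]))  # decompose f(s...) = f(t...)
        else:
            return None               # clashing function symbols
    return subst
```

For instance, unifying f(X, g(a)) with f(g(a), Y) binds both X and Y to g(a), while f(X) against g(X) fails on the symbol clash.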

18.
Context: It is critical to ensure the quality of a software system in the initial stages of development, and several approaches have been proposed to ensure that a conceptual schema correctly describes the user’s requirements. Objective: The main goal of this paper is to perform automated reasoning on UML schemas containing arbitrary constraints, derived roles, derived attributes and queries, all of which must be specified by OCL expressions. Method: The UML/OCL schema is encoded in a first-order logic formalisation, and an existing reasoning procedure is used to check whether the schema satisfies a set of desirable properties. Due to the undecidability of reasoning in highly expressive schemas, such as those considered here, we also provide a set of conditions that, if satisfied by the schema, ensure that all properties can be checked in a finite period of time. Results: This paper extends our previous work on reasoning on UML conceptual schemas with OCL constraints by considering derived attributes and roles that can participate in the definition of other constraints, queries and derivation rules. Queries formalised in OCL can also be validated to check their satisfiability and to detect possible equivalences between them. We also provide a set of conditions that ensure finite reasoning when they are satisfied by the schema under consideration. Conclusion: This approach improves upon previous work by allowing automated reasoning for more expressive UML/OCL conceptual schemas than those considered so far.

19.
We present a new method for mapping ontology schemas that address similar domains. The problem of ontology matching is crucial since we are witnessing a decentralized development and publication of ontological data. We formulate the problem of inferring a match between two ontologies as a maximum likelihood problem, and solve it using the technique of expectation-maximization (EM). Specifically, we adopt directed graphs as our model for ontology schemas and use a generalized version of EM to arrive at a map between the nodes of the graphs. We exploit the structural, lexical and instance similarity between the graphs, and differ from previous approaches in the way we utilize them to arrive at a possibly inexact match. Inexact matching is the process of finding a best possible match between the two graphs when exact matching is not possible or is computationally difficult. In order to scale the method to large ontologies, we identify the computational bottlenecks and adapt the generalized EM by using a memory-bounded partitioning scheme. We provide comparative experimental results in support of our method on two well-known ontology alignment benchmarks and discuss their implications.

20.
The Nelson-Oppen combination method combines decision procedures for first-order theories over disjoint signatures into a single decision procedure for the union theory. To be correct, the method requires that the component theories be stably infinite. This restriction makes the method inapplicable to many interesting theories such as, for instance, theories having only finite models. In this paper we provide a new combination method that can combine any theory that is not stably infinite with another theory, provided that the latter is what we call a shiny theory. Examples of shiny theories include the theory of equality, the theory of partial orders, and the theory of total orders. An interesting consequence of our results is that any decision procedure for the satisfiability of quantifier-free Σ-formulae in a Σ-theory T can always be extended to accept inputs over an arbitrary signature Ω ⊇ Σ.
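The classical propagation loop underlying the Nelson-Oppen method (not the shiny-theory extension this paper contributes) can be sketched generically: each component procedure reports satisfiability together with the equalities over shared variables it entails, and the loop iterates to a fixpoint. The solver interface and the two stub solvers below are our illustrative assumptions, not an implementation of any particular system:

```python
def nelson_oppen(solvers):
    """solvers: callables taking the set of equalities learned so far
    and returning (satisfiable, equalities entailed over shared vars)."""
    eqs = set()
    while True:
        new = set()
        for solve in solvers:
            sat, entailed = solve(frozenset(eqs))
            if not sat:
                return False          # one component is inconsistent
            new |= entailed
        if new <= eqs:
            return True               # fixpoint: nothing left to exchange
        eqs |= new

# Illustrative stubs: a difference-bounds component whose constraints
# x - y <= 0 and y - x <= 0 entail x = y, and an EUF-style component
# asserting f(x) != f(y), which becomes inconsistent once x = y arrives.
def bounds_solver(eqs):
    return True, {("x", "y")}

def euf_solver(eqs):
    return (("x", "y") not in eqs), set()
```

Running `nelson_oppen([bounds_solver, euf_solver])` reports unsatisfiability: the bounds component propagates x = y, which contradicts the disequality held by the EUF component.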

