Similar Documents
20 similar documents found.
1.
2.
We present an improvement of SATCHMORE, called A-SATCHMORE, by incorporating availability checking into relevancy. Because some atoms unavailable to further computation are also marked relevant, SATCHMORE suffers from a potential explosion of the search space. Addressing this weakness of SATCHMORE, we show that an atom does not need to be marked relevant unless it is available to further computation, and that no non-Horn clause needs to be selected unless all its consequent atoms are marked availably relevant, i.e., unless it is totally availably relevant. In this way, A-SATCHMORE is able to further restrict the use of non-Horn clauses (and therefore to reduce the search space) and makes the proof more goal-oriented. Our theorem prover, A-SATCHMORE, can be implemented simply in PROLOG based on SATCHMORE. We discuss how to incorporate availability checking into relevancy, describe our improvement and present the implementation. We also prove that our theorem prover is sound and complete, and provide examples to show the power of our availability approach.

This research is supported in part by the Japanese Ministry of Education and the Artificial Intelligence Research Promotion Foundation.

Lifeng He, Ph.D.: He received the B.E. degree from Northwest Institute of Light Industry, China, in 1982, and the M.S. and Ph.D. degrees in AI and computer science from Nagoya Institute of Technology, Japan, in 1994 and 1997, respectively. He currently works at the Institute of Open System in Nagoya, Japan. His research interests include automated reasoning, theorem proving, logic programming, knowledge bases, multi-agent cooperation and modal logic.

Yuyan Chao, M.S.: She received the B.E. degree from Northwest Institute of Light Industry, China, in 1984, and the M.S. degree from Nagoya University, Japan, in 1997. She is currently a doctoral candidate in the Department of Human Information, Nagoya University. Her research interests include image processing, graphic understanding, CAD and theorem proving.

Yuka Shimajiri, M.S.: She currently works as an Assistant Professor in the Department of Artificial Intelligence and Computer Science at the Nagoya Institute of Technology. She received her B.Eng. and M.Eng. from the Nagoya Institute of Technology in 1994 and 1996, respectively. Her current research interests include logic programming and automated deduction. She is a member of IPSJ and JSAI.

Hirohisa Seki, Ph.D.: He received the B.E., M.E. and Ph.D. degrees from the University of Tokyo in 1979, 1981 and 1991, respectively. He joined the Central Research Laboratory of Mitsubishi Electric Corporation in 1981. From 1985 to 1989, he was with the Institute for New Generation Computer Technology (ICOT). Since 1992, he has been an Associate Professor in the Department of AI and Computer Science at Nagoya Institute of Technology. His current research interests include logic programming, deductive databases and automated deduction. He is a member of ACM, IEEE, IPSJ and JSAI.

Hidenori Itoh, Ph.D.: He received the B.S. degree from Fukui University in 1969, and the M.S. and Ph.D. degrees from Nagoya University, Japan, in 1971 and 1974, respectively. From 1974 to 1985, he worked at Nippon Telephone and Telegraph Laboratories, developing operating systems. From 1985 to 1989, he was with the Institute for New Generation Computer Technology, developing knowledge base systems. Since 1989, he has been a professor at the Nagoya Institute of Technology. His current research interests include image processing, parallel computing, fuzzy logic and knowledge processing.

3.
Declarative languages for deductive and object-oriented databases require some high-level mechanism for specifying semantic control knowledge. This paper proposes user-supplied subsumption information as a paradigm for specifying desired, preferred or useful deductions at the meta level. For this purpose we augment logic programming with subsumption relations and succeed in extending the classical theorems for least models, fixpoints and bottom-up evaluation accordingly. Moreover, we provide a differential fixpoint operator for efficient query evaluation in deductive databases. This operator discards subsumed tuples on the fly. We also exemplify the ease of use of this programming methodology. In particular, we demonstrate how heuristic AI search procedures can be integrated into deductive databases in this way. This article is a revised and extended version of (Köstler et al., 1993).
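For illustration, a minimal Python sketch of the "discard subsumed tuples on the fly" idea described above (the toy shortest-path program and the cost-based subsumption relation are assumptions chosen for this sketch, not the paper's notation): a semi-naive, differential fixpoint loop drops every newly derived tuple that some better tuple already subsumes.

    def subsumes(t1, t2):
        # (x, y, c1) subsumes (x, y, c2) whenever c1 <= c2
        return t1[:2] == t2[:2] and t1[2] <= t2[2]

    def eval_paths(edges):
        total = set()
        delta = set(edges)                           # initial differential: the EDB facts
        while delta:
            total |= delta
            new = set()
            for (x, y, c1) in delta:                 # only the new tuples drive the next round
                for (y2, z, c2) in edges:
                    if y == y2:
                        new.add((x, z, c1 + c2))
            # keep only genuinely new tuples that no known (or new) tuple subsumes
            delta = {t for t in new - total
                     if not any(subsumes(s, t) and s != t for s in total | new)}
        return {t for t in total
                if not any(subsumes(s, t) and s != t for s in total)}

    edges = {("a", "b", 1), ("b", "c", 1), ("a", "c", 5)}
    print(sorted(eval_paths(edges)))   # ('a', 'c', 2) survives; ('a', 'c', 5) is subsumed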

4.
A translation from higher-order logic (on top of the simply typed λ-calculus) to propositional logic is presented, such that the resulting propositional formula is satisfiable iff the HOL formula has a model of a given finite size. A standard SAT solver can then be used to search for a satisfying assignment, and such an assignment can be transformed back into a model for the HOL formula. The algorithm has been implemented in the interactive theorem prover Isabelle/HOL, where it is used to automatically generate countermodels for non-theorems.
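A minimal sketch of the underlying finite-model-finding idea (brute-force enumeration stands in for the paper's SAT encoding; the formula and names below are illustrative assumptions, not the Isabelle/HOL implementation): fix a domain size, enumerate interpretations of the free function symbol, and return any interpretation satisfying the negated conjecture as a countermodel.

    from itertools import product

    # Brute-force finite model finder for one unary function symbol f over {0, ..., n-1}.
    # `formula(f, dom)` plays the role of the (lambda-free) formula to be satisfied.
    def find_model(formula, n):
        dom = range(n)
        for values in product(dom, repeat=n):        # every interpretation of f (n**n of them)
            table = dict(zip(dom, values))
            if formula(lambda x: table[x], dom):
                return table                          # model of the requested finite size
        return None                                   # no model of this size exists

    # Non-theorem "forall x. f(f(x)) = x": any model of its negation is a countermodel.
    negation = lambda f, dom: any(f(f(x)) != x for x in dom)
    print(find_model(negation, 3))                    # e.g. {0: 0, 1: 0, 2: 0}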

5.
Uncertainty in deductive databases and logic programming has been modeled using a variety of (numeric and non-numeric) formalisms in the past, including probabilistic, possibilistic, and fuzzy set-theoretic approaches, and many-valued logic programming. In this paper, we consider a hybrid approach to the modeling of uncertainty in deductive databases. Our model, called deductive IST (DIST), is based on an extension of the Information Source Tracking (IST) model, recently proposed for relational databases. The DIST model permits uncertainty to be modeled and manipulated in essentially qualitative terms, with an option to convert qualitative expressions of uncertainty into numeric form (e.g., probabilities). An uncertain deductive database is modeled as a Horn clause program in the DIST framework, where each fact and rule is annotated with an expression indicating the "sources" contributing to this information and the nature of their contribution. (1) We show that positive DIST programs enjoy the least model/least fixpoint semantics analogous to classical logic programs. (2) We show that top-down (e.g., SLD-resolution) and bottom-up (e.g., magic sets rewriting followed by semi-naive evaluation) query processing strategies developed for datalog can be easily extended to DIST programs. (3) Results and techniques for handling negation as failure in classical logic programming can be easily extended to DIST. As an illustration of this, we show how stratified negation can be so extended. We next study the problem of query optimization in such databases and establish the following results. (4) We formulate query containment in qualitative as well as quantitative terms. Intuitively, our qualitative sense of containment says that a query Q1 is contained in a query Q2 provided that for every input database D and every tuple t, t ∈ Q2(D) holds in every "situation" in which t ∈ Q1(D) is true. The quantitative notion of containment says that Q1 is contained in Q2 provided that, on every input, the certainty associated with any tuple computed by Q1 is no more than the certainty associated with the same tuple by Q2 on the given input. We also prove that the qualitative and quantitative notions of containment (both absolute and uniform versions) coincide. (5) We establish necessary and sufficient conditions for the qualitative containment of conjunctive queries. (6) We extend the well-known chase technique to develop a test for uniform containment and equivalence of positive DIST programs. (7) Finally, we prove that the complexity of testing containment of conjunctive DIST queries remains the same as in the classical case when the number of information sources is regarded as a constant (so it is NP-complete in the size of the queries). We also show that testing containment of conjunctive queries is co-NP-complete in the number of information sources.
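A toy sketch of the qualitative flavour of source tracking described above (the representation, a set of source-sets per fact, and all names are assumptions made for illustration; the actual DIST annotation language is richer): the sources used in a rule body are unioned, and alternative derivations of the same fact are kept side by side.

    # Each derived fact maps to a set of frozensets; one frozenset = the sources
    # that jointly support one derivation of that fact.
    facts = {
        ("flight", "a", "b"): {frozenset({"s1"})},
        ("flight", "b", "c"): {frozenset({"s2"})},
        ("flight", "a", "c"): {frozenset({"s3"})},
    }

    def derive_reachable(facts):
        # reachable(x, y) <- flight(x, y)
        reach = {}
        for (_, x, y), supports in facts.items():
            reach.setdefault((x, y), set()).update(supports)
        # reachable(x, z) <- reachable(x, y), flight(y, z): union the sources of the body
        changed = True
        while changed:
            changed = False
            for (x, y), s1 in list(reach.items()):
                for (_, y2, z), s2 in facts.items():
                    if y == y2:
                        combos = {a | b for a in s1 for b in s2}
                        known = reach.setdefault((x, z), set())
                        if not combos <= known:
                            known.update(combos)
                            changed = True
        return reach

    print(derive_reachable(facts)[("a", "c")])
    # e.g. {frozenset({'s3'}), frozenset({'s1', 's2'})}: true if s3 is right, or if s1 and s2 both are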

6.
This paper deals with the implementation of logic queries in which array structures are manipulated. Both top-down and bottom-up implementations of the presented logic language, called Datalog_A, are considered. SLD-resolution is generalized to realize Datalog_A top-down query answering. Further, a fixpoint-based evaluation of Datalog_A queries is introduced, which forms the basis for an efficient bottom-up implementation of queries, obtained by generalizing rewriting techniques such as the magic set method to the case of Datalog_A programs. Work partially supported by a European Union grant under the EC-US project DEUS EX MACHINA: nondeterminism for deductive databases and by a MURST grant (40% share) under the project Sistemi formali e strumenti per basi di dati evolute.

7.
The closely related research areas of semistructured data management and languages for querying the Web have recently attracted a lot of interest. We argue that languages supporting deduction and object-orientation (dood languages) are particularly suited in this context: object-orientation provides a flexible common data model for combining information from heterogeneous sources and for handling partial information. Techniques for navigating in object-oriented databases can be applied to semistructured databases as well, since the latter may be viewed as (very simple) instances of the former. Deductive rules provide a powerful framework for expressing complex queries in a high-level, declarative programming style.

We elaborate on the management of semistructured data and show how reachability queries involving general path expressions and the extraction of data paths in the presence of cyclic data can be handled. We then propose a formal model for querying the structure and contents of Web data and present its declarative semantics. A main advantage of our approach is that it brings together the above-mentioned issues in a unified, formal framework and, using the system, supports rapid prototyping and experimenting with all these features. Concrete examples illustrate the concise and elegant programming style supported by the system and substantiate the above-mentioned claims.
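As a small illustration of the reachability queries over cyclic data mentioned above (the tiny object graph and label set are invented for this sketch), a worklist traversal with a visited set evaluates a label*-style path expression without looping on cycles.

    # Semistructured data as labelled edges: object -> list of (label, object).
    graph = {
        "dept":  [("member", "alice"), ("member", "bob")],
        "alice": [("advisor", "bob")],
        "bob":   [("advisor", "alice")],     # cycle: alice <-> bob
    }

    def reach(start, labels):
        """Objects reachable from `start` via any sequence of edges whose label is in `labels`."""
        seen, todo = set(), [start]
        while todo:
            node = todo.pop()
            for label, target in graph.get(node, []):
                if label in labels and target not in seen:
                    seen.add(target)
                    todo.append(target)
        return seen

    print(reach("dept", {"member", "advisor"}))   # {'alice', 'bob'} despite the cycle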


8.
9.
We present a formalization of the first 100 pages of Winskel's textbook The Formal Semantics of Programming Languages in the theorem prover Isabelle/HOL: 2 operational, 2 denotational, 2 axiomatic semantics, a verification condition generator, and the necessary soundness, completeness and equivalence proofs, all for a simple imperative programming language. Received March 1997 / Accepted in revised form June 1998

10.
The RDF(S) data model has been proposed for encoding metadata about Web resources. As more and more Web resources are annotated using RDF(S), there is an urgent need for efficiently dealing with this large volume of data. In this paper, we present Atlas, a peer-to-peer system for storing, updating and querying RDF(S) data. The Atlas system has been built using the distributed hash table Bamboo. Atlas was developed in the context of project OntoGrid, where it was used as a distributed repository for RDF(S) metadata describing Grid services and resources. The development of Atlas continues in other projects in which our group currently participates. This paper gives an overview of the most recent version of Atlas and discusses a representative application.
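As background on how triples can be placed in a DHT-based RDF store of this kind (three-way indexing over a toy hash ring is common practice in such systems and is an assumption here, not a description of Atlas internals): each triple is stored under the hashes of its subject, its predicate and its object, so a lookup with any single constant can be routed to one node.

    import hashlib

    NODES = ["node-a", "node-b", "node-c", "node-d"]    # toy stand-in for a Bamboo-style ring

    def responsible_node(key):
        digest = hashlib.sha1(key.encode()).hexdigest()
        return NODES[int(digest, 16) % len(NODES)]      # toy placement instead of ring routing

    def index_triple(store, s, p, o):
        """Store the triple under subject, predicate and object keys (three-way indexing)."""
        for key in (s, p, o):
            store.setdefault(responsible_node(key), []).append((s, p, o))

    store = {}
    index_triple(store, "http://ex.org/svc1", "rdf:type", "grid:Service")
    for node, triples in sorted(store.items()):
        print(node, triples)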

11.
We propose a benchmark to compare theorem-proving systems on their ability to express proofs of compiler correctness. In contrast to the first POPLmark, we emphasize the connection of proofs to compiler implementations, and we point out that much can be done without binders or alpha-conversion. We propose specific criteria for evaluating the utility of mechanized metatheory systems; we have constructed solutions in both Coq and Twelf metatheory, and we draw conclusions about those two systems in particular.

12.
Efficient near-infrared (NIR) quantum cutting (QC) through cooperative downconversion in Y3Al5O12:RE3+,Yb3+ (RE = Tb, Tm, and Pr) nanophosphors has been demonstrated, which involves the conversion of a visible photon from RE3+ into NIR emission of Yb3+ with an optimal quantum efficiency approaching 200%. The authors have analyzed the measured luminescence spectra and decay lifetimes and propose a mechanism to rationalize the NIR QC effect. The results indicate the potential of developing RE3+-Yb3+ dual-ion-based nanophosphors for the downconversion of high-energy photons to multiple photons with an energy greater than the bandgap of silicon-based photonics display devices.
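For context, the quantum-efficiency estimate commonly used in the cooperative-downconversion literature (not quoted in this abstract, so the notation below is an assumption) counts two NIR photons for every visible photon transferred to a Yb3+ pair:

    % Donor-to-Yb3+ energy-transfer probability estimated from the donor decay curves,
    % and the resulting quantum efficiency (eta_RE: the donor's own visible efficiency):
    P_{tr} = 1 - \frac{\int I_{x\%\,\mathrm{Yb}}(t)\,dt}{\int I_{0\%\,\mathrm{Yb}}(t)\,dt},
    \qquad
    \eta_{QE} = \eta_{RE}\,\bigl(1 - P_{tr}\bigr) + 2\,P_{tr} \le 200\%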

13.
A gradient-based model predictive control (MPC) strategy was recently proposed to reduce the computational burden derived from the explicit inclusion of an economic real-time optimization (RTO). The main idea is to compute a suboptimal solution that is the convex combination of a feasible solution and a solution of an approximated (linearized) problem. The main benefits of this strategy are that convergence is still guaranteed and good economic performance is obtained, according to several simulation scenarios. The formulation, however, was developed only for the nominal case, which significantly reduces its applicability. In this work, an extension of the gradient-based MPC that explicitly accounts for disturbances is made. The resulting robust formulation considers a nominal prediction model, but with restricted (tightened) constraints in order to account for the effect of additive disturbances. The nominal economic performance is preserved and robust stability is ensured. An illustrative example shows the benefits of the proposal.
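A minimal numeric sketch of the convex-combination step described above (the toy objective, the grid line search and all names are assumptions for illustration, not the paper's formulation): the applied input is u = (1 - λ)·u_feasible + λ·u_linearized, with λ chosen so that the economic cost does not increase.

    import numpy as np

    def combine(u_feas, u_lin, econ_cost, n_grid=21):
        """Return the best convex combination of a known-feasible input sequence and the
        solution of the linearized problem (toy line search; constraint checks omitted)."""
        best_u, best_cost = u_feas, econ_cost(u_feas)
        for lam in np.linspace(0.0, 1.0, n_grid):
            u = (1.0 - lam) * u_feas + lam * u_lin
            cost = econ_cost(u)
            if cost < best_cost:
                best_u, best_cost = u, cost
        return best_u, best_cost

    econ = lambda u: float(np.sum((u - 2.0) ** 2))   # toy economic objective, optimum at u = 2
    u_feasible = np.zeros(3)                         # e.g. shifted solution from the previous step
    u_linearized = np.full(3, 2.5)                   # solution of the approximated (linearized) problem
    print(combine(u_feasible, u_linearized, econ))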

14.
The netCDF Operator (NCO) software facilitates manipulation and analysis of gridded geoscience data stored in the self-describing netCDF format. NCO is optimized to efficiently analyze large multi-dimensional data sets spanning many files. Researchers and data centers often use NCO to analyze and serve observed and modeled geoscience data, including satellite observations and weather, air quality, and climate forecasts. NCO's functionality includes shared memory threading, a message-passing interface, network transparency, and an interpreted language parser. NCO treats data files as a high-level data type whose contents may be simultaneously manipulated by a single command. Institutions and data portals often use NCO as middleware to hyperslab and aggregate data set requests, while scientific researchers use NCO to perform three general functions: arithmetic operations, data permutation and compression, and metadata editing. We describe NCO's design philosophy and primary features, illustrate techniques to solve common geoscience and environmental data analysis problems, and suggest ways to design gridded data sets that can ease their subsequent analysis.

15.
Our study empirically examined how Davis's Technology Acceptance Model (TAM) helped managers predict a user's intention to revisit a website and how this changed over time as a user gained experience of the Internet and the website. The user's experience of the website played a moderating role. For less experienced users, perceived ease of use was found to be a more important factor in deciding to revisit the website, whereas perceived usefulness had more effect on more experienced users. Thus, web designers can identify and remove web factors that hinder user acceptance and address underlying obstacles to post-adoption usage.

16.
This paper discusses a method for the analysis of dependable interactive systems using model checking, and its support by a tool designed to make it accessible to a broader community. The method and the tool are designed to be of value to system engineers, usability engineers and software engineers. The tool has been designed to help usability engineers by making those aspects of the analysis relevant to them explicit while concealing those aspects of modelling and model checking that are not relevant. The paper presents the results of a user evaluation of the effectiveness of aspects of the tool and how it supports the proposed method. The tool was constructed while both authors worked in the Human Computer Interaction Group, Department of Computer Science, University of York, UK.

17.
The exploitation of global Earth Observation data hinges increasingly on physically-based radiative transfer (RT) models. These models simulate the interactions of solar radiation within a given medium (e.g., clouds, plant canopies) and are used to generate look-up tables that are embedded in quantitative retrieval algorithms, such as those delivering the operational surface products for MODIS, MISR and MERIS. An assessment of the quality of canopy RT models thus appears essential if accurate and reliable information is to be derived from them. Until recently such an undertaking was a time-consuming and labour-intensive process that was made even more challenging by the general lack of absolute reference standards. Several years of benchmarking activities in the framework of the RAdiation transfer Model Intercomparison (RAMI) exercise have now led to the development of the RAMI On-line Model Checker (ROMC). The ROMC is a web-based tool allowing model developers and users to autonomously assess the performance of canopy RT models (http://romc.jrc.ec.europa.eu/). Access to the ROMC is free and enables users to obtain both statistical and graphical indications of the performance of their canopy RT model. In addition to providing an overall indication of the skill of a given model in correctly matching the reference data, the ROMC also allows for interactive comparisons/evaluations of different model versions/submissions of a given user. All ROMC graphs can be downloaded in PostScript format and come with a reference number for easy use in presentations and publications. It is hoped that the ROMC will prove useful for the RT modeling community as a whole, not only by providing a convenient means to evaluate models outside the triennial phases of RAMI but also by attracting participation in future RAMI activities.

18.
The classic readers-writers problem has been extensively studied. This holds to a lesser degree for the reentrant version, in which locking actions may be nested. Such nesting is useful when a library is created with various procedures each starting and ending with a lock operation; allowing nesting makes it possible for these procedures to call each other. We considered an existing, widely used industrial implementation of the reentrant readers-writers problem. Staying close to the original code, we modelled and analyzed it using a model checker, resulting in the detection of a serious error: a possible deadlock situation. The model was improved and checked satisfactorily for a fixed number of processes. To achieve a correctness result for an arbitrary number of processes, the model was converted to a specification that was proven with a theorem prover. Furthermore, we studied starvation. Using model checking we found a starvation problem. We have fixed the problem and checked the solution. Combining model checking with theorem proving appeared to be very effective in reducing the time of the verification process.
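For reference, a minimal Python sketch of a reentrant readers-writers lock (an illustrative stand-in, not the industrial implementation analyzed in the paper): per-thread counters make nested read and write acquisitions legal. Read-to-write upgrades are not supported here; upgrade scenarios are a classic source of deadlock in such locks.

    import threading
    from collections import defaultdict

    class ReentrantReadWriteLock:
        """Sketch of a reentrant readers-writers lock (no writer preference, no upgrades)."""
        def __init__(self):
            self._cond = threading.Condition()
            self._readers = defaultdict(int)   # thread id -> nesting depth of read locks
            self._writer = None                # thread id currently holding the write lock
            self._write_depth = 0

        def acquire_read(self):
            me = threading.get_ident()
            with self._cond:
                # The thread holding the write lock may also read (nested acquisition).
                while self._writer is not None and self._writer != me:
                    self._cond.wait()
                self._readers[me] += 1

        def release_read(self):
            me = threading.get_ident()
            with self._cond:
                self._readers[me] -= 1
                if self._readers[me] == 0:
                    del self._readers[me]
                self._cond.notify_all()

        def acquire_write(self):
            me = threading.get_ident()
            with self._cond:
                if self._writer == me:          # reentrant write acquisition
                    self._write_depth += 1
                    return
                # Wait until no writer and no readers other than ourselves remain.
                while self._writer is not None or any(t != me for t in self._readers):
                    self._cond.wait()
                self._writer = me
                self._write_depth = 1

        def release_write(self):
            with self._cond:
                self._write_depth -= 1
                if self._write_depth == 0:
                    self._writer = None
                    self._cond.notify_all()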

19.
It is commonly observed that the achieved performance of most model predictive controllers degrades over their lifetime. This effect can often be attributed to the fact that the dynamics of the controlled plant change as the plant ages, due to wear and tear, refurbishment and design changes of the plant, to name a few factors. These changes mean that re-identification is necessary to restore the desired performance of the controller. An extension of existing predictive controllers, capable of producing signals suitable for closed-loop re-identification, is presented in this article. The main contribution is an extensive experimental evaluation of the proposed controller for closed-loop re-identification on an industrial depropanizer distillation column, both in simulations and in real experiments. The plant experiments are conducted on the depropanizer during normal plant operations. In the simulations, as well as in the experiments, the updated models from closed-loop re-identification result in improved performance. The algorithm used combines regular model predictive control with ideas from applications-oriented input design and linear matrix inequality based convex relaxation techniques. Even though the experiments show promising results, some implementation problems arise and are discussed.

20.