Found 20 similar documents (search time: 15 ms)
1.
David M. Russinoff 《Journal of Automated Reasoning》1985,1(2):121-139
This paper describes the use of the Boyer-Moore theorem prover in mechanically generating a proof of Wilson's theorem: for any prime p, (p-1)! and p-1 are congruent modulo p. The input to the theorem prover consists of a sequence of three function definitions and forty-two propositions to be proved. The proofs generated by the system are based on a library of lemmas relating to list manipulation and number theory, including Fermat's theorem.
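The congruence can be checked computationally. The sketch below is ordinary Python, unrelated to the Boyer-Moore mechanical proof; it also exercises the converse direction, since the congruence holds exactly when p is prime:

```python
from math import factorial

def is_prime(n):
    """Trial-division primality test, adequate for small n."""
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

def wilson_holds(p):
    """Check the congruence (p-1)! ≡ p-1 (mod p)."""
    return factorial(p - 1) % p == p - 1

# Wilson's theorem is in fact an "if and only if": the congruence
# holds exactly when p is prime.
for n in range(2, 50):
    assert wilson_holds(n) == is_prime(n)
```

Note that p-1 ≡ -1 (mod p), so this is the usual statement (p-1)! ≡ -1 (mod p) in disguise.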
2.
Sakthi Subramanian 《Journal of Automated Reasoning》1995,15(1):69-93
This paper presents a method of representing planning domains in the Boyer-Moore logic so that we can prove mechanically whether a strategy solves a problem. Current approaches require explicit frame axioms and state constraints to be included as part of a domain specification and use a programming language for expressing strategies. These requirements make it difficult to specify domains and verify plans efficiently. Our method avoids explicit frame axioms, since domains are specified by programming an interpreter for sequences of actions in the Boyer-Moore logic. Strategies are represented as planners: Lisp programs that take an initial state and other arguments as input and return a sequence of actions that, when executed in the given initial state, will bring about a goal state. Mechanical verification of a strategy is accomplished by proving that the corresponding planner solves all instances of the given problem. We illustrate our approach by verifying strategies in some variations of the blocks world. The work described here was supported in part by NSF Grant MIP-9017499.
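The core idea, specifying a domain by programming an interpreter for action sequences and treating planners as programs, can be illustrated in Python. The tiny blocks-world encoding and all names below are illustrative, not the paper's Boyer-Moore formalization:

```python
# Toy blocks world: a state maps each block to what it rests on
# ('table' or another block). All names here are illustrative.

def clear(state, x):
    """A block is clear if nothing rests on it."""
    return all(on != x for on in state.values())

def apply_action(state, action):
    """Interpret one ('move', block, dest) action; return the new state,
    or the unchanged state if the action's preconditions fail."""
    _, block, dest = action
    if clear(state, block) and (dest == 'table' or clear(state, dest)):
        new_state = dict(state)
        new_state[block] = dest
        return new_state
    return state

def run_plan(state, plan):
    """Interpret a sequence of actions, mirroring the paper's idea of
    programming an interpreter instead of writing frame axioms."""
    for action in plan:
        state = apply_action(state, action)
    return state

def unstack_all(state):
    """Trivial planner: a program that maps an initial state to a plan
    putting every block on the table, topmost blocks first."""
    plan = []
    remaining = dict(state)
    while any(on != 'table' for on in remaining.values()):
        for b, on in remaining.items():
            if on != 'table' and clear(remaining, b):
                plan.append(('move', b, 'table'))
                remaining[b] = 'table'
                break
    return plan
```

Verifying the strategy then amounts to proving that `run_plan(s, unstack_all(s))` reaches the goal for every initial state `s`, which is exactly the kind of statement the paper discharges mechanically.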
3.
This paper presents HARP, a complete, tableau-based theorem prover for first-order logic, which is intended to be used both interactively and as an inference engine for Artificial Intelligence applications. Accordingly, HARP's construction is influenced by the design goals of naturalness, efficiency, usefulness in an Artificial Intelligence environment, and modifiability of the control structure by heuristics. To achieve these goals, HARP accepts the entire language of first-order logic, i.e. avoids conversion to any kind of normal form, and combines a proof condensation procedure with explicitly represented, declaratively formulated heuristics to construct and communicate its proofs in a format congenial to people. The proof condensation procedure makes proofs shorter and more readable by excising redundancies from proof trees. Domain-independent heuristics are formulated to capture efficient and human-like deduction strategies and to rapidly detect certain types of nontheorems. Domain-dependent heuristics can be used to implement specific control regimes, e.g. to efficiently support inheritance. HARP's architecture, and the concomitant ready extensibility of its control structure by declarative heuristic rules, make it easy to let extralogical information, e.g. semantic and world knowledge, guide the search for proofs and help eliminate irrelevant premisses.
4.
TABLEAUX: A general theorem prover for modal logics
Laurent Catach 《Journal of Automated Reasoning》1991,7(4):489-510
We present a general theorem-proving system for propositional modal logics, called TABLEAUX. The main feature of the system is its generality: it provides a unified environment for various kinds of modal operators and for a wide class of modal logics, including the usual temporal, epistemic, and dynamic logics. We survey the modal languages covered by TABLEAUX, which range from the basic modal language through a complex multimodal language including several families of operators with their transitive closure and converse. The decision procedure we use is basically a semantic tableaux method, but with slight modifications compared to the traditional one. We emphasize the advantages of such semantical proof methods for modal logics, since we believe that the model construction they provide represents perhaps the most attractive feature of these logics for possible applications in computer science and AI. The system has been implemented in Prolog, and appears to be of reasonable efficiency for most current examples. Experimental results are given in the paper, with two lists of test examples. A preliminary version of this paper appeared in the Proceedings of the International Computer Science Conference (ICSC'88), Hong Kong, December 19-21, 1988.
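The classical propositional core of the semantic tableaux method, without the modal rules TABLEAUX adds on top, fits in a few lines. The encoding below is an illustrative sketch, not the system's Prolog implementation:

```python
# Minimal semantic-tableau satisfiability check for classical
# propositional logic, the non-modal core of the tableau method.
# Formulas: atoms are strings; ('not', f), ('and', f, g), ('or', f, g).

def negate(f):
    return f[1] if isinstance(f, tuple) and f[0] == 'not' else ('not', f)

def satisfiable(branch):
    """Expand a branch (list of formulas); True if it stays open."""
    branch = list(branch)
    literals = set()
    while branch:
        f = branch.pop()
        if isinstance(f, str) or (f[0] == 'not' and isinstance(f[1], str)):
            if negate(f) in literals:        # contradiction closes the branch
                return False
            literals.add(f)
            continue
        if f[0] == 'and':                     # alpha rule: extend the branch
            branch += [f[1], f[2]]
        elif f[0] == 'or':                    # beta rule: split the branch
            return satisfiable(branch + [f[1]]) or satisfiable(branch + [f[2]])
        elif f[0] == 'not':
            g = f[1]
            if g[0] == 'not':                 # double negation
                branch.append(g[1])
            elif g[0] == 'and':               # de Morgan
                branch.append(('or', negate(g[1]), negate(g[2])))
            elif g[0] == 'or':
                branch += [negate(g[1]), negate(g[2])]
    return True

def valid(f):
    """A formula is valid iff its negation has no open tableau."""
    return not satisfiable([negate(f)])
```

A modal tableau additionally builds new possible worlds when expanding modal operators, which is the model construction the abstract highlights.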
5.
Mark E. Stickel 《Journal of Automated Reasoning》1988,4(4):353-380
A Prolog technology theorem prover (PTTP) is an extension of Prolog that is complete for the full first-order predicate calculus. It differs from Prolog in its use of unification with the occurs check for soundness, the model-elimination reduction rule that is added to Prolog inferences to make the inference system complete, and depth-first iterative-deepening search instead of unbounded depth-first search to make the search strategy complete. A Prolog technology theorem prover has been implemented by an extended Prolog-to-LISP compiler that supports these additional features. It is capable of proving theorems in the full first-order predicate calculus at a rate of thousands of inferences per second. This is a revised and expanded version of a paper presented at the 8th International Conference on Automated Deduction, Oxford, England, July 1986. This research was supported by the Defense Advanced Research Projects Agency under Contract N00039-84-K-0078 with the Naval Electronic Systems Command and by the National Science Foundation under Grant CCR-8611116. The views and conclusions contained herein are those of the author and should not be interpreted as necessarily representing the official policies, either expressed or implied, of the Defense Advanced Research Projects Agency, the National Science Foundation, or the United States government. Approved for public release. Distribution unlimited.
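The third ingredient, depth-first iterative deepening, is easy to illustrate in isolation. The graph below is a stand-in for an inference search space, not PTTP's actual inference mechanism:

```python
# Depth-first iterative deepening: repeat bounded depth-first search
# with increasing depth limits. This regains the completeness of
# breadth-first search while keeping depth-first memory use.

def depth_limited(graph, node, goal, limit, path=()):
    """Depth-first search that gives up below the depth limit."""
    if node == goal:
        return path + (node,)
    if limit == 0:
        return None
    for succ in graph.get(node, ()):
        found = depth_limited(graph, succ, goal, limit - 1, path + (node,))
        if found:
            return found
    return None

def iterative_deepening(graph, start, goal, max_depth=50):
    """Retry with limits 0, 1, 2, ... so the shallowest solution is
    found first, even though each pass is depth-first."""
    for limit in range(max_depth + 1):
        found = depth_limited(graph, start, goal, limit)
        if found:
            return found
    return None
```

Re-exploring shallow levels on every pass costs only a constant factor on trees with reasonable branching, which is why the scheme is practical at thousands of inferences per second.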
6.
The job-shop scheduling problem is known to be NP-complete. The version of interest in this paper concerns an assembly line designed to produce various cars, each of which requires a (possibly different) set of options. The combinatorics of the problem preclude seeking a maximal solution. Nevertheless, because of the underlying economic considerations, an approach that yields a good sequence of cars, given the specific required options for each, would be most valuable. In this paper, we focus on an environment for seeking, studying, and evaluating approaches for yielding good sequences. The environment we discuss relies on the automated reasoning program ITP. Automated reasoning programs of this type offer a wide variety of ways to reason, strategies for controlling the reasoning, and auxiliary procedures that contribute to the effective study of problems of this kind. We view the study presented in this paper as a prototype of what can be accomplished with the assistance of an automated reasoning program.
7.
Matt Kaufmann 《International Journal on Software Tools for Technology Transfer (STTT)》2000,3(1):13-19
The well-publicized Year 2000 problem provides interesting challenges for the remediation of noncompliant code. This paper describes some work done at EDS CIO Services, using the ACL2 theorem prover to formally verify the correctness of remediation rules. The rules take into account the possibility of "flag" (non-date) values of date variables. Many of them have been implemented in an in-house tool, COGEN 2000™, that corrects noncompliant date-related logic in COBOL programs.
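A typical remediation rule of the kind being verified is a sliding-window expansion of two-digit years that must pass flag values through untouched. The pivot and flag values below are assumptions for illustration, not the COGEN 2000 rules; note that a flag such as 99 collides with a legitimate year, which is exactly the subtlety that makes formal verification of such rules worthwhile:

```python
FLAG = 99      # illustrative "flag" value meaning "no date"; must survive
PIVOT = 50     # illustrative window pivot: 50-99 -> 19xx, 00-49 -> 20xx

def expand_year(yy):
    """Sliding-window remediation of a two-digit year, preserving the
    flag value rather than misreading it as the year 1999."""
    if yy == FLAG:
        return yy                      # non-date flag values pass through
    return 1900 + yy if yy >= PIVOT else 2000 + yy
```

A theorem prover like ACL2 can then check, once and for all, properties such as "every non-flag input maps into exactly one century window."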
8.
Recently Lee and Plaisted proposed a theorem-proving method, the hyperlinking proof procedure, to eliminate duplication of instances of clauses during the process of inference. A theorem prover, CLIN, which implements the procedure, was also constructed. In this implementation, redundant work on literal unification checking, partial unification checking, and duplicate instance checking is performed repeatedly, resulting in a large overhead when many rounds of hyperlinking are needed for an input problem. We propose a technique that maintains information across rounds in shared network structures, so that the redundant work in each hyperlinking round can be avoided. An empirical performance comparison has been done between CLIN and CLIN-net, the theorem prover with shared network structures, and some results are shown. Problems related to memory overhead and literal ordering are discussed. Supported by National Science Council under grants NSC 81-0408-E-110-509 and NSC-82-0408-E-110-045. A preliminary version of this paper appeared in Proceedings of International Conference on Computing and Information (Sudbury, Ontario, Canada, May 1993).
9.
We present in this article an application of automated theorem proving to a study of a theorem in combinatory logic. The theorem states: the strong fixed point property is satisfied in a system that contains the B and W combinators. The theorem can be stated in terms of Smullyan's forests of birds: a sage exists in a forest that contains a bluebird and a warbler. Proofs of the theorem construct sages from B and W. Prior to the study, one sage, discovered by Statman, was known to exist. During the study, with much assistance from two automated theorem-proving programs, four new sages were discovered. The study was conducted from a syntactic point of view because the authors know very little about combinatory logic. The uses of the automated theorem-proving programs are described in detail. This work was supported by the Applied Mathematical Sciences subprogram of the Office of Energy Research, U.S. Department of Energy, under Contract W-31-109-Eng-38.
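The two combinators reduce by B x y z → x (y z) and W x y → x y y; a sage Θ is a term satisfying Θx = x(Θx) for all x. A small term-rewriting sketch of the reduction rules (an illustrative encoding, not the provers' clause representation):

```python
# One-step reduction of combinator terms. Terms are atoms ('B', 'W',
# variables) or 2-tuples (f, x) meaning application.
# Rules:  B x y z -> x (y z)        W x y -> x y y

def reduce_step(t):
    """Apply the leftmost-outermost B or W redex once, if any.
    Returns (new_term, changed_flag)."""
    if not isinstance(t, tuple):
        return t, False
    # B x y z  is  (((B, x), y), z)
    if (isinstance(t[0], tuple) and isinstance(t[0][0], tuple)
            and t[0][0][0] == 'B'):
        x, y, z = t[0][0][1], t[0][1], t[1]
        return (x, (y, z)), True
    # W x y  is  ((W, x), y)
    if isinstance(t[0], tuple) and t[0][0] == 'W':
        x, y = t[0][1], t[1]
        return ((x, y), y), True
    # Otherwise look for a redex inside the operator, then the operand.
    f, changed = reduce_step(t[0])
    if changed:
        return (f, t[1]), True
    a, changed = reduce_step(t[1])
    return (f, a), changed
```

Searching for a sage amounts to finding a B/W term whose reduction sequence applied to x reproduces x applied to the term itself, which is the syntactic game the theorem provers played.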
10.
Furio Di Paola 《AI & Society》1988,2(2):121-131
Examples in the history of Automated Theorem Proving are given, in order to show that even a seemingly mechanical activity, such as deductive inference drawing, involves special cultural features and tacit knowledge. Mechanisation of reasoning is thus regarded as a complex undertaking in cultural pruning of human-oriented reasoning. Sociological counterparts of this passage from human- to machine-oriented reasoning are discussed, by focusing on problems of man-machine interaction in the area of computer-assisted proof processing.
11.
Hirosi Sugiyama 《Mathematics of Control, Signals, and Systems (MCSS)》1993,6(2):125-134
A band-limited signal is an entire function of exponential type and its frequency band is a compact set on the complex frequency plane. Hence, the sampling theorem concerns interpolation/extrapolation of an entire function. This paper presents a sampling theorem assuming a causality condition with equally spaced sampling points on the negative time axis, where the distance between two successive sampling points does not exceed the length of the Nyquist interval corresponding to the width of the frequency band in the direction of the real axis. The interpolation/extrapolation formula is constructed by using conformal mapping and Faber polynomials. An explicit formula is derived as an example, when the frequency band is on the real axis.
12.
Bernard van Gastel 《Science of Computer Programming》2011,76(2):82-99
The classic readers-writers problem has been extensively studied. This holds to a lesser degree for the reentrant version, in which locking actions may be nested. Such nesting is useful when a library is created with various procedures each starting and ending with a lock operation; allowing nesting makes it possible for these procedures to call each other. We considered an existing, widely used industrial implementation of the reentrant readers-writers problem. Staying close to the original code, we modelled and analyzed it using a model checker, resulting in the detection of a serious error: a possible deadlock situation. The model was improved and checked satisfactorily for a fixed number of processes. To achieve a correctness result for an arbitrary number of processes, the model was converted to a specification that was proven with a theorem prover. Furthermore, we studied starvation: using model checking we found a starvation problem, fixed it, and checked the solution. Combining model checking with theorem proving appeared to be very effective in reducing the time of the verification process.
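The nested locking at issue can be sketched as a small reentrant readers-writers lock. This is an illustrative Python sketch, not the verified industrial implementation studied in the paper, and it models no fairness or writer-preference policy, which is precisely where the starvation issues the paper found tend to hide:

```python
import threading

class ReentrantRWLock:
    """Sketch of a reentrant readers-writers lock: a thread may nest
    read (or write) acquisitions and must release the same number of
    times. Illustrative only; no fairness policy is modeled."""

    def __init__(self):
        self._cond = threading.Condition()
        self._readers = {}      # thread id -> nesting depth of read locks
        self._writer = None     # thread id of current writer, or None
        self._write_depth = 0

    def acquire_read(self):
        me = threading.get_ident()
        with self._cond:
            # A thread already holding the write lock may also read.
            while self._writer is not None and self._writer != me:
                self._cond.wait()
            self._readers[me] = self._readers.get(me, 0) + 1

    def release_read(self):
        me = threading.get_ident()
        with self._cond:
            self._readers[me] -= 1
            if self._readers[me] == 0:
                del self._readers[me]
            self._cond.notify_all()

    def acquire_write(self):
        me = threading.get_ident()
        with self._cond:
            # Wait until no other writer and no *other* reader is active;
            # re-acquisition by the same writer just deepens the nesting.
            while ((self._writer is not None and self._writer != me)
                   or any(t != me for t in self._readers)):
                self._cond.wait()
            self._writer = me
            self._write_depth += 1

    def release_write(self):
        with self._cond:
            self._write_depth -= 1
            if self._write_depth == 0:
                self._writer = None
                self._cond.notify_all()
```

Even this toy version shows why model checking helps: interleavings such as two readers both trying to upgrade to a write lock can deadlock, and such scenarios are hard to spot by inspection.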
13.
Rescuing interfaces: A multi-year study of human-robot interaction at the AAAI Robot Rescue Competition
This paper presents results from three years of studying human-robot interaction in the context of the AAAI Robot Rescue Competition. We discuss our study methodology, the competitors’ systems and performance, and suggest ways to improve human-robot interaction in urban search and rescue (USAR) as well as other remote robot operations.
Jill L. Drury
14.
In this paper an algorithm, M2RT, for predicting the mean message response time (MMRT) of a communication channel is proposed, with emphasis on Internet applications. The M2RT development went through four major phases:
- (a) Formulating the theoretical foundation with the central limit theorem.
- (b) Determining the parameters of the algorithm by simulations.
- (c) Performing off-line verification tests for the algorithm with local Internet/Intranet nodes and well-known middleware (MPI and CORBA).
- (d) Performing on-line validation of the M2RT over the Internet involving both local and international sites.
The resulting predictor must satisfy the following requirements:
- (a) It must perform efficiently for different conditions of workload, geography, and traffic.
- (b) It must perform consistently with the same software entities (e.g., MPI) for similar operations.
- (c) It must be able to exist both as an off-line tool and an on-line program object (to be invoked on a real-time basis).
- (d) Its computation time should be sufficiently small so that the result actually reflects the current physical conditions.
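Phase (a) rests on the central limit theorem. As an illustration of that statistical step only (not the M2RT algorithm itself, whose parameters the paper determines by simulation), the CLT justifies attaching a normal-approximation confidence interval to a sample of measured response times:

```python
from math import sqrt

def mmrt_estimate(samples, z=1.96):
    """Estimate the mean message response time from observed samples,
    with an approximate 95% confidence interval: by the central limit
    theorem the sample mean is asymptotically normal with standard
    error sqrt(var / n)."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)  # sample variance
    half_width = z * sqrt(var / n)
    return mean, (mean - half_width, mean + half_width)
```

Requirement (d) above is why such a closed-form estimate is attractive: it is cheap enough to compute on-line so the prediction still reflects current channel conditions.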
15.
Interaction with in-vehicle systems (car phones, traffic information, route guidance, etc.) becomes a very difficult task since the control devices are often reduced to small switches and push-buttons. To solve the problem, a new input interface is proposed, based on character recognition. The paper describes in detail how a simple neural network can be applied, down to the level of an industrial realization, to provide a reliable user-machine interface. The industrial application is that of character recognition where characters are drawn with the finger on a small touchpad. Compared with the nearest neighbour method, the neural network solution has a slightly better recognition rate, is faster, and requires less memory space. The design of the recognition system is given and results of an experiment made on a driving simulator are presented.
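For reference, the nearest-neighbour baseline mentioned above amounts to the following. Feature extraction from touchpad strokes is omitted, and the vectors and labels are illustrative stand-ins (e.g., resampled pen-trajectory coordinates):

```python
# Minimal nearest-neighbour classifier over fixed-length feature
# vectors: the baseline the neural network was compared against.

def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((u - v) ** 2 for u, v in zip(a, b))

def nearest_neighbour(train, x):
    """train: list of (feature_vector, label) pairs; returns the label
    of the training example closest to x."""
    vector, label = min(train, key=lambda pair: dist2(pair[0], x))
    return label
```

The memory-space comparison in the abstract follows directly: nearest neighbour must store every training example, whereas a trained network stores only its weights.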
16.
E. Atanassov A. Karaivanova D. Vasileska 《Mathematics and computers in simulation》2010,81(3):515-521
The femtosecond dynamics of highly non-equilibrium, confined carriers is analyzed within a Monte Carlo approach. The physical process considered corresponds to a locally excited or injected into a semiconductor nanowire distribution of heated carriers, which evolve under the action of an applied electric field. The carriers are cooled down by dissipation processes caused by phonons. The process is described by a quantum-kinetic equation which generalizes the classical Boltzmann equation with respect to two classical assumptions, namely for temporal and spatial locality of the carrier-phonon interaction. We investigate the effect of the field on the electron-phonon interaction, the intra-collisional field effect (ICFE). A Monte Carlo method for simulation of the considered process has been utilized. Simulation results for carrier evolution in a GaAs nanowire are obtained and analyzed for phenomena related to the ICFE.
17.
Molecular dynamics simulations have been performed to understand the adsorption behavior of acetone (AC) solvent at three surfaces of the 1,3,5,7-tetranitro-1,3,5,7-tetraazacyclooctane (HMX) crystal, i.e. the (011), (110), and (020) faces. The simulation results show that the structural features and electrostatic potentials of the crystal faces are determined by the HMX molecular packing, inducing distinct mass density distribution, dipole orientation, and diffusion of solvent molecules in the interfacial regions. The solvent adsorption is mainly governed by van der Waals forces, and the crystal-solvent interaction energies among the three systems are ranked as (020) ≈ (110) > (011). The adsorption sites for solvent incorporation at the crystal surface were found and visualized with the aid of occupancy analysis. A uniform arrangement of adsorption sites is observed at the rough (020) surface as a result of an ordered adsorption motif.
18.
Nabendu Pal Jyh-Jiuan Lin Somesh Kumar 《Computational statistics & data analysis》2007,51(12):5673-5681
For estimating the common mean of two normal populations with unknown and possibly unequal variances, the well-known Graybill-Deal estimator (GDE) has been a motivating factor for research over the last five decades. Surprisingly, the literature does not have much to show when it comes to the maximum likelihood estimator (MLE) and its properties compared to those of the GDE. The purpose of this note is to shed some light on the structure of the MLE and compare it with the GDE. While studying the asymptotic variance of the GDE, we provide an upgraded set of bounds for its variance. A massive simulation study has been carried out with a very high level of accuracy to compare the variances of the two estimators, the results of which are quite interesting.
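The GDE itself is simple to state: it is a convex combination of the two sample means, each weighted by the inverse of its estimated variance. A sketch of the textbook form of the estimator:

```python
from statistics import mean, variance

def graybill_deal(sample1, sample2):
    """Graybill-Deal estimator of the common mean of two normal
    samples: weight each sample mean by n_i / s_i^2, the inverse of
    that mean's estimated variance s_i^2 / n_i."""
    m1, m2 = mean(sample1), mean(sample2)
    w1 = len(sample1) / variance(sample1)
    w2 = len(sample2) / variance(sample2)
    return (w1 * m1 + w2 * m2) / (w1 + w2)
```

Because the weights themselves are random (they depend on the sample variances), the exact variance of the GDE is hard to pin down, which is what motivates the bounds discussed in the note.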
19.
A novel real-time scheduler was developed to implement an interactive user interface for an existing state-of-the-art, hand-held blood analyzer. A software-timer-based scheduler was designed and implemented, and guaranteed schedulability analysis was performed to ensure that all hard execution deadlines could be met at run time. An execution bandwidth preservation mechanism that increases the robustness and predictability of the scheduler is presented. The paper is a case study that describes the design and development process from a point of view that emphasizes the importance of the systems integration issues that were encountered.
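Guaranteed schedulability analysis is commonly done with a utilization-bound test. The Liu-Layland bound for rate-monotonic scheduling below is a standard example of such a sufficient test; the paper does not say this is the exact analysis its authors used:

```python
def rm_schedulable(tasks):
    """Liu-Layland sufficient schedulability test for rate-monotonic
    scheduling: n periodic tasks, each a (worst-case execution time,
    period) pair, meet all deadlines if total utilization does not
    exceed n * (2**(1/n) - 1)."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    return utilization <= n * (2 ** (1 / n) - 1)
```

The test is conservative: a task set failing the bound may still be schedulable, but passing it guarantees every hard deadline is met, which is the property a medical device needs.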
20.
Edgar Meij Marc Bron Laura Hollink Bouke Huurnink Maarten de Rijke 《Journal of Web Semantics》2011,9(4):418-433
We introduce the task of mapping search engine queries to DBpedia, a major linking hub in the Linking Open Data cloud. We propose and compare various methods for addressing this task, using a mixture of information retrieval and machine learning techniques. Specifically, we present a supervised machine learning-based method to determine which concepts are intended by a user issuing a query. The concepts are obtained from an ontology and may be used to provide contextual information, related concepts, or navigational suggestions to the user submitting the query. Our approach first ranks candidate concepts using a language modeling for information retrieval framework. We then extract query, concept, and search-history feature vectors for these concepts. Using manual annotations we inform a machine learning algorithm that learns how to select concepts from the candidates given an input query. Simply performing a lexical match between the queries and concepts is found to perform poorly and so does using retrieval alone, i.e., omitting the concept selection stage. Our proposed method significantly improves upon these baselines and we find that support vector machines are able to achieve the best performance out of the machine learning algorithms evaluated.
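The first-stage ranking ("a language modeling for information retrieval framework") can be sketched as query-likelihood scoring of candidate concepts. The smoothing choice (Jelinek-Mercer) and the toy concept texts below are assumptions for illustration; the paper's actual feature extraction and SVM selection stage are not shown:

```python
# Query-likelihood ranking of candidate concepts: score each concept's
# text by log P(query | concept language model), smoothed against a
# background collection model (Jelinek-Mercer smoothing, assumed here).

from math import log

def rank_concepts(query, concepts, lam=0.9):
    """concepts: dict of concept name -> descriptive text.
    Returns concept names sorted best-first for the query."""
    docs = {name: text.lower().split() for name, text in concepts.items()}
    collection = [w for words in docs.values() for w in words]

    def score(words):
        s = 0.0
        for q in query.lower().split():
            p_doc = words.count(q) / len(words)
            p_bg = collection.count(q) / len(collection)
            s += log(lam * p_doc + (1 - lam) * p_bg + 1e-12)
        return s

    return sorted(docs, key=lambda name: score(docs[name]), reverse=True)
```

Smoothing is what separates this from the lexical-match baseline the abstract criticizes: a concept is not discarded outright for missing one query term, only penalized.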