Similar documents
20 similar documents found (search time: 828 ms)
1.
Most verification approaches assume a mathematical formalism in which functions are total, even though partial functions occur naturally in many applications. Furthermore, although there have been various proposals for logics of partial functions, there is no consensus on which is “the right” logic to use for verification applications. In this paper, we propose using a three-valued Kleene logic, where partial functions return the “undefined” value when applied outside of their domains. The particular semantics are chosen according to the principle of least surprise to the user; if there is disagreement among the various approaches on what the value of the formula should be, its evaluation is undefined. We show that the problem of checking validity in the three-valued logic can be reduced to checking validity in a standard two-valued logic, and describe how this approach has been successfully implemented in our tool, CVC Lite.
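The reduction to two-valued logic is not reproduced here, but the three-valued evaluation itself is easy to sketch. Below is a toy Python illustration (not CVC Lite's implementation), with `None` playing the role of the “undefined” value:

```python
def k_not(a):
    # Negation of "undefined" is still undefined.
    return None if a is None else (not a)

def k_and(a, b):
    # Kleene's strong conjunction: False dominates, so the result is
    # defined whenever one operand already forces it.
    if a is False or b is False:
        return False
    if a is True and b is True:
        return True
    return None

def k_or(a, b):
    if a is True or b is True:
        return True
    if a is False and b is False:
        return False
    return None

def k_eq(x, y):
    # Equality is undefined as soon as either argument is undefined.
    return None if x is None or y is None else x == y

def safe_div(x, y):
    # A partial function: undefined outside its domain (y == 0).
    return None if y == 0 else x / y

# "1/0 = 1/0" evaluates to undefined rather than trivially true...
print(k_eq(safe_div(1, 0), safe_div(1, 0)))   # None
# ...but a conjunction with False is False even with an undefined conjunct.
print(k_and(False, k_eq(safe_div(1, 0), 2)))  # False
```

This matches the “least surprise” principle: where classical treatments of partiality disagree on a formula's value, the sketch returns `None`.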

2.
Formal systems for cryptographic protocol analysis typically model cryptosystems in terms of free algebras. Modeling the behavior of a cryptosystem in terms of rewrite rules is more expressive, however, and there are some attacks that can only be discovered when rewrite rules are used. But free algebras are more efficient, and appear to be sound for “most” protocols. In [J. Millen, “On the freedom of decryption”, Information Processing Letters 86 (6) (June 2003) 329–333] Millen formalizes this intuition for shared key cryptography and provides conditions under which it holds; that is, conditions under which security for a free algebra version of the protocol implies security of the version using rewrite rules. Moreover, these conditions fit well with accepted best practice for protocol design. However, he left public key cryptography as an open problem. In this paper, we show how Millen's approach can be extended to public key cryptography, giving conditions under which security for the free algebra model implies security for the rewrite rule model. As in the case for shared key cryptography, our conditions correspond to standard best practice for protocol design.

3.
In this paper a new method for automatic personalized recommendation, based on the behavior of a single user in accordance with all other users of a web-based information system, is introduced. The proposal applies a modified version of the well-known Apriori data mining algorithm to the log files of a web site (primarily, an e-commerce or an e-learning site) to help users select the best user-tailored links. The paper mainly analyzes the process of discovering association rules in this kind of large repository and of transforming them into user-adapted recommendations by the two-step modified Apriori technique, which may be described as follows. A first pass of the modified Apriori algorithm verifies the existence of association rules in order to obtain a new repository of transactions that reflect the observed rules. A second pass of the proposed Apriori mechanism aims at discovering the rules that are really inter-associated. This way the behavior of a user is determined not by “what he does” but by “how he does it”. Furthermore, an efficient implementation has been performed to obtain results in real time. As soon as a user closes his session in the web system, all data are recalculated to take the recent interaction into account for the next recommendations. Early results have shown that it is possible to run this model on web sites of medium size.
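The two-pass idea can be sketched with a minimal (unmodified) frequent-itemset Apriori; the session data and support thresholds below are hypothetical, and the paper's actual modifications are only approximated by the second pass over a derived repository:

```python
def apriori(transactions, min_support):
    """Return frequent itemsets as a dict: frozenset -> absolute support count."""
    candidates = {frozenset([i]) for t in transactions for i in t}
    frequent = {}
    while candidates:
        counts = {c: sum(1 for t in transactions if c <= t) for c in candidates}
        survivors = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(survivors)
        # Join step: combine frequent k-itemsets into (k+1)-candidates.
        prev = list(survivors)
        candidates = {a | b for a in prev for b in prev
                      if len(a | b) == len(a) + 1}
    return frequent

# Hypothetical browsing sessions (sets of visited pages):
sessions = [{'a', 'b', 'c'}, {'a', 'b'}, {'a', 'c'}, {'b', 'c'}, {'a', 'b', 'c'}]
freq = apriori(sessions, min_support=3)

# Second pass ("inter-association"): re-encode each session by the frequent
# multi-item patterns it exhibits, then mine that derived repository again.
pattern_ids = {p: f'P{i}' for i, p in enumerate(sorted(freq, key=sorted))
               if len(p) > 1}
derived = [{pid for p, pid in pattern_ids.items() if p <= s} for s in sessions]
freq2 = apriori(derived, min_support=2)
```

The first pass finds which patterns occur; the second pass finds which patterns co-occur, which is roughly the shift from “what he does” to “how he does it” described above.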

4.
The complete structure of an AGV control system is described in the first part of this paper. The AGV control system is hierarchical and consists of five levels. The structure of one level does not depend on the structures of the other levels. This means that the control system depends on the design of the AGV at the lowest level only, at the level of actuator servo-control and its coordination in realizing AGV primitive functions.

The second part of the paper describes rules applicable to AGV steering. The structure of these rules depends on two groups of factors. The first group depends on the information fed to the AGV processor by the position sensor. The second group of factors represents the aims and conditions of AGV steering, such as positioning accuracy, positioning time, allowed room for maneuver, the shape of the given trajectory, etc. The AGV steering rules contain sequences of primitive functions. These primitive functions are of such types as “turn left”, “straighten” (correct), “go straight on”, etc. Trajectory, as one of the basic factors, is defined at the level of controlling an elementary movement. The term “to control an elementary movement” means to select a transport road through the transport network and to code it using “elementary movements” such as “go straight” (relating to a road section), “turn left” (relating to turning at a crossroad), etc.

The third part of the paper presents the results of the AGV steering simulation. An exact kinematic AGV model used for simulating control models is also presented.

5.
This paper deals with a human-assisted knowledge extraction method to extract “if…then…” rules from a small set of machining data. The presented method utilizes both probabilistic reasoning and fuzzy logical reasoning to benefit from the machining data and from the judgment and preference of a machinist. Using the extracted rules, one can determine the values of operational parameters (feed, cutting velocity, etc.) to ensure the desired machining performance (e.g., keep surface roughness within a stipulated range such as “moderate”). The usefulness of the method is demonstrated by applying it in a real-life machining knowledge extraction situation and comparing it with an inductive-learning-based knowledge extraction method (ID3). As the concept of manufacturing automation shifts toward “how to support humans by computers”, the presented method provides some valuable hints to the developers of futuristic computer integrated manufacturing systems.
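To make the fuzzy side of such “if…then…” rules concrete, here is a minimal sketch of evaluating one rule with triangular membership functions; the linguistic terms, variable units, and breakpoints are invented for illustration and are not the paper's:

```python
def tri(x, a, b, c):
    """Triangular membership function: rises from a, peaks at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def feed_low(f):
    # Hypothetical term for feed in mm/rev.
    return tri(f, 0.0, 0.1, 0.2)

def speed_moderate(v):
    # Hypothetical term for cutting velocity in m/min.
    return tri(v, 100, 150, 200)

def rule_strength(feed, speed):
    # "if feed is low and velocity is moderate then roughness is moderate":
    # the rule fires with the fuzzy AND (min) of its antecedent grades.
    return min(feed_low(feed), speed_moderate(speed))

print(rule_strength(0.10, 150))  # 1.0 -- both antecedents fully satisfied
print(rule_strength(0.15, 150))  # partial firing strength (about 0.5)
```

A machinist's judgment would enter in choosing the breakpoints and in weighting rules, which is the human-assisted part the abstract describes.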

6.
Clinical decision support systems (CDSS) and their logic syntaxes include the coding of notifications (e.g., the Arden Syntax). This paper describes the rationale for segregating policies, user preferences and clinical monitoring rules into “advanced notification” and “clinical” components, which together form a novel and complex CDSS. Notification rules and hospital policies are abstracted from care-provider roles and alerting mechanisms, respectively. User-defined preferences determine which devices are to be used for receiving notifications. Our design differs from previous notification systems in that it integrates a versatile notification platform supporting a wide range of mobile devices with an XML/HL-7 compliant communication protocol.

7.
This paper shows how an innovative “communicative” technique in teaching foreign languages—Conversation Rebuilding (CR)—readily lends itself to implementation in an Intelligent Tutoring System (ITS). Classroom language teachers using CR get students to formulate acceptable utterances in a foreign idiom by starting from rough approximations (using words the students know) and gradually zeroing in on the utterance which a native speaker of that idiom might produce in a similar setting. The ITS presented here helps students do the “zeroing in” optimally. It lets them express themselves temporarily in an “interlingua” (i.e., in their own kind of French or English or whatever they are studying), as long as they make something of their communicative intent clear, that is, as long as the System can find a semantic starting point on which to build. The ITS then prods the students to express themselves more intelligibly, starting from the “key” elements (determined by a heuristic based on how expert classroom teachers proceed) and taking into consideration the students' past successful or unsuccessful attempts at communication. To simplify system design and programming, however, conversations are “constrained”: students playact characters in set dialogs and aim at coming up with what the characters actually say (not what they could possibly say). While most Intelligent Computer Assisted Language Learning (ICALL) focuses the attention of students on norms to acquire, the ICALL implementation of CR presented in this paper focuses the attention of students on saying something—indeed, almost anything—to keep the conversation going and get some kind of meaning across to the other party. It sees successful language acquisition primarily as the association of forms with intent, not simply as the conditioning of appropriate reflexes or the elaboration/recall of conceptualized rules (which are the by-products of successful communication). 
Thus, in espousing this hard-line communicative approach, the present paper makes a first, non-trivial point: ICALL researchers might usefully begin by investigating what the more able teachers are doing in the classroom, rather than by building elaborate computer simulations of outdated practices, as happens all too often. The paper then goes on to describe the architecture of a prototype ITS based on CR—one that the authors have actually implemented and tested—for the acquisition of English as a foreign language. A sample learning session is transcribed to illustrate the man-machine interaction. Concluding remarks show how the present-day limits of ICALL (and Artificial Intelligence in general) can be partially circumvented by the strategy implemented in the program, i.e. by making the students feel they are creatively piloting an interaction rather than being tested by an unimaginative machine.

8.
Simple analytic rules for model reduction and PID controller tuning (total citations: 6; self-citations: 0; citations by others: 0)
The aim of this paper is to present analytic rules for PID controller tuning that are simple and still result in good closed-loop behavior. The starting point has been the IMC-PID tuning rules that have achieved widespread industrial acceptance. The rule for the integral term has been modified to improve disturbance rejection for integrating processes. Furthermore, rather than deriving separate rules for each transfer function model, there is just a single tuning rule for a first-order or second-order time delay model. Simple analytic rules for model reduction are presented to obtain a model in this form, including the “half rule” for obtaining the effective time delay.
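The half rule and the resulting PI tuning rule can be sketched in a few lines. This simplified version handles denominator time constants only (no numerator zeros or sampling terms), and uses the common default trade-off τc = θ:

```python
def half_rule(gain, taus, theta0=0.0):
    """Skogestad's "half rule" (time constants only): keep the largest
    constant, add half of the second-largest to it, and lump the other
    half plus all smaller constants into the effective delay."""
    taus = sorted(taus, reverse=True)
    tau1 = taus[0] + (taus[1] / 2 if len(taus) > 1 else 0.0)
    theta = theta0 + (taus[1] / 2 if len(taus) > 1 else 0.0) + sum(taus[2:])
    return gain, tau1, theta

def simc_pi(gain, tau1, theta, tau_c=None):
    """SIMC PI settings for a first-order-plus-delay model;
    tau_c = theta is the default closed-loop/robustness trade-off."""
    tau_c = theta if tau_c is None else tau_c
    Kc = tau1 / (gain * (tau_c + theta))
    tau_I = min(tau1, 4 * (tau_c + theta))   # modified integral-time rule
    return Kc, tau_I

# Reduce g(s) = 1 / ((8s+1)(2s+1)(s+1)) and tune a PI controller:
k, tau1, theta = half_rule(1.0, [8.0, 2.0, 1.0])   # -> tau1 = 9.0, theta = 2.0
Kc, tau_I = simc_pi(k, tau1, theta)                # -> Kc = 2.25, tau_I = 9.0
```

Here the second-largest constant (2.0) is split half into the retained constant and half into the delay, and the smallest (1.0) goes entirely into the delay, exactly as the half rule prescribes.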

9.
10.
Diagnosis methods in debugging aim at detecting bugs in a program, either by comparing it with a correct specification or with the help of an oracle (typically, the user herself). Debugging techniques for declarative programs usually exploit the semantic properties of programs (and specifications) and generally try to detect one or more “buggy” rules. In this way, rules are split apart in an absolute way: either they are correct or not. However, in many situations, not every error has the same consequences, an issue that is ignored by classical debugging frameworks. In this paper, we generalise debugging by considering a cost function, i.e. a function that assigns different cost values to each kind of error and different benefit values to each kind of correct response. The problem is now redefined as assigning a real-valued probability and cost to each rule, by considering each rule more or less “guilty” of the overall error and cost of the program. This makes it possible to rank rules rather than only separate them into right and wrong. Our debugging method also differs from classical approaches in that it is probabilistic, i.e. we use a set of ground examples to approximate these rankings.
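The ranking idea can be illustrated with a toy sketch: estimate each rule's average cost over a set of ground examples and sort. The cost values, rules, and examples below are invented for illustration; the paper's probabilistic machinery is richer than this:

```python
def rank_rules(rules, examples, cost):
    """Rank rules by average cost over ground examples: each rule is
    'more or less guilty' according to the cost its answers incur."""
    avg = {name: sum(cost(rule(x), y) for x, y in examples) / len(examples)
           for name, rule in rules.items()}
    return sorted(avg.items(), key=lambda kv: kv[1])

# Hypothetical asymmetric cost function: a false positive costs 5,
# a false negative costs 1, a correct answer costs nothing.
def cost(predicted, actual):
    if predicted == actual:
        return 0.0
    return 5.0 if predicted and not actual else 1.0

examples = [(1, True), (2, True), (3, False), (4, False)]
rules = {'exact': lambda x: x <= 2, 'always_yes': lambda x: True}
print(rank_rules(rules, examples, cost))
# [('exact', 0.0), ('always_yes', 2.5)]
```

With a symmetric 0/1 cost both rules would simply be “right” or “wrong”; the asymmetric cost is what turns the binary split into a meaningful ranking.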

11.
This paper notes that the workload of a sorting office varies from day to day, that scheduled duties are fixed in the short run, and that some overtime is usually available to deal with the excess. Without specific assumptions about the distribution of workload, it gives a means of saying what the optimum staffing level is, and how it depends on the cost of overtime, overheads associated with employing a man, and on any “penalty” for failure to sort the mail on time. For a simplified assumption about the workload distribution, it then considers some specific overtime rules and illustrates how costs and average pay per man depend on the staffing level and the manner of allocation of overtime. It suggests a sense in which the financial interests of the employer do not necessarily conflict with those of the employees.

12.
Discovery of unapparent association rules based on extracted probability (total citations: 1; self-citations: 0; citations by others: 1)
Association rule mining is an important task in data mining. However, not all of the generated rules are interesting, and some unapparent rules may be ignored. We introduce an “extracted probability” measure in this article. Using this measure, three models are presented to modify the confidence of rules. An efficient method based on the support-confidence framework is then developed to generate rules of interest. The adult dataset from the UCI machine learning repository and a database of occupational accidents are analyzed in this article. The analysis reveals that the proposed methods can effectively generate interesting rules from a variety of association rules.
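The “extracted probability” measure itself is the paper's contribution and is not reproduced here, but the support-confidence framework it builds on is standard and can be sketched directly (the transactions below are made up):

```python
def support(itemset, transactions):
    """Fraction of transactions containing every item of `itemset`."""
    return sum(1 for t in transactions if itemset <= t) / len(transactions)

def confidence(antecedent, consequent, transactions):
    """conf(A -> B) = supp(A | B) / supp(A)."""
    return (support(antecedent | consequent, transactions)
            / support(antecedent, transactions))

transactions = [{'a', 'b'}, {'a', 'b'}, {'a'}, {'b'}]
print(support({'a'}, transactions))            # 0.75
print(confidence({'a'}, {'b'}, transactions))  # 2/3: a -> b holds in 2 of 3
```

A confidence-modifying model of the kind the abstract mentions would rescale this last number so that rules that are unapparent under raw confidence can still surface as interesting.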

13.
The problem of proving that two programs, in any reasonable programming language, are equivalent is well-known to be undecidable. In a formal programming system, in which the rules for equivalence are finitely presented, the problem of provable equivalence is semi-decidable. Despite this improved situation there is a significant lack of generally accepted automated techniques for systematically searching for a proof (or disproof) of program equivalence. Techniques for searching for proofs of equivalence often stumble on the formulation of induction and, of course, coinduction (when it is present) which are often formulated in such a manner as to require inspired guesses.

There are, however, well-known program transformation techniques which do address these issues. Of particular interest to this paper are the deforestation techniques introduced by Phil Wadler and the fold/unfold program transformation techniques introduced by Burstall and Darlington. These techniques are shadows of an underlying cut-elimination procedure and, as such, should be more generally recognized as proof techniques.

In this paper we show that these techniques apply to languages which have both inductive and coinductive datatypes. The relationship between these program transformation techniques and cut-elimination requires a transformation from initial and final “algebra” proof rules into “circular” proof rules as introduced by Santocanale (and used implicitly in the model checking community). This transformation is only possible in certain proof systems. Here we show that it can be applied to cartesian closed categories with datatypes: closedness is an essential requirement. The cut-elimination theorems and attendant program transformation techniques presented here rely heavily on this alternate presentation of induction and coinduction.

14.
Design-patterns and design-principles represent two approaches, which elicit design knowledge from successful learning environments and formulate it as design guidelines. The two approaches are fairly similar in their strategies, but differ in their research origins. This study stems from the design-principles approach, and explores how learning is affected by curriculum-materials designed according to two main design-principles: (a) engage learners in peer instruction, and (b) reuse student artifacts as resource for further learning. These principles were employed in three higher-education courses and examined with 385 students. Data analysis was conducted in two trajectories: In the “bird’s eye view” trajectory we used a “feature” unit of analysis to illustrate how learning was supported by features designed according to the two design-principles in each of the courses. In the “design-based research” trajectory we focused on one feature, a web-based Jigsaw activity, in a philosophy of education course, and demonstrated how it was refined via three design iterations. Students were required to specialize in one of three philosophical perspectives, share knowledge with peers who specialized in other perspectives, and reuse the shared knowledge in new contexts. Outcomes indicated that the design in the first iteration did not sufficiently support student ability to apply the shared knowledge. Two additional design-principles were employed in the next iterations: (c) provide knowledge representation and organization tools, and (d) employ multiple social-activity structures. The importance of combining several design-principles for designing curricular materials is discussed in terms of Alexander’s design-pattern language and his notion of referencing between design-patterns.

15.
Nowadays, typical software and system engineering projects in various industrial sectors (automotive, telecommunication, etc.) involve hundreds of developers using quite a number of different tools. Thus, the data of a project as a whole is distributed over these tools. Therefore, it is necessary to make the relationships of different tool data repositories visible and keep them consistent with each other. This still is a nightmare due to the lack of domain-specific adaptable tool and data integration solutions which support maintenance of traceability links, semi-automatic consistency checking as well as update propagation. Currently used solutions are usually hand-coded one-way transformations between pairs of tools. In this article we present a rule-based approach that allows for the declarative specification of data integration rules. It is based on the formalism of triple graph grammars and uses directed graphs to represent MOF-compliant (meta) models. As a result we give an answer to OMG's request for proposals for a MOF-compliant “queries, views, and transformation” (QVT) approach from the “model driven application development” (MDA) field.

16.
Traditional overrun handling approaches for real-time systems enforce some isolation property at the job or task level. This paper shows that by “relaxing” task isolation, it is possible to efficiently deal with overruns in soft real-time systems with highly variable task execution times and proposes Randomized Dropping (RD), a novel overrun handling mechanism. RD is able to bound task overruns in a probabilistic manner, thus providing “soft” task isolation. The paper shows how to combine RD with priority-driven and rate-based scheduling algorithms, and how to analyze the resulting system. Performance evaluation and comparison between simulation and analytical results are discussed.

17.
This research contributes to the theoretical basis for appropriate design of computer-based, integrated planning information systems. The research provides a framework for integrating relevant knowledge, theory, methods, and technology. Criteria for appropriate system design are clarified. The requirements for a conceptual system design are developed based on “diffusion of innovation” theory, lessons learned in the adoption and use of existing planning information systems, current information-processing technology (including expert system technology), and methodology for evaluation of mitigation strategies for disaster events. Research findings focus on the assessment of new information systems technology. Chief among these findings is the utility of case-based reasoning for discovering and formalizing the meta rules needed by expert systems, the role of the “diffusion of innovation” theory in establishing design criteria, and the definition of client interests served by integrated planning information systems. The work concludes with the selection of a prototyping exercise. The prototype is developed in a forthcoming technical paper (Masri & Moore, 1994).

18.
Induction based fluidics (IBF), a new, simple patented approach for transporting liquids in the micro and the macro world, is discussed. Electric fields are shown to energize liquid/s in a container/s to execute an array of purposes. IBF is shown uniquely to energize N liquids in simple off-the-shelf devices, inductively. We discuss calibration and other issues, as we demonstrate how simple devices can dispense nanoliters and microliters with high precision and accuracy. Furthermore, we show preliminary single- and eight-channel data for the Zip Tip™ made by Millipore, where the device transports liquids “electrically.” We briefly consider how such new devices, “electric” Zip Tips™, might automate desalting and the placement of digests for MALDI TOF analysis.

19.
This paper presents a technique to improve the accuracy of the predictions obtained using Rough Set Theory (RST) in non-deterministic cases (rough cases). The RST is here applied to the data collected by intelligent field devices to identify predictive diagnostic algorithms for machinery, plants, subsystems, or components. The data analysis starts from a historical data set recorded from the field instruments, and its final result is a set of “if–then” rules identifying predictive maintenance functions. These functions may be used to predict whether a component is going to fail in the near future. The prediction is obtained by applying the rules extracted with the RST algorithm to the real-time values transmitted by the field device. It may happen that some diagnoses are uncertain, in the sense that it is not possible to take a certain decision (device sound or close to failure) with a given set of data. In this paper, a new algorithm for increasing the confidence in these uncertain cases is presented. As an example, the proposed confirmation algorithm is applied to the predictive algorithms obtained for an intelligent pressure transmitter.
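The uncertain diagnoses the confirmation algorithm targets correspond to the boundary region of the rough-set approximations, which can be sketched directly (the example devices and discretized readings below are hypothetical):

```python
def approximations(universe, key, target):
    """Lower/upper approximations of `target` under the indiscernibility
    relation induced by `key`: objects with equal keys are indiscernible."""
    classes = {}
    for x in universe:
        classes.setdefault(key(x), set()).add(x)
    # Lower: classes entirely inside the target (certainly in).
    lower = set().union(*(c for c in classes.values() if c <= target))
    # Upper: classes that at least touch the target (possibly in).
    upper = set().union(*(c for c in classes.values() if c & target))
    return lower, upper

# Hypothetical devices grouped by a discretized sensor reading;
# devices 1-3 actually failed.
reading = {1: 'low', 2: 'low', 3: 'mid', 4: 'mid', 5: 'high', 6: 'high'}
failed = {1, 2, 3}
lower, upper = approximations(reading, reading.get, failed)
boundary = upper - lower   # the rough (undecidable) cases: {3, 4}
```

Devices in the lower approximation get a certain diagnosis; those in `boundary` share a reading with both sound and failed devices, so no certain decision is possible from this data alone, which is exactly where a confirmation step is needed.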

20.
In this paper, we approach the problem of automatically designing fuzzy diagnosis rules for rotating machinery, which can give an appropriate evaluation of the vibration data measured in the target machines. In particular, we explain the implementation to this aim and analyze the advantages and drawbacks of two soft computing techniques: knowledge-based networks (KBN) and genetic algorithms (GA). An application of both techniques is evaluated on the same case study, giving special emphasis to their performance in terms of classification success and computation time.

A reduced version of this paper first appeared under the title “A comparative assessment on the application of knowledge-based networks and genetic algorithms to the design of fuzzy diagnosis systems for rotating machinery”, published in the book “Soft Computing in Industry—Recent Applications” (Springer Engineering).
