Similar Documents
20 similar documents found (search time: 31 ms)
1.
Our starting point is a definition of the conditional event E|H which differs from many seemingly similar ones adopted in the relevant literature since 1935, starting with de Finetti. In fact, if we do not assign the same third value u (undetermined) to all conditional events, but make it depend on E|H, it turns out that this function t(E|H) can be taken as a general conditional uncertainty measure, and we get (through a suitable – in a sense, compulsory – choice of the relevant operations among conditional events) the natural axioms for many different (besides probability) conditional measures.
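As a purely illustrative aside (not the paper's formalism), the three-valued reading behind this idea can be sketched as follows, assuming the usual convention that E|H is true when E and H both hold, false when H holds but E does not, and takes a third value when H fails; here that third value is simply a number passed in to stand for t(E|H), and all names are invented.

```python
from itertools import product

def conditional_event(e, h, t_value):
    """Three-valued reading of E|H: True when E and H hold, False when H holds
    but E does not, and otherwise the third value t(E|H), which here is allowed
    to depend on the particular conditional event instead of being one fixed u."""
    if h:
        return e
    return t_value

# Truth table for an illustrative conditional event with t(E|H) = 0.7
for e, h in product([True, False], repeat=2):
    print(f"E={e!s:<5} H={h!s:<5} -> {conditional_event(e, h, 0.7)}")
```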

2.
A well-known problem in default logic is the ability of naive reasoners to explain both g and ¬g from a set of observations. This problem is treated in at least two different ways within that camp. One approach is examination of the various explanations and choosing among them on the basis of various explanation comparators. A typical comparator is choosing the explanation that depends on the most specific observation, similar to the notion of narrowest reference class. Others examine default extensions of the observations and choose whatever is true in any extension, or what is true in all extensions, or what is true in preferred extensions. Default extensions are sometimes thought of as acceptable models of the world that are discarded as more knowledge becomes available. We argue that the notions of specificity and extension lack clear semantics. Furthermore, we show that the problems these ideas were supposed to solve can be handled easily within a probabilistic framework.

3.
We introduce a methodology whereby an arbitrary logic system L can be enriched with temporal features to create a new system T(L). The new system is constructed by combining L with a pure propositional temporal logic T (such as linear temporal logic with Since and Until) in a special way. We refer to this method as adding a temporal dimension to L, or just temporalising L. We show that the logic system T(L) preserves several properties of the original temporal logic, such as soundness, completeness, decidability, conservativeness and separation over linear flows of time. We then focus on the temporalisation of first-order logic, and a comparison is made with other first-order approaches to the handling of time.

4.
Kumiko Ikuta, AI & Society, 1990, 4(2): 137–146
The role of craft language in the process of teaching (learning) Waza (skill) will be discussed from the perspective of human intelligence. It may be said that the ultimate goal of learning Waza in any Japanese traditional performance is not the perfect reproduction of the teaching (learning) process of Waza. In fact, a special metaphorical language (craft language) is used, which has the effect of encouraging the learner to activate his creative imagination. It is through this activity that he learns his own habitus (Kata). It is suggested that, in considering the difference of function between natural human intelligence and artificial intelligence, attention should be paid to the imaginative activity of the learner as being an essential factor for mastering Kata. This article is a modified English version of Chapter 5 of my book Waza kara shiru (Learning from Skill), Tokyo University Press, 1987, pp. 93–105.

5.
The adaptiveness of agents is one of the basic conditions for their autonomy. This paper describes an approach to adaptiveness for Monitoring Cognitive Agents based on the notion of generic spaces. This notion allows the definition of virtual generic processes, so that any particular actual process is then a simple configuration of the generic process, that is to say a set of values of parameters. Consequently, a generic domain ontology containing the generic knowledge for solving problems concerning the generic process can be developed. This leads to the design of the Generic Monitoring Cognitive Agent, a class of agent in which the whole knowledge corpus is generic. In other words, modeling a process within a generic space becomes configuring a generic process, and adaptiveness becomes genericity, that is to say independence regarding technology. In this paper, we present an application of this approach on Sachem, a Generic Monitoring Cognitive Agent designed to help the operators in operating a blast furnace. Specifically, the NeuroGaz module of Sachem will be used to present the notion of a generic blast furnace. The adaptiveness of Sachem can then be seen in the low cost of deploying a Sachem instance on different blast furnaces and in the ability of NeuroGaz to solve problems and learn from various top-gas instrumentation.

6.
We consider the parallel time complexity of logic programs without function symbols, called logical query programs, or Datalog programs. We give a PRAM algorithm for computing the minimum model of a logical query program, and show that for programs with the polynomial fringe property, this algorithm runs in time that is logarithmic in the input size, assuming that concurrent writes are allowed if they are consistent. As a result, the linear and piecewise linear classes of logic programs are in NC. Then we examine several nonlinear classes in which the program has a single recursive rule that is an elementary chain. We show that certain nonlinear programs are related to GSM mappings of a balanced parentheses language, and that this relationship implies the polynomial fringe property; hence such programs are in NC. Finally, we describe an approach for demonstrating that certain logical query programs are log-space complete for P, and apply it to both elementary single rule programs and nonelementary programs. Supported by NSF Grant IST-84-12791, a grant of IBM Corporation, and ONR contract N00014-85-C-0731.
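As a hedged illustration of why such linear programs sit in NC (this is not the paper's PRAM algorithm), a linear chain rule like path(X,Y) :- edge(X,Z), path(Z,Y) can be evaluated bottom-up by repeated Boolean matrix squaring: only O(log n) rounds are needed, and each round is a matrix product that parallelizes well. The sketch below assumes NumPy and an adjacency-matrix encoding of the edge relation.

```python
import numpy as np

def datalog_paths(edge):
    """Minimum model of the linear chain program
         path(X,Y) :- edge(X,Y).
         path(X,Y) :- edge(X,Z), path(Z,Y).
       computed by repeated Boolean matrix squaring: O(log n) rounds, each a matrix
       product that parallelizes well -- the intuition behind NC membership."""
    n = edge.shape[0]
    e = edge.astype(bool)
    reach = e | np.eye(n, dtype=bool)                 # paths of length 0 or 1
    for _ in range(max(1, int(np.ceil(np.log2(n))))):
        m = reach.astype(np.int64)
        reach = (m @ m) > 0                           # one squaring round doubles path length
    return (e.astype(np.int64) @ reach.astype(np.int64)) > 0   # an edge followed by any path

edge = np.zeros((5, 5), dtype=bool)
for u, v in [(0, 1), (1, 2), (2, 3), (3, 4)]:
    edge[u, v] = True
print(datalog_paths(edge).astype(int))
```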

7.
This paper suggests ways in which the pattern-matching capability of the computer can be used to further our understanding of stylized ballad language. The study is based upon a computer-aided analysis of the entire 595,000-word corpus of Francis James Child's The English and Scottish Popular Ballads (1882–1892), a collection of 305 textual traditions, most of which are represented by a variety of texts. The paper focuses on the Mary Hamilton tradition as a means of discussing the function of phatic language in the ballad genre and the significance of textual variation. Cathy Lynn Preston is a Research Associate, Computer Research in the Humanities, at the University of Colorado, Boulder. She is interested in folklore, particularly oral narrative; popular literature of the 18th and 19th centuries, particularly broadside and chapbook; the works of John Gay, Jonathan Swift, and Thomas Hardy; and Middle English romance and lyric. Her major publications are A KWIC Concordance to Jonathan Swift's A Tale of a Tub, The Battle of the Books, and A Discourse Concerning the Mechanical Operation of the Spirit, A Fragment (New York: Garland Publishing, 1984) (co-authored with Harold D. Kelling), and A KWIC Concordance to Thomas Hardy's Tess of the d'Urbervilles (New York: Garland Publishing, 1989).

8.
We analyze the Ohya-Masuda quantum algorithm that solves the so-called satisfiability problem, which is an NP-complete problem of complexity theory. We distinguish three steps in the algorithm, and analyze the second step, in which a coherent superposition of states (a pure state) transforms into an incoherent mixture represented by a density matrix. We show that, if nonideal (in analogy with nonideal quantum measurement), this transformation can make the algorithm fail in some cases. On this basis we give some general notions on the physical implementation of the Ohya-Masuda algorithm.

9.
This paper discusses terms which are of mutual importance to the fields of information science and computer science. Specifically, we discuss the notions of information and knowledge: their interrelationships as well as their differences, and the concept of value-adding. Concrete examples are used in the discussion. Rainer Kuhlen is Professor of Information Science at the University of Konstanz.

10.
In this paper the problem of routing messages along shortest paths in a distributed network without using complete routing tables is considered. In particular, the complexity of deriving minimum (in terms of number of intervals) interval routing schemes is analyzed under different requirements. For all the cases considered, NP-hardness proofs are given, while some approximability results are provided. Moreover, relations among the different cases considered are studied. This work was supported by the EEC ESPRIT II Basic Research Action Program under Contract No. 7141 "Algorithms and Complexity II", by the EEC Human Capital and Mobility MAP project, and by the Italian MURST 40% project "Algoritmi, Modelli di Calcolo e Strutture Informative".

11.
A neural network for recognition of handwritten musical notes, based on the well-known Neocognitron model, is described. The Neocognitron has been used for the "what" pathway (symbol recognition), while contextual knowledge has been applied for the "where" (symbol placement). This way, we benefit from dividing the process for dealing with this complicated recognition task. Also, different degrees of intrusiveness in learning have been incorporated in the same network: more intrusive supervised learning has been implemented in the lower neuron layers and less intrusive in the upper one. This way, the network adapts itself to the handwriting of the user. The network consists of a 13×49 input layer and three pairs of simple and complex neuron layers. It has been trained to recognize 20 symbols of unconnected notes on a musical staff and was tested with a set of unlearned input notes. Its recognition rate for the individual unseen notes was up to 93%, averaging 80% for all categories. These preliminary results indicate that a modified Neocognitron could be a good candidate for identification of handwritten musical notes.

12.
Summary. Many reductions among combinatorial problems are known in the context of NP-completeness. These reductions preserve the optimality of solutions. However, they may change the relative error of approximate solutions dramatically. In this paper, we apply a new type of reductions, called continuous reductions. When one problem is continuously reduced to another, any approximation algorithm for the latter problem can be transformed into an approximation algorithm for the former. Moreover, the performance ratio is preserved up to a constant factor. We relate the problem Minimum Number of Inverters in CMOS-Circuits, which arises in the context of logic synthesis, to several classical combinatorial problems such as Maximum Independent Set and Deletion of a Minimum Number of Vertices (Edges) in Order to Obtain a Bipartite (Partial) Subgraph.

13.
Given a finite set E ⊂ R^n, the problem is to find clusters (or subsets of similar points in E) and at the same time to find the most typical elements of this set. An original mathematical formulation is given to the problem. The proposed algorithm operates on groups of points, called samplings (samplings may be called multiple centers or cores); these samplings adapt and evolve into interesting clusters. Compared with other clustering algorithms, this algorithm requires less machine time and storage. We provide some propositions about nonprobabilistic convergence and a sufficient condition which ensures the decrease of the criterion. Some computational experiments are presented.
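The abstract does not spell out the procedure, so the following is only a rough sketch of the general idea of representing each cluster by a small group of typical points (a "sampling", or core) that is re-elected as points are reassigned; the function name, the distance criterion, and every parameter value are invented for illustration and should not be read as the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

def cluster_with_cores(points, n_clusters=3, core_size=4, n_iter=10):
    """Illustrative sketch only: each cluster is represented by a small 'sampling'
    (core) of typical points rather than a single centroid.  Points are assigned to
    the core with the smallest mean distance, then each core is re-elected as the
    most central members of its cluster."""
    cores = [points[rng.choice(len(points), core_size, replace=False)]
             for _ in range(n_clusters)]
    labels = np.zeros(len(points), dtype=int)
    for _ in range(n_iter):
        dist = np.stack(
            [np.linalg.norm(points[:, None] - c[None], axis=2).mean(axis=1) for c in cores],
            axis=1)
        labels = dist.argmin(axis=1)
        for k in range(n_clusters):
            members = points[labels == k]
            if len(members) >= core_size:
                spread = np.linalg.norm(members[:, None] - members[None], axis=2).mean(axis=1)
                cores[k] = members[np.argsort(spread)[:core_size]]
    return cores, labels

data = np.vstack([rng.normal(c, 0.1, size=(30, 2)) for c in ([0, 0], [1, 1], [0, 1])])
cores, labels = cluster_with_cores(data)
print([core.round(2) for core in cores])
```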

14.
In 1996, the author attended a seminar on ethics given by C. West Churchman at the University of California, Berkeley. During that year, the author also interviewed Churchman several times regarding the future direction of information sciences in general and the information systems research field in particular. This article is a compilation of the seminar and the interviews. It is set in the context of both Churchman's earlier and his current views of a global god, good, kindness, and caring. C. West Churchman holds that global ethics should lead to the study and design of information systems to solve large and difficult problems of humankind such as poverty, crime and disease. His Global Ethical Management (GEM) of information sciences translates into abandoning the current goals and boundaries of the information sciences fields and changing what constitutes valid research to globally ethical endeavors.

15.
A new semiotic model for the generation of musical texts is introduced in this article. The idea of a generative grammar is here understood in the sense of the generative trajectory, a model elaborated by A. J. Greimas. Four levels are chosen from his trajectory for the study of musical texts, namely, those of isotopies, spatial, temporal and actorial categories, modalities, and semes or figures. As an illustration, the G minor Ballade by Fr. Chopin has been examined through all these levels. The most formalized aspect of the analysis is constituted by what has been called a modal grammar of the piece (the term modality understood here in its philosophico-linguistic sense). The analysis tries to show how the musical form emerges from its inner processual traits, kinetic, epistemic and other aspects of a modal nature. It thus approaches, for instance, the problem of segmentation from the processual and dynamic nature of musical works. Moreover, the analysis is also an attempt to study narrativity in music, since the narrative content of a piece like Chopin's G minor Ballade is clearly seen as the result of its modal processes.

16.
Continuation-passing style (CPS) is a good abstract representation to use for compilation and optimization: it has a clean semantics and is easily manipulated. We examine how CPS expresses the saving and restoring of registers in source-language procedure calls. In most CPS-based compilers, the context of the calling procedure is saved in a continuation closure—a single variable that is passed as an argument to the function being called. This closure is a record containing bindings of all the free variables of the continuation; that is, registers that hold values needed by the caller after the call are written to memory in the closure, and fetched back after the call. Consider the procedure-call mechanisms used by conventional compilers. In particular, registers holding values needed after the call must be saved and later restored. The responsibility for saving registers can lie with the caller (a caller-saves convention) or with the called function (callee-saves). In practice, to optimize memory traffic, compilers find it useful to have some caller-saves registers and some callee-saves. Conventional CPS-based compilers that pass a pointer to a record containing all the variables needed after the call (i.e., the continuation closure) are using a caller-saves convention. We explain how to express callee-save registers in continuation-passing style, and give measurements showing the resulting improvement in execution time. Supported in part by NSF Grant CCR-8914570.
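To make the contrast concrete, a small sketch in Python follows, with closures standing in for continuation records; it captures the two conventions only in spirit and is not the compiler representation discussed in the entry. In the caller-saves flavour the live value y is captured in the continuation closure; in the callee-saves flavour y is threaded through the callee as an extra argument that is handed back to the continuation unchanged.

```python
# Caller-saves flavour: the value y, live across the call, is captured in the
# continuation closure -- the closure record is where it gets "saved to memory".
def g_cps(x, k):
    k(x * 2)

def caller_saves(x, y, k):
    g_cps(x, lambda r: k(r + y))          # y is a free variable of the continuation

# Callee-saves flavour: y is threaded through the callee as an extra argument that
# it passes back to the continuation unchanged, so no closure slot is needed for it.
def g_cps_callee(x, saved, k):
    k(x * 2, saved)

def callee_saves(x, y, k):
    g_cps_callee(x, y, lambda r, y_back: k(r + y_back))

caller_saves(3, 10, print)    # prints 16
callee_saves(3, 10, print)    # prints 16
```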

17.
Conclusion. There has been no attempt in this introduction to put forward a particular method for dealing with these challenges, nor to assess the full contribution of the articles in this issue. This outline and discussion has been intended merely to stimulate interdisciplinary debate and provide some of the background to assist in making this possible. A full account would at least have involved a broader review of the background of McLoughlin and Aicholzer and Schienstock in developments within the industrial sociology or industrial relations discipline. Their contributions do, however, provide a good introduction to the traditions within which they are working, and so it is not necessary to provide more information here. It is nevertheless important to note in McLoughlin's case that his analysis of technological systems and system architectures is based on earlier work by McLoughlin and his colleagues, cited in the article, on the complex nature of engineering systems and the importance of taking this complexity into account in any discussion of the impact of technology on organisation. If the result of this issue is the stimulation of system designers to read further in such areas, or the encouragement of industrial sociologists to become more involved in research directed towards human-oriented system design, then it will have served its purpose.

18.
Agent-based technology has been identified as an important approach for developing next generation manufacturing systems. One of the key techniques needed for implementing such advanced systems will be learning. This paper first discusses learning issues in agent-based manufacturing systems and reviews related approaches, then describes how to enhance the performance of an agent-based manufacturing system through learning from history (based on distributed case-based learning and reasoning) and learning from the future (through system forecasting simulation). Learning from history is used to enhance coordination capabilities by minimizing communication and processing overheads. Learning from the future is used to adjust promissory schedules through forecasting simulation, by taking into account the shop floor interactions, production and transportation time. Detailed learning and reasoning mechanisms are described and partial experimental results are presented.

19.
This work presents a novel optimization method capable of integrating ordinal optimization (OO) and simulated annealing (SA). A general regression neural network (GRNN) is trained using available data to generate a rough model that approximates the response surface in the feasible domain. A set of good enough candidates is generated by conducting an SA search on this rough model. Only candidates accepted by the SA search are actually tested by evaluating their true objective functions. The GRNN model is then updated using these new data. The procedure is repeated until a specified number of tests have been performed. The method (SAOO+GRNN) is tested on the well-known paper trim-loss problem. The SAOO+GRNN approach can substantially reduce the number of function calls and the computing time far below those of a simple ordinal optimization method with, for example, a horse-race selection rule, as well as of straightforward simulated annealing.
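The loop described above can be summarized in a short, hedged sketch (NumPy, with a Nadaraya-Watson kernel regressor standing in for the GRNN); the objective function, kernel width, move size, budget and all other parameter values are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_objective(x):                       # stand-in for an expensive evaluation
    return float(np.sum((x - 0.3) ** 2))

def grnn_predict(x, X, y, sigma=0.1):
    """GRNN / Nadaraya-Watson estimate: kernel-weighted average of observed outputs."""
    w = np.exp(-np.sum((X - x) ** 2, axis=1) / (2 * sigma ** 2))
    return float(np.dot(w, y) / (w.sum() + 1e-12))

# Seed the rough model with a handful of true evaluations
X = rng.random((10, 2))
y = np.array([true_objective(p) for p in X])
budget = 30                                  # cap on expensive (true) evaluations

for _ in range(5):                           # outer iterations of the loop in the abstract
    x, temp, accepted = X[y.argmin()].copy(), 1.0, []
    for _ in range(200):                     # SA search on the surrogate only
        cand = np.clip(x + rng.normal(scale=0.05, size=2), 0.0, 1.0)
        delta = grnn_predict(cand, X, y) - grnn_predict(x, X, y)
        if delta < 0 or rng.random() < np.exp(-delta / temp):
            x, accepted = cand, accepted + [cand]
        temp *= 0.98
    for cand in accepted[-3:]:               # only accepted candidates get true tests
        if len(y) >= budget:
            break
        X = np.vstack([X, cand])
        y = np.append(y, true_objective(cand))

print("best design:", X[y.argmin()].round(3), "objective:", round(y.min(), 4))
```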

20.
Because a system's software architecture strongly influences its quality attributes such as modifiability, performance, and security, it is important to analyze and reason about that architecture. However, architectural documentation frequently does not exist, and when it does, it is often out of sync with the implemented system. In addition, it is rare that software development begins with a clean slate; systems are almost always constrained by existing legacy code. As a consequence, we need to be able to extract information from existing system implementations and utilize this information for architectural reasoning. This paper presents Dali, an open, lightweight workbench that aids an analyst in extracting, manipulating, and interpreting architectural information. By assisting in the reconstruction of architectures from extracted information, Dali helps an analyst redocument architectures, discover the relationship between as-implemented and as-designed architectures, analyze architectural quality attributes, and plan for architectural change.
