Similar Documents
20 similar documents found (search time: 46 ms)
1.
This paper formalizes and analyzes cognitive transitions between artificial perceptions that consist of an analogical or metaphorical transference of perception. The formalization is performed within a mathematical framework that has been used before to formalize other aspects of artificial perception and cognition. The mathematical infrastructure consists of a basic category of ‘artificial perceptions’. Each ‘perception’ consists of a set of ‘world elements’, a set of ‘connotations’, and a three-valued (true, false, undefined) predicative connection between the two sets. ‘Perception morphisms’ describe structure-preserving paths between perceptions. Quite a few artificial cognitive processes can be viewed and formalized as perception morphisms or as other categorical constructs. We show here how analogical transitions can be formalized in a similar way. A factorization of every analogical transition is shown to formalize metaphorical perceptions that are inspired by the analogy. It is further shown how structural aspects of ‘better’ analogies and metaphors, as well as generalizations that emerge from analogies, can be captured and evaluated in the same categorical setting. The results of this study are then embedded in the existing mathematical formalization of other artificial cognitive processes within the same premises. A fallout of the rigorous unified mathematical theory is that structured analogies and metaphors share common formal aspects with other perceptually acute cognitive processes.
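To make the basic category concrete, here is a minimal Python sketch of a three-valued perception and a structure-preservation check for morphisms. The encoding, the names, and the exact preservation condition (defined truth values must be preserved; ‘undefined’ is unconstrained) are illustrative assumptions, not the paper's definitions.

```python
# A 'perception' as (world elements, connotations, three-valued predicate),
# and a morphism check: every *defined* truth value must be preserved.
T, F, U = "true", "false", "undefined"

class Perception:
    def __init__(self, world, connotations, rho):
        self.world, self.connotations = world, connotations
        self.rho = rho  # dict mapping (world element, connotation) -> T | F | U

def is_morphism(p, q, f_world, f_conn):
    """Check that the pair of maps (f_world, f_conn) preserves defined values:
    whenever p.rho[(w, c)] is T or F, q.rho[(f_world[w], f_conn[c])] must agree."""
    for (w, c), v in p.rho.items():
        if v != U and q.rho.get((f_world[w], f_conn[c]), U) != v:
            return False
    return True

# Toy example: a coarser perception q merges two world elements of p.
p = Perception({"a", "b"}, {"red"}, {("a", "red"): T, ("b", "red"): U})
q = Perception({"ab"}, {"red"}, {("ab", "red"): T})
print(is_morphism(p, q, {"a": "ab", "b": "ab"}, {"red": "red"}))  # True
```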

2.
Institution Morphisms
Institutions formalise the intuitive notion of logical system, including syntax, semantics, and the relation of satisfaction between them. Our exposition emphasises the natural way that institutions can support deduction on sentences, and inclusions of signatures, theories, etc.; it also introduces terminology to clearly distinguish several levels of generality of the institution concept. A surprising number of different notions of morphism have been suggested for forming categories with institutions as objects, and an amazing variety of names have been proposed for them. One goal of this paper is to suggest a uniform and informative terminology to replace the current chaotic nomenclature; another goal is to investigate the properties and interrelations of these notions in a systematic way. Following brief expositions of indexed categories, diagram categories, twisted relations and Kan extensions, we demonstrate and then exploit the duality between institution morphisms in the original sense of Goguen and Burstall, and the ‘plain maps’ of Meseguer, obtaining simple uniform proofs of completeness and cocompleteness for both resulting categories. Because of this duality, we prefer the name ‘comorphism’ over ‘plain map’; moreover, we argue that morphisms are more natural than comorphisms in many cases. We also consider ‘theoroidal’ morphisms and comorphisms, which generalise from signatures to theories, based on a theoroidal institution construction, finding that the ‘maps’ of Meseguer are theoroidal comorphisms, while theoroidal morphisms are a new concept. We introduce ‘forward’ and ‘semi-natural’ morphisms, and develop some of their properties. Appendices discuss institutions for partial algebra, a variant of order-sorted algebra, two versions of hidden algebra, and a generalisation of universal algebra; these illustrate various points in the main text. A final appendix makes explicit a greater generality for the institution concept, clarifies certain details and proves some results that lift institution theory to this level.
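The satisfaction condition at the heart of the institution concept can be exercised on a toy propositional institution. The sketch below is our own miniature, not the paper's formalism: signatures are sets of atoms, a signature morphism renames atoms, sentences translate forward, models reduct backward, and the assertion checks M′ ⊨ Sen(φ)(e) iff Mod(φ)(M′) ⊨ e.

```python
from itertools import product

# A minimal propositional-logic institution, sketched under our own naming.
# Sentences are nested tuples; models are dicts from atoms to bools.

def sat(model, s):
    if isinstance(s, str):
        return model[s]
    op, *args = s
    if op == "not": return not sat(model, args[0])
    if op == "and": return sat(model, args[0]) and sat(model, args[1])

def translate(phi, s):  # Sen(phi): rename atoms along phi
    if isinstance(s, str):
        return phi[s]
    op, *args = s
    return (op, *[translate(phi, a) for a in args])

def reduct(phi, model2):  # Mod(phi): pull a Sigma'-model back to Sigma
    return {a: model2[phi[a]] for a in phi}

# Satisfaction condition: M' |= Sen(phi)(e)  iff  Mod(phi)(M') |= e.
phi = {"p": "x", "q": "y"}
e = ("and", "p", ("not", "q"))
for bits in product([False, True], repeat=2):
    m2 = dict(zip(["x", "y"], bits))
    assert sat(m2, translate(phi, e)) == sat(reduct(phi, m2), e)
print("satisfaction condition holds on this toy institution")
```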

3.
Local Invariance     
A local invariant is a set of states of a transition system with the property that every action possible from each of its states takes the system back into the set, unless it is an ‘exit’ action; each exit action is accessible from every state in the set via a sequence of non-‘exit’ actions. This idea supports a form of abstraction construction that abstracts away from the behaviour internal to the local invariants themselves, that is, the behaviour involving their non-‘exit’ actions. In this way it is possible, for example, to construct from a system one which exhibits precisely the externally visible behaviour. This abstraction is reminiscent of hiding in process algebra, and we compare the notion of abstraction with that of observational equivalence in process calculus.
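The definition suggests a direct algorithmic check. Below is an illustrative Python sketch under our own formulation of the two clauses (closure under non-exit actions, and reachability of every exit via non-exit paths); it is not taken from the paper.

```python
# Check that a state set S is a 'local invariant' of a labelled transition
# system: non-exit actions never leave S, and from every state of S each
# exit transition is reachable through non-exit actions alone.
def is_local_invariant(trans, S, exits):
    """trans: set of (source, action, target) triples; exits: exit actions."""
    # 1. Closure: non-exit actions from S stay inside S.
    for (s, a, t) in trans:
        if s in S and a not in exits and t not in S:
            return False
    # 2. Accessibility: every state offering an exit action is reachable
    #    from every state of S by a path of non-exit actions within S.
    exit_sources = {s for (s, a, t) in trans if s in S and a in exits}
    for start in S:
        seen, frontier = {start}, [start]
        while frontier:
            s = frontier.pop()
            for (u, a, t) in trans:
                if u == s and a not in exits and t in S and t not in seen:
                    seen.add(t); frontier.append(t)
        if not exit_sources <= seen:
            return False
    return True

# Tiny example: internal loop {0, 1} with exit action 'out' from state 1.
trans = {(0, "a", 1), (1, "b", 0), (1, "out", 2)}
print(is_local_invariant(trans, {0, 1}, {"out"}))  # True
```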

4.
‘Correlations without correlata’ is an influential way of thinking of quantum entanglement as a form of primitive correlation that nonetheless maintains the locality of quantum theory. A number of arguments have sought to suggest that such a view leads either to internal inconsistency or to conflict with the empirical predictions of quantum mechanics. Here we explicate and provide a partial defence of the notion, arguing that these objections import unwarranted conceptions of correlation properties as hidden variables. A more plausible account sees the properties in terms of Everettian relative states. The ontological robustness of entanglement is also defended from recent objections.
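A standard worked example (ours, not the paper's) makes the slogan concrete: for the singlet state, the joint state fixes a perfect anticorrelation between measurement outcomes, while each reduced state is maximally mixed, so neither subsystem on its own carries a definite ‘correlatum’:

```latex
% Perfect correlation in the joint state, no definite value in either part.
\[
  \lvert \Psi^- \rangle = \tfrac{1}{\sqrt{2}}
    \bigl( \lvert 0 \rangle_A \lvert 1 \rangle_B
         - \lvert 1 \rangle_A \lvert 0 \rangle_B \bigr),
  \qquad
  \rho_A = \operatorname{Tr}_B \,
    \lvert \Psi^- \rangle\langle \Psi^- \rvert = \tfrac{1}{2} I .
\]
```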

5.
The need for information technology-mediated cooperation seems obvious. However, what is not obvious is what this means and what social demands such cooperation may imply. To explore this is the intention of the paper. As a first step the paper performs an etymological analysis of the words telecooperation and telecoordination. Such an analysis indicates that cooperation happens when people engage in the production of a work as if ‘one mind or body’, where their activities fuse together in a way that makes the suggestion of separation seem incomprehensible. In the work they do not merely aim to achieve an outcome; they also ‘insert’ themselves ‘in’ the work in a way that makes it a human achievement rather than a mere product – this is cooperation as working-together. With this notion of cooperation in mind the paper then proceeds to analyse the social conditions for cooperation as working-together. It shows, using the work of Wittgenstein, that language is fundamental to cooperation and the sharing of knowledge – not language as a system for the exchange of information but language as a medium for the co-creation of a local way of doing, a local language, to capture the local distinctions that make a particular local activity significant and meaningful to the participants. The paper then proceeds to question this strong notion of cooperation. It argues that most cooperative activities tend not to conform with such stringent demands. The paper suggests that a cooperative problem is best viewed as a situation in which ambiguity is accepted as a structural element of the interaction. From this perspective the paper suggests that hermeneutics may be a productive way to understand the creation of shared interpretative spaces that make mediated cooperation possible. The paper concludes with some implications for mediated cooperative work.

6.
Conclusion: Four decades of sporadic invention and experimentation of and with non-traditional human-computer interface schemes have congealed (somewhat abruptly, though not without a few clear-sighted antecedents) into a new field of information system design, here called Antisedentary Beigeless Computing, that consciously rejects the traditional conception of an isolated tête-à-tête between the human and the box-CRT-keyboard-mouse. ABC systems instead favour the complementary directions away from this notion of an immobile info-shrine: more personal, intimate, and portable information access; and more diffuse, environmentally-integrated information access. Consideration of ABC projects to date suggests that no single instance can alone express the full generality required of a ‘working’ information system, so that (on the one hand) system design must acknowledge that a complex set of trade-offs involving capabilities, universality, specificity, personalization, and generality is inescapable; while (on the other hand) an ideal, eventual ‘information environment’ will inevitably comprise the careful interweaving of some number of individual ABC systems. Taxonomies and classification schemata can rarely hope to be found complete or flawless before the collection of items they purport to describe has itself reached the evolutionary stasis of ‘adulthood’; that is, there is typically some threshold of development or growth beyond which few enough surprises lurk that an encompassing taxonomy can be constructed and observed, in the longer term, to reliably encompass the field. The domain of ABC thought is still quite nascent, and so we would be foolish to assume that all its extremities of form and connotation are now visible; but to the extent that we can already see the outlines of a ‘field’, it is reasonable to make a first run at an analytic taxonomy. The ‘independent character axes’ approach presented here seems broad and loose enough to accommodate any number of additions to the basic stable of ABC systems. It is, further, a taxonomy amenable to significant revision as may be found necessary: axes can be added, deleted, reconstrued, etc., as time and consideration clarify our understanding of ABC. However, it should also be anticipated that the field will eventually coalesce around a much smaller number of better-defined ‘axes’ and thus permit taxonomic reversion to the more hierarchical (and finally more satisfying) ‘Linnean’ scheme we'd originally imagined establishing.

7.
We consider a reinterpretation of the rules of default logic. We make Reiter’s default rules into a constructive method of building models, not theories. To allow reasoning in first‐order systems, we equip standard first‐order logic with a (new) Kleene 3‐valued partial model semantics. Then, using our methodology, we add defaults to this semantic system. The result is that our logic is an ordinary monotonic one, but its semantics is now nonmonotonic. Reiter’s extensions now appear in the semantics, not in the syntax. As an application, we show that this semantics gives a partial solution to the conceptual problems with open defaults pointed out by Lifschitz [V. Lifschitz, On open defaults, in: Proceedings of the Symposium on Computational Logics (1990)], and Baader and Hollunder [F. Baader and B. Hollunder, Embedding defaults into terminological knowledge representation formalisms, in: Proceedings of Third Annual Conference on Knowledge Representation (Morgan‐Kaufmann, 1992)]. The solution is not complete, chiefly because in making the defaults model‐theoretic, we can only add conjunctive information to our models. This is in contrast to default theories, where extensions can contain disjunctive formulas, and therefore disjunctive information. Our proposal to treat the problem of open defaults uses a semantic notion of nonmonotonic entailment for our logic, related to the idea of “only knowing”. Our notion is “only having information” given by a formula. We discuss the differences between this and “minimal‐knowledge” ideas. Finally, we consider the Kraus–Lehmann–Magidor [S. Kraus, D. Lehmann and M. Magidor, Nonmonotonic reasoning, preferential models, and cumulative logics, Artificial Intelligence 44 (1990) 167–207] axioms for preferential consequence relations. We find that our consequence relation satisfies the most basic of the laws, and the Or law, but it does not satisfy the law of Cut, nor the law of Cautious Monotony. On the other hand, we give intuitive examples using our system which on the surface seem to violate these two laws. We make some comparisons, using our examples, to probabilistic interpretations for which these laws are true, and we compare our models to the cumulative models of Kraus, Lehmann, and Magidor. We also show sufficient conditions for the laws to hold. These involve limiting the use of disjunction in our formulas in one way or another. We show how to make use of the theory of complete partially ordered sets, or domain theory. We can augment any Scott domain with a default set. We state a version of Reiter’s extension operator on arbitrary domains as well. This version makes clear the basic order‐theoretic nature of Reiter’s definitions. A three‐variable function is involved. Finding extensions corresponds to taking fixed points twice, with respect to two of these variables. In the special case of precondition‐free defaults, a general relation on Scott domains induced from the set of defaults is shown to characterize extensions. We show how a general notion of domain theory, the logic induced from the Scott topology on a domain, guides us to a correct notion of “affirmable sentence” in a specific case such as our first‐order systems. We also prove our consequence laws in such a way that they hold not only in first‐order systems, but in any logic derived from the Scott topology on an arbitrary domain.
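For readers unfamiliar with the semantic backbone, here is a minimal sketch of the strong Kleene three-valued connectives, with ‘undefined’ encoded as None; the encoding and the toy default application are our assumptions, not the paper's machinery.

```python
# Strong Kleene three-valued connectives with None standing for 'undefined'.
def k_not(a):
    return None if a is None else (not a)

def k_and(a, b):
    if a is False or b is False: return False   # a definite False dominates
    if a is None or b is None:  return None     # otherwise undefinedness spreads
    return True

def k_or(a, b):
    return k_not(k_and(k_not(a), k_not(b)))     # De Morgan dual

# A default in the paper's spirit refines a *partial model*: it may turn
# None into a definite value, but never flips an already-defined one.
model = {"p": None, "q": True}
model["p"] = True if model["p"] is None else model["p"]  # apply default 'p'
print(k_and(model["p"], model["q"]), k_or(False, None))  # True None
```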

8.
The notion of a ‘symbol’ plays an important role in the disciplines of Philosophy, Psychology, Computer Science, and Cognitive Science. However, there is comparatively little agreement on how this notion is to be understood, either between disciplines, or even within particular disciplines. This paper does not attempt to defend some putatively ‘correct’ version of the concept of a ‘symbol.’ Rather, some terminological conventions are suggested, some constraints are proposed and a taxonomy of the kinds of issue that give rise to disagreement is articulated. The goal here is to provide something like a ‘geography’ of the various notions of ‘symbol’ that have appeared in the various literatures, so as to highlight the key issues and to permit the focusing of attention upon the important dimensions. In particular, the relationship between ‘tokens’ and ‘symbols’ is addressed. The issue of designation is discussed in some detail. The distinction between simple and complex symbols is clarified and an apparently necessary condition for a system to be potentially symbol- or token-bearing is introduced.

9.
In this paper, we address the question of how flesh-and-blood decision makers manage the combinatorial explosion in scenario development for decision making under uncertainty. Our first assumption is that decision makers try to undertake ‘robust’ actions. For the decision maker, a robust action is one that has sufficiently good results whatever events occur. We examine the psychological as well as the theoretical problems raised by the notion of robustness. Finally, we address the false sense of ‘risk control’ reported by decision makers. We argue that this feeling results from the belief that one can postpone action until after nature moves. Such ‘action postponement’ amounts to changing look-ahead reasoning into diagnosis. We illustrate these ideas in the framework of software development and examine some possible implications for requirements analysis.
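One plausible formalization of the paper's notion, ours rather than the authors': an action is ‘robust’ when its worst-case outcome over all events still clears an aspiration threshold. The hypothetical payoff table below also illustrates the postponement illusion the paper diagnoses.

```python
# A 'robust' action: its minimum payoff over all events meets a threshold.
def robust_actions(payoff, actions, events, threshold):
    """payoff[(action, event)] -> outcome value; keep maximin-acceptable actions."""
    return [a for a in actions
            if min(payoff[(a, e)] for e in events) >= threshold]

payoff = {("ship_early", "demand_high"): 8, ("ship_early", "demand_low"): 5,
          ("wait",       "demand_high"): 10, ("wait",      "demand_low"): 1}
print(robust_actions(payoff, ["ship_early", "wait"],
                     ["demand_high", "demand_low"], threshold=4))
# ['ship_early'] -- 'wait' only looks better if one could act after nature
# moves, which is exactly the postponement illusion the paper warns against.
```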

10.
Heim (1983) suggested that the analysis of presupposition projection requires that the classical notion of meanings as truth conditions be replaced with a dynamic notion of meanings as Context Change Potentials. But as several researchers (including Heim herself) later noted, the dynamic framework is insufficiently predictive: although it allows one to state that, say, the dynamic effect of F and G is to first update a Context Set C with F and then with G (i.e., C[F and G] = C[F][G]), it fails to explain why there couldn’t be a ‘deviant’ conjunction and* which performed these operations in the opposite order (i.e., C[F and* G] = C[G][F]). We provide a formal introduction to a competing framework, the Transparency theory, which addresses this problem. Unlike dynamic semantics, our analysis is fully classical, i.e., bivalent and static, and it derives the projective behavior of connectives from their bivalent meaning and their syntax. We concentrate on the formal properties of a simple version of the theory, and we prove that (i) full equivalence with Heim’s results is guaranteed in the propositional case (Theorem 1), and (ii) the equivalence can be extended to the quantificational case (for any generalized quantifiers), but only when certain conditions are met (Theorem 2).
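A toy encoding (ours) shows both the dynamic update and the explanatory gap. Context sets are sets of worlds; an update filters the set but is defined only when the context already entails the sentence's presupposition. Left-to-right conjunction succeeds where the hypothetical and* fails, yet nothing in the framework itself rules and* out.

```python
# Heim-style Context Change Potentials in miniature (our own toy encoding).
worlds = [{"rain": r, "wet": w} for r in (0, 1) for w in (0, 1)]

def update(C, truth, presup=lambda w: True):
    if not all(presup(w) for w in C):
        return None                    # presupposition failure: update undefined
    return [w for w in C if truth(w)]

F = lambda w: w["rain"] == 1                 # "it is raining"
G_truth  = lambda w: w["wet"] == 1           # "the rain is wetting the ground"
G_presup = lambda w: w["rain"] == 1          # ...which presupposes rain

# C[F and G] = C[F][G]: updating with F first satisfies G's presupposition.
print(update(update(worlds, F), G_truth, G_presup))   # [{'rain': 1, 'wet': 1}]
# The 'deviant' C[F and* G] = C[G][F] crashes at the first step: G's
# presupposition is not met in the initial context. But the framework
# itself never explains why natural language lacks such an and*.
print(update(worlds, G_truth, G_presup))              # None
```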

11.
Answer set programming (ASP) emerged in the late 1990s as a new logic programming paradigm that has been successfully applied in various application domains. Motivated also by the availability of efficient solvers for propositional satisfiability (SAT), various reductions from logic programs to SAT were introduced. All these reductions, however, either are limited to a subclass of logic programs, or introduce new variables, or may produce exponentially bigger propositional formulas. In this paper, we present a SAT-based procedure, called ASPSAT, that (1) deals with any (nondisjunctive) logic program, (2) works on a propositional formula without additional variables (except for those possibly introduced by the clause form transformation), and (3) is guaranteed to work in polynomial space. From a theoretical perspective, we prove soundness and completeness of ASPSAT. From a practical perspective, we have (1) implemented ASPSAT in Cmodels, (2) extended the basic procedures in order to incorporate the most popular SAT reasoning strategies, and (3) conducted an extensive comparative analysis involving other state-of-the-art answer set solvers. The experimental analysis shows that our solver is competitive with the other solvers we considered, and that the reasoning strategies that work best on ‘small but hard’ problems are ineffective on ‘big but easy’ problems and vice versa.
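The bridge such procedures start from can be sketched with the Clark completion: for a tight program, the models of the completion are exactly the answer sets (Fages' theorem). The miniature below is illustrative, not the Cmodels implementation; a real procedure would hand the completion to a SAT solver rather than enumerate models.

```python
from itertools import product

# A rule is (head, positive_body, negative_body); 'not' is default negation.
program = [("a", ["b"], []),        # a :- b.
           ("b", [], ["c"]),        # b :- not c.
           ("c", [], ["b"])]        # c :- not b.
atoms = {"a", "b", "c"}

def completion_holds(model):
    """Clark completion: each atom is true iff some rule for it fires."""
    for x in atoms:
        bodies = [all(model[p] for p in pos) and all(not model[n] for n in neg)
                  for (h, pos, neg) in program if h == x]
        if model[x] != any(bodies):
            return False
    return True

# Enumerate the completion's models; for this tight program they are
# exactly the answer sets: {a, b} and {c}.
for bits in product([False, True], repeat=3):
    m = dict(zip(sorted(atoms), bits))
    if completion_holds(m):
        print({x for x in atoms if m[x]})
```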

12.
This paper discusses the domestication of Information and Communication Technologies (ICTs), particularly their use, in UK households, reporting on research undertaken between 1998 and 2004. Issues raised are linked to the dominant discourse of the ‘digital divide’, which in the UK means engaging with ICTs in a ‘meaningful’ way to ensure the economic and social well-being of UK plc (public limited company: in the UK this refers to companies whose shares can be sold to the public; the acronym is used here ironically, to indicate the government's motivation to brand and promote the UK as a whole). Utilising a framework for understanding digital inequality and the ‘deepening divide’, domestication theory is applied to discuss motivational, material and physical, skills and usage access in the gendered household, critically contrasting this approach to ‘smart house’ research. This qualitative enquiry contributes to the neglected area of domestication studies in Information Systems research.

13.
The purpose of this paper is to propose an algorithm for external performance evaluation in the area of logistics from the retailers' viewpoint under a fuzzy environment. The fundamental concepts we adopt include factor analysis, the eigenvector method, the fuzzy Delphi method, fuzzy set theory and the multi-criteria decision-making method. We use factor analysis to condense twenty external performance sub-criteria into six criteria to construct the hierarchical structure for the external performance evaluation of distribution centers. The fuzzy Delphi method is integrated with the eigenvector method to form a set of pooled weights for the extracted criteria. The concepts of triangular fuzzy numbers and linguistic variables are used to assess the preference ratings of the linguistic variables ‘importance’ and ‘appropriateness’. Through hierarchy integration, we obtain the final scores of the distribution centers' performance. We then use a revised version of Chang and Chen's ranking method to rank the final scores and choose the best distribution center in the area of logistics management.
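The fuzzy machinery reduces to a few arithmetic steps, sketched below in Python. The linguistic scale values and the centroid defuzzification rule are our assumptions for illustration; the paper's actual scales and the revised Chang-Chen ranking are more elaborate.

```python
# Triangular fuzzy numbers (l, m, u) for linguistic ratings (values assumed).
linguistic = {"low": (0.0, 0.1, 0.3), "medium": (0.3, 0.5, 0.7),
              "high": (0.7, 0.9, 1.0)}

def fuzzy_mean(tfns):
    """Aggregate experts' triangular ratings component-wise (fuzzy Delphi style)."""
    n = len(tfns)
    return tuple(sum(t[i] for t in tfns) / n for i in range(3))

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number (l, m, u)."""
    l, m, u = tfn
    return (l + m + u) / 3

experts = ["high", "medium", "high"]          # three experts rate one criterion
agg = fuzzy_mean([linguistic[e] for e in experts])
print(agg, round(defuzzify(agg), 3))          # pooled weight for that criterion
```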

14.
A designerly critique on enchantment
To develop the concept of user experience in HCI, McCarthy et al. introduce the notion of enchantment in interaction design. They describe five sensibilities that support exploration and evaluation in design for enchantment. In this paper, we discuss design for enchantment in light of our approach to design for interaction, called design for meaningful mediation. Based on our experiences from case studies, we argue that ‘considering the whole person with feelings, desires and anxieties’, one of the sensibilities McCarthy et al. formulate, influences the desirability and realisation of the other four sensibilities. By way of case studies, we show how we explored the link between ‘the whole person’ and desired interaction experience in a designerly way. We place enchantment in a context of other interaction experiences and demonstrate possible design techniques relevant to design for interaction experiences, including enchantment.

15.
Basic principles of mechanical theorem proving in elementary geometries
At the end of 1976 and the beginning of 1977, the author discovered a mechanical method for proving theorems in elementary geometries. This method can be applied to various unordered elementary geometries satisfying the Pascalian Axiom, or to theorems not involving the concept of ‘order’ (e.g., that c is ‘between’ a and b) in various elementary geometries. In Section 4 we give detailed proofs of the basic principles underlying this method. In Sections 2 and 3 we present the theory of well-ordering of polynomials and a constructive theory of algebraic varieties. Our method is based on these theories, both of which build on the work of J. F. Ritt. In Section 5 we use Morley's theorem and the Pascal-conic theorem discovered by the author to illustrate the computer implementation of the method.
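The engine of the method is successive pseudo-division of the conclusion polynomial by a triangularized set of hypothesis polynomials; the statement follows when the final pseudo-remainder is zero. A toy illustration with sympy (the example ‘geometry’ is ours, chosen for brevity):

```python
from sympy import symbols, prem

x, y = symbols("x y")

# Hypotheses in triangular form: h1 constrains x, h2 then constrains y.
h1 = x**2 - 2          # x is a square root of 2
h2 = y - x             # y equals x
conclusion = y**2 - 2  # claim: y^2 = 2

# Pseudo-divide the conclusion by h2 w.r.t. y, then by h1 w.r.t. x.
r = prem(conclusion, h2, y)   # remainder: x**2 - 2
r = prem(r, h1, x)
print(r)  # 0 -> the conclusion follows from the hypotheses
```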

16.
By abstracting away from a particular specification language and considering a ‘specification’ to be just a set of implementations, one can define a partial order on specification languages that reflects their expressive power. In addition, one can show that there is no universal specification language that can express all such ‘specifications’.
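The abstraction fits in a few lines (a toy encoding of ours, finite where the paper's argument concerns infinite sets): an implementation is an element, a ‘specification’ is a set of implementations, and one language is at most as expressive as another when every specification of the first is a specification of the second.

```python
# Specifications as sets of implementations; languages as families of sets.
L1 = [{"i1"}, {"i1", "i2"}]                      # what language 1 can express
L2 = [{"i1"}, {"i1", "i2"}, {"i2", "i3"}]        # language 2 expresses more

def at_most_as_expressive(A, B):
    return all(spec in B for spec in A)

print(at_most_as_expressive(L1, L2), at_most_as_expressive(L2, L1))
# True False -- L1 sits strictly below L2 in the expressiveness order.
# With infinitely many implementations there are uncountably many
# 'specifications' but only countably many finite descriptions, which is
# one way to see why no universal language can express them all.
```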

17.
The problem of the ‘information content’ of an information system appears elusive. In the field of databases, the information content of a database has been taken to be the database instance. We argue that this view misses two fundamental points. One is a convincing conception of the phenomenon of information in databases, especially a properly defined notion of ‘information content’. The other is a framework for reasoning about information content. In this paper, we suggest a modification of the well-known definition of ‘information content’ given by Dretske (Knowledge and the Flow of Information, 1981). We then define what we call the ‘information content inclusion’ relation (IIR for short) between two random events, and present a set of inference rules for reasoning about information content, which we call the IIR rules. We then explore how these ideas and rules may be used in a database setting to derive otherwise hidden information from a given set of IIR. A prototype is presented, which shows how IIR-reasoning might be exploited in a database setting, including the relationship between real-world events and database values.
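On our set-based reading of Dretske's condition (event A carries the information that B when P(B | A) = 1), the IIR relation and one of its inference rules can be sketched as follows; the encoding is illustrative, not the paper's formal system.

```python
# Events as sets of outcomes; IIR holds when A guarantees B.
def carries(A, B):
    """A carries the information that B when P(B | A) = 1, i.e. (in this
    set-of-outcomes reading) every outcome in A is also in B."""
    return A <= B

A = {0, 1}            # e.g. outcomes where a database field reads 'paid'
B = {0, 1, 2, 3}      # outcomes where the customer actually paid
C = {0, 1, 2, 3, 4}   # outcomes where an order exists

# Transitivity, one plausible IIR rule: from A->B and B->C, infer A->C.
if carries(A, B) and carries(B, C):
    assert carries(A, C)
    print("derived: A carries the information that C")
```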

18.
This paper develops a semantics with control over scope relations, using Vermeulen's stack-valued assignments as information states. This makes available a limited form of scope reuse and name switching. The goal is a general system that restricts the available scoping effects to those characteristic of natural language. The resulting system is called Scope Control Theory, since it provides a theory about what scope has to be like in natural language. The theory is shown to replicate a wide range of grammatical dependencies, including options for, and constraints on, ‘donkey’, ‘binding’, ‘movement’, ‘Control’ and ‘scope marking’ dependencies.
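A miniature of stack-valued assignments (our toy encoding): an information state maps each variable to a stack of values, so re-binding pushes a new, more local value rather than destroying the old one, and closing a scope pops it; this is the limited scope reuse the paper exploits.

```python
# An information state with stack-valued assignments.
class State:
    def __init__(self):
        self.stacks = {}
    def push(self, var, value):               # open a new scope for var
        self.stacks.setdefault(var, []).append(value)
    def pop(self, var):                       # close var's current scope
        self.stacks[var].pop()
    def value(self, var):                     # the most local binding wins
        return self.stacks[var][-1]

s = State()
s.push("x", "a farmer")      # outer binding of x
s.push("x", "a donkey")      # inner rebinding shadows, but does not destroy
print(s.value("x"))          # a donkey
s.pop("x")                   # inner scope closes...
print(s.value("x"))          # ...and the outer antecedent is available again
```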

19.
The core cognitive ability to perceive and synthesize ‘shapes’ underlies all our basic interactions with the world, be it shaping one's fingers to grasp a ball or shaping one's body while imitating a dance. In this article, we describe our attempts to understand this multifaceted problem by creating a primitive shape perception/synthesis system for the baby humanoid iCub. We specifically deal with the scenario of iCub gradually learning to draw or scribble shapes of gradually increasing complexity, after observing a demonstration by a teacher, by using a series of self-evaluations of its performance. Learning to imitate a demonstrated human movement (specifically, visually observed end-effector trajectories of a teacher) can be considered a special case of the proposed computational machinery. This architecture is based on a loop of transformations that express the embodiment of the mechanism but, at the same time, are characterized by scale invariance and motor equivalence. The following transformations are integrated in the loop: (a) Characterizing in a compact, abstract way the ‘shape’ of a demonstrated trajectory using a finite set of critical points, derived using catastrophe theory: the Abstract Visual Program (AVP); (b) Transforming the AVP into a Concrete Motor Goal (CMG) in iCub's egocentric space; (c) Learning to synthesize a continuous virtual trajectory similar to the demonstrated shape using the discrete set of critical points defined in the CMG; (d) Using the virtual trajectory as an attractor for iCub's internal body model, implemented by the Passive Motion Paradigm, which includes a forward and an inverse motor model; (e) Forming an Abstract Motor Program (AMP) by deriving the ‘shape’ of the self-generated movement (the forward model output) using the same technique employed for creating the AVP; (f) Comparing the AVP and AMP in order to generate an internal performance score and hence close the learning loop. The resulting computational framework further combines three crucial streams of learning: (1) motor babbling (self-exploration), (2) imitative action learning (social interaction) and (3) mental simulation, to give rise to sensorimotor knowledge that is endowed with seamless compositionality, generalization capability and body-effectors/task independence. The robustness of the computational architecture is demonstrated by means of several experimental trials of gradually increasing complexity using a state-of-the-art humanoid platform.
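Step (a) of the loop, compressing a trajectory into a compact set of critical points, can be caricatured in a few lines. Real catastrophe-theory classification distinguishes richer singularity types; the sketch below (entirely ours) merely marks sign changes of the velocity components, which already finds the corner of a scribbled ‘V’.

```python
# Mark trajectory points where a velocity component changes sign.
traj = [(t / 20, abs(t / 20 - 0.5)) for t in range(21)]   # a 'V' shape

def critical_points(points):
    crit = [points[0]]                       # endpoints always count
    for i in range(1, len(points) - 1):
        (x0, y0), (x1, y1), (x2, y2) = points[i - 1], points[i], points[i + 1]
        if (y1 - y0) * (y2 - y1) < 0 or (x1 - x0) * (x2 - x1) < 0:
            crit.append(points[i])           # a velocity component flips sign
    crit.append(points[-1])
    return crit

print(critical_points(traj))   # the start, the corner at (0.5, 0.0), the end
```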

20.
Encryption ‘distributing over pairs’ is a technique employed in several cryptographic protocols. We show that unification is decidable for an equational theory HE specifying such an encryption. The method consists in transforming any given problem in such a way that the resulting problem can be solved by combining graph-based reasoning on its equations involving the homomorphisms with syntactic reasoning on its pairings. We show HE-unification to be NP-hard and in EXPTIME. We also indicate, briefly, how to extend HE-unification to Cap unification modulo HE, which can be used as a tool for modeling and analyzing cryptographic protocols where encryption follows the ECB mode, i.e., is done block-wise on messages.
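The theory HE amounts to the rewrite rule enc(pair(x, y), k) → pair(enc(x, k), enc(y, k)), i.e. ECB-style blockwise encryption. Normalization under that rule, the first step before any unification reasoning, can be sketched as follows (the term encoding is ours):

```python
# Push encryption through pairs until no redex remains.
def normalize(t):
    """Terms: ('enc', m, k), ('pair', a, b), or an atom (str)."""
    if isinstance(t, str):
        return t
    if t[0] == "enc":
        m, k = normalize(t[1]), normalize(t[2])
        if isinstance(m, tuple) and m[0] == "pair":
            return ("pair", normalize(("enc", m[1], k)),
                            normalize(("enc", m[2], k)))
        return ("enc", m, k)
    return ("pair", normalize(t[1]), normalize(t[2]))

t = ("enc", ("pair", "m1", ("pair", "m2", "m3")), "k")
print(normalize(t))
# ('pair', ('enc', 'm1', 'k'), ('pair', ('enc', 'm2', 'k'), ('enc', 'm3', 'k')))
```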
