Similar Literature
20 similar documents found.
1.
The notion of obvious inference in predicate logic is discussed from the viewpoint of proof-checker applications in logic and mathematics education. A class of inferences in predicate logic is defined, and it is proposed to identify it with the class of obvious logical inferences. The definition is compared with other approaches. The algorithm for implementing the obviousness decision procedure follows directly from the definition.

2.
This paper aims to provide a basis for renewed talk about use in computing. Four current discourse arenas are described. Different intentions manifest in each arena are linked to failures in translation, as different terminologies cross disciplinary and national boundaries non-reflexively. Analysis of transnational use discourse dynamics shows much miscommunication. Conflicts like that between the Scandinavian System Development School and the usability approach have less current salience. Renewing our talk about use is essential to a participatory politics of information technology and will lead to clearer perception of the implications of letting new systems become primary media of social interaction.

3.
T. Cox 《Virtual Reality》2000,5(4):215-222
This paper gives a broad overview of the technology and market for on-line and multiplayer computer gaming. Some economic considerations and their influence on the choice of technologies are examined. Particular attention is given to the massively-multiplayer and persistent world type of games, and the special problems that arise in these environments. Lastly, some ongoing problems are investigated, particularly the thorny issue of cheating in multiplayer games.

4.
This paper presents enhancements for robust two-and-three-quarter-dimensional meshing, including: (1) automated interval assignment by integer programming for submapped surfaces and volumes, (2) surface submapping, and (3) volume submapping. An introduction to the simplex method, an optimization technique used in integer programming, is presented. Simplification of complex geometry is required for the formulation of the integer programming problem. A method of i-j unfolding is defined which explains how irregular geometry can be realigned into a simplified form that is suitable for submap interval assignment solutions. Also presented is the process by which submapping eliminates the decomposition of surface geometry, through a pseudodecomposition process, producing suitable mapped meshes. The process of submapping involves the creation of interpolated virtual edges, user-defined vertex types and i-j-k space traversals. The creation of interpolated virtual edges is the method by which submapping automatically subdivides surface geometry. The interpolated virtual edge is formulated according to an interpolation scheme using the node discretization of curves on the surface. User-defined vertex types allow direct user control of surface decomposition and interval assignment by modifying i-j-k space traversals. Volume submapping takes the geometry decomposition to a higher level by using mapped virtual surfaces to eliminate decomposition of complex volumes.
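As a rough illustration of the interval-assignment idea, the toy sketch below is entirely hypothetical (curve names, lengths, and the brute-force search are assumptions, not the paper's integer-programming formulation): it picks integer interval counts for the four curves of a single mapped quadrilateral so that opposite sides carry matching counts while staying close to the counts implied by a target element size.

```python
# Toy interval assignment (hypothetical data, not the paper's formulation):
# choose integer interval counts for the four curves of one mapped quad so
# that opposite sides carry equal counts, minimizing deviation from the
# counts implied by a target element size. A real system would pose this as
# an integer program over many surfaces and volumes.
from itertools import product

curves = {"bottom": 4.0, "top": 3.6, "left": 2.0, "right": 2.2}   # curve lengths
target_size = 0.5                                                 # desired edge length
goal = {c: max(1, round(length / target_size)) for c, length in curves.items()}

best, best_cost = None, float("inf")
for n_u, n_v in product(range(1, 21), repeat=2):
    # Mapping constraint: bottom/top share n_u intervals, left/right share n_v.
    assign = {"bottom": n_u, "top": n_u, "left": n_v, "right": n_v}
    cost = sum(abs(assign[c] - goal[c]) for c in curves)
    if cost < best_cost:
        best, best_cost = assign, cost

print(best, "deviation:", best_cost)
# -> {'bottom': 7, 'top': 7, 'left': 4, 'right': 4} deviation: 1
```

A full formulation would replace the exhaustive search with an integer program whose equality constraints couple opposite sides across all submapped surfaces and volumes at once.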

5.
Horst, Steven 《Minds and Machines》1999,9(3):347-381
Over the past several decades, the philosophical community has witnessed the emergence of an important new paradigm for understanding the mind. The paradigm is that of machine computation, and its influence has been felt not only in philosophy, but also in all of the empirical disciplines devoted to the study of cognition. Of the several strategies for applying the resources provided by computer and cognitive science to the philosophy of mind, the one that has gained the most attention from philosophers has been the Computational Theory of Mind (CTM). CTM was first articulated by Hilary Putnam (1960, 1961), but finds perhaps its most consistent and enduring advocate in Jerry Fodor (1975, 1980, 1981, 1987, 1990, 1994). It is this theory, and not any broader interpretations of what it would be for the mind to be a computer, that I wish to address in this paper. What I shall argue here is that the notion of symbolic representation employed by CTM is fundamentally unsuited to providing an explanation of the intentionality of mental states (a major goal of CTM), and that this result undercuts a second major goal of CTM, sometimes referred to as the vindication of intentional psychology. This line of argument is related to the discussions of derived intentionality by Searle (1980, 1983, 1984) and Sayre (1986, 1987). But whereas those discussions seem to be concerned with the causal dependence of familiar sorts of symbolic representation upon meaning-bestowing acts, my claim is rather that there is not one but several notions of meaning to be had, and that the notions that are applicable to symbols are conceptually dependent upon the notion that is applicable to mental states in the fashion that Aristotle referred to as paronymy. That is, an analysis of the notions of meaning applicable to symbols reveals that they contain presuppositions about meaningful mental states, much as Aristotle's analysis of the sense of 'healthy' that is applied to foods reveals that it means 'conducive to having a healthy body', and hence any attempt to explain mental semantics in terms of the semantics of symbols is doomed to circularity and regress. I shall argue, however, that this does not have the consequence that computationalism is bankrupt as a paradigm for cognitive science, as it is possible to reconstruct CTM in a fashion that avoids these difficulties and makes it a viable research framework for psychology, albeit at the cost of losing its claims to explain intentionality and to vindicate intentional psychology. I have argued elsewhere (Horst, 1996) that local special sciences such as psychology do not require vindication in the form of demonstrating their reducibility to more fundamental theories, and hence failure to make good on these philosophical promises need not compromise the broad range of work in empirical cognitive science motivated by the computer paradigm in ways that do not depend on these problematic treatments of symbols.

6.
Modular Control and Coordination of Discrete-Event Systems
In the supervisory control of discrete-event systems based on controllable languages, a standard way to handle state explosion in large systems is by modular supervision: either horizontal (decentralized) or vertical (hierarchical). However, unless all the relevant languages are prefix-closed, a well-known potential hazard with modularity is that of conflict. In decentralized control, modular supervisors that are individually nonblocking for the plant may nevertheless produce blocking, or even deadlock, when operating on-line concurrently. Similarly, a high-level hierarchical supervisor that predicts nonblocking at its aggregated level of abstraction may inadvertently admit blocking in a low-level implementation. In two previous papers, the authors showed that nonblocking hierarchical control can be guaranteed provided high-level aggregation is sufficiently fine; the appropriate conditions were formalized in terms of control structures and observers. In this paper we apply the same technique to decentralized control, when specifications are imposed on local models of the global process; in this way we remove the restriction in some earlier work that the plant and specification (marked) languages be prefix-closed. We then solve a more general problem of coordination: namely, how to determine a high-level coordinator that forestalls conflict in a decentralized architecture when it potentially arises, but is otherwise minimally intrusive on low-level control action. Coordination thus combines both vertical and horizontal modularity. The example of a simple production process is provided as a practical illustration. We conclude with an appraisal of the computational effort involved.
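To make the notion of conflict concrete, here is a small, self-contained sketch (my own construction, not the authors' method; the automaton encoding and the assumption that private events belong to only one supervisor are mine): two modular supervisors are combined by synchronous product, and the product is checked for nonblocking, i.e., every reachable state must still be able to reach a marked state.

```python
# Illustrative only (not the authors' algorithm). A supervisor is a dict:
# {"states": set, "init": state, "marked": set, "trans": {(state, event): state}}.
# Events in `shared` synchronize; private events are assumed to appear in
# only one supervisor's alphabet and interleave freely.
from itertools import product

def compose(a, b, shared):
    events = {e for (_, e) in a["trans"]} | {e for (_, e) in b["trans"]}
    trans = {}
    for s1, s2 in product(a["states"], b["states"]):
        for e in events:
            t1, t2 = a["trans"].get((s1, e)), b["trans"].get((s2, e))
            if e in shared:
                if t1 is not None and t2 is not None:
                    trans[((s1, s2), e)] = (t1, t2)
            elif t1 is not None:
                trans[((s1, s2), e)] = (t1, s2)
            elif t2 is not None:
                trans[((s1, s2), e)] = (s1, t2)
    marked = {(m1, m2) for m1 in a["marked"] for m2 in b["marked"]}
    return {"init": (a["init"], b["init"]), "trans": trans, "marked": marked}

def nonblocking(sup):
    # Reachable states from the initial state.
    reach, stack = {sup["init"]}, [sup["init"]]
    while stack:
        s = stack.pop()
        for (src, _), dst in sup["trans"].items():
            if src == s and dst not in reach:
                reach.add(dst); stack.append(dst)
    # Co-reachable states: those from which a marked state can be reached.
    coreach, changed = set(sup["marked"]), True
    while changed:
        changed = False
        for (src, _), dst in sup["trans"].items():
            if dst in coreach and src not in coreach:
                coreach.add(src); changed = True
    return reach <= coreach   # False means the modular supervisors conflict
```

A coordinator in the paper's sense would then restrict the composed behaviour just enough to exclude the blocking states, while leaving the low-level supervisors otherwise untouched.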

7.
This paper examines the use of a series of three low tech interactive assemblies that have been exhibited by the authors in a range of fairs, expositions and galleries. The paper does not present novel technical developments, but rather uses the low tech assemblies to help scope out the design space for CSCW in museums and galleries and to investigate the ways in which people collaboratively encounter and explore technological exhibits in museums and galleries. The bulk of the paper focuses on the analysis of the use of one interactive installation that was exhibited at the Sculpture, Objects and Functional Art (SOFA) Exposition in Chicago, USA. The study uses audio–visual recordings of interaction with and around the work to consider how people, in and through their interaction with others, make sense of an assembly of traditional objects and video technologies. The analysis focuses on the organised practices of assembly and how assembling the relationship between different parts of the work is interactionally accomplished. The analysis is then used to develop a series of design sensitivities to inform the development of technological assemblies to engender informal interaction and sociability in museums and galleries.

8.
The paper analyses restructuring processes occurring with the introduction of information technologies into firms in Austria and assesses how far the evidence lends support to the thesis of a fundamental change in rationalization patterns as postulated by continental industrial sociologists claiming the emergence of a novel type of systemic rationalization. Based on a research perspective putting emphasis on several levels of social mediation of technological change, the broad conclusion is the following: there are clear indications of a novel systemic approach to rationalization, but the associated forms of work organization show substantial variation. The analysis of the influence of national-level institutions, industry- and firm-specific conditions, and their role in micro-political processes of system and work design, points towards an underutilization of work humanization potentials and suggests an increase in skill supply as one of the possible intervention strategies.

9.
Modeling and programming tools for neighborhood search often support invariants, i.e., data structures specified declaratively and automatically maintained incrementally under changes. This paper considers invariants for longest paths in directed acyclic graphs, a fundamental abstraction for many applications. It presents bounded incremental algorithms for arc insertion and deletion which run in O(‖δ‖ + |δ| log |δ|) time and O(‖δ‖) time respectively, where |δ| and ‖δ‖ are measures of the change in the input and output. The paper also shows how to generalize the algorithms to various classes of multiple insertions/deletions encountered in scheduling applications. Preliminary experimental results show that the algorithms behave well in practice.
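For a flavour of what such an invariant does, the following sketch (an assumed, simplified representation; it handles only arc insertion and makes no attempt at the paper's bounded-incremental guarantees or at deletion) maintains longest-path lengths from a source in a DAG and repairs them after each insertion by propagating only improved values.

```python
# Simplified sketch, not the paper's algorithm: maintain longest-path lengths
# from a single source in a DAG and repair them incrementally on arc insertion.
from collections import defaultdict

class LongestPathInvariant:
    def __init__(self, source):
        self.succ = defaultdict(dict)                     # succ[u][v] = arc length
        self.dist = defaultdict(lambda: float("-inf"))    # longest distance from source
        self.dist[source] = 0.0

    def insert_arc(self, u, v, length):
        self.succ[u][v] = length
        self._propagate(u)

    def _propagate(self, u):
        # Re-relax outgoing arcs; only vertices whose value improves are
        # revisited, so the work stays proportional to the affected region.
        stack = [u]
        while stack:
            x = stack.pop()
            for y, w in self.succ[x].items():
                cand = self.dist[x] + w
                if cand > self.dist[y]:
                    self.dist[y] = cand
                    stack.append(y)

# Usage:
lp = LongestPathInvariant("s")
lp.insert_arc("s", "a", 3.0)
lp.insert_arc("a", "b", 2.0)
print(lp.dist["b"])   # 5.0
```

Arc deletion is the harder direction, since shortened values must be recomputed from the surviving predecessors; that is where the paper's bounded incremental analysis does the real work.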

10.
We address the problem of training multilayer perceptrons to instantiate a target function. In particular, we explore the accuracy of the trained network on a test set of previously unseen patterns — the generalisation ability of the trained network. We systematically evaluate alternative strategies designed to improve the generalisation performance. The basic idea is to generate a diverse set of networks, each of which is designed to be an implementation of the target function. We then have a set of trained, alternative versions — a version set. The goal is to achieve useful diversity within this set, and thus generate potential for improved generalisation performance of the set as a whole when compared to the performance of any individual version. We define this notion of useful diversity, we define a metric for it, we explore a number of ways of generating it, and we present the results of an empirical study of a number of strategies for exploiting it to achieve maximum generalisation performance. The strategies encompass statistical measures as well as a selector-net approach which proves to be particularly promising. The selector net is a form of meta-net that operates in conjunction with a version set.
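A toy illustration of the version-set idea follows (my own sketch with scikit-learn, not the authors' experimental setup; the dataset, network sizes, and the pairwise-disagreement measure are assumptions): several MLPs are trained from different initialisations, disagreement on the test set serves as a crude diversity figure, and majority voting is compared with the best single version.

```python
# Hedged sketch of a "version set": train several MLPs from different random
# initialisations, measure pairwise disagreement as a rough diversity metric,
# and compare majority-vote accuracy with the best individual network.
import numpy as np
from itertools import combinations
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=600, noise=0.25, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

versions = [MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                          random_state=seed).fit(X_tr, y_tr) for seed in range(5)]
preds = np.array([m.predict(X_te) for m in versions])

diversity = np.mean([np.mean(preds[i] != preds[j])
                     for i, j in combinations(range(len(versions)), 2)])
individual = max(np.mean(p == y_te) for p in preds)
vote = np.mean((preds.mean(axis=0) > 0.5).astype(int) == y_te)  # majority vote

print(f"diversity={diversity:.3f}  best individual={individual:.3f}  vote={vote:.3f}")
```

The selector-net strategy described in the abstract would replace the fixed majority vote with a further trained network that learns which version to trust for a given input.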

11.
'Racial' disparities among cancers, particularly of the breast and prostate, are something of a mystery. For the US, in the face of slavery and its sequelae, centuries of interbreeding have greatly leavened genetic differences between Blacks and Whites, but marked contrasts in disease prevalence and progression persist. Adjustment for socioeconomic status and lifestyle, while statistically accounting for much of the variance in breast cancer, only begs the question of ultimate causality. Here we propose a more basic biological explanation that extends the theory of immune cognition to include an elaborate tumor control mechanism constituting the principal selection pressure acting on pathologically mutating cell clones. The interplay between them occurs in the context of an embedding, highly structured system of culturally-specific psychosocial stress. A rate distortion argument finds that the larger system is able to literally write an image of itself onto the disease process, in terms of enhanced risk behaviour, accelerated mutation rate, and depressed mutation control. The dynamics are analogous to punctuated equilibrium in simple evolutionary systems, accounting for the staged nature of disease progression. We conclude that 'social exposures' are, for human populations, far more than incidental cofactors in cancer etiology. Rather, they are part of the basic biology of the disorder. The aphorism that culture is as much a part of human biology as the enamel on our teeth appears literally true at a fundamental cellular level.

12.
Over recent years, noticeable theoretical effort has been devoted to understanding the role of networks' parameter spaces in neural learning. One contribution in this field concerns the study of weight flows on the Stiefel manifold, which is the natural algebraic structure of the parameter space in some unsupervised (information-theoretic) learning tasks. An algorithm belonging to the class of learning equations generating Stiefel flows is based on the rigid-body theory introduced by the present author in 1996. The aim of this Letter is to investigate the capability of a complex-weighted neuron, trained by rigid-body learning theory, with application to blind source separation of complex-valued independent signals for telecommunication systems.

13.
Algebraic and linear generators
Summary  We study some properties of algebraic and linear generators. We show that the algebraic (context-free) language E generated by the grammar S → aSbSc + d dominates every algebraic language under faithful sequential mappings. We deduce that, for every algebraic language L and every algebraic generator L′, there exists a faithful rational function τ such that L = τ(L′). This result, which does not hold for the family Lin of linear languages, allows us to show that no algebraic generator belongs to the family EDTOL. Finally, we prove that if L is a linear generator, then L# is a sequential generator for Lin.

14.
This paper is a discussion about how the Application Perspective works in practice. We talk about values and attitudes to system development and computer systems, and we illustrate how they have been carried out in practice with examples from the Florence project. The metaphors 'utensil' and 'epaulet' refer to questions about how we conceive the computer system we are to design in the system development process. Our experience is that, in the scientific community, technical challenges mean making computer systems that may be characterised as epaulets: they have technical, fancy features, but are not particularly useful. Making small, simple, but useful computer systems, more like utensils, does not give as much credit even if the development process may be just as challenging.

15.
Methods and Algorithms for Constraint-based Virtual Assembly
Constraint-based simulation is a fundamental concept used for assembly in a virtual environment. The constraints (axial, planar, etc.) are extracted from the assembly models in the CAD system and are simulated during the virtual assembly operation to represent the real-world operations. In this paper, we present an analysis of the combinations and order of application of axial and planar constraints used in assembly. Methods and algorithms for checking and applying the constraints in the assembly operation are provided. An object-oriented model for managing these constraints in the assembly operation is discussed.
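The sketch below gives one possible shape for such an object-oriented constraint model (class names and degree-of-freedom counts are hypothetical, not the paper's design). It also illustrates why the combination and order of constraints need careful analysis: naively summing the degrees of freedom removed by each constraint over-counts when constraints overlap.

```python
# Hypothetical object model for assembly constraints (not the paper's classes):
# each constraint reports the rigid-body degrees of freedom it removes.
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class Part:
    name: str
    dof: int = 6            # free rigid body: 3 translations + 3 rotations

class AssemblyConstraint(ABC):
    @abstractmethod
    def removed_dof(self) -> int: ...
    def apply(self, part: Part) -> None:
        part.dof = max(0, part.dof - self.removed_dof())

class AxialConstraint(AssemblyConstraint):
    """Make two cylinder axes coincident: removes 2 translations, 2 rotations."""
    def removed_dof(self) -> int:
        return 4

class PlanarConstraint(AssemblyConstraint):
    """Mate two planar faces: removes 1 translation, 2 rotations."""
    def removed_dof(self) -> int:
        return 3

peg = Part("peg")
for constraint in (AxialConstraint(), PlanarConstraint()):
    constraint.apply(peg)
# The naive count says 0 DOF remain, yet a real peg-in-hole mate can still spin
# about its axis (1 DOF): the two constraints remove overlapping rotations,
# which is why combinations and order of application must be analysed.
print(peg.name, "naive remaining DOF:", peg.dof)
```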

16.
Summary  Tsokos [12] showed the existence of a unique random solution of the random Volterra integral equation (*) x(t; ω) = h(t; ω) + ∫₀ᵗ k(t, τ; ω) f(τ, x(τ; ω)) dτ, where ω ∈ Ω, the supporting set of a probability measure space (Ω, A, P). It was required that f satisfy a Lipschitz condition in a certain subset of a Banach space. By using an extension of Banach's contraction-mapping principle, it is shown here that a unique random solution of (*) exists when f is uniformly locally Lipschitz in the same subset of the Banach space considered in [12].
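The fixed-point machinery behind such results can be seen numerically. The sketch below is a deterministic toy (the kernel, forcing term and nonlinearity are invented, and the random parameter ω is dropped): it runs the successive approximations whose convergence a contraction-mapping argument guarantees when f is suitably Lipschitz.

```python
# Deterministic toy, not the paper's random setting: Picard iteration for a
# Volterra integral equation x(t) = h(t) + integral_0^t k(t, s) f(s, x(s)) ds.
import numpy as np

t = np.linspace(0.0, 1.0, 201)
dt = t[1] - t[0]
h = lambda t: np.ones_like(t)          # forcing term h(t) = 1
k = lambda t, s: np.exp(-(t - s))      # kernel
f = lambda s, x: 0.5 * x               # Lipschitz nonlinearity (constant 1/2)

x = h(t)
for _ in range(50):                    # successive approximations
    integrand = k(t[:, None], t[None, :]) * f(t[None, :], x[None, :])
    integral = np.zeros_like(t)
    for i in range(1, len(t)):
        seg = integrand[i, : i + 1]    # integrate over s in [0, t_i]
        integral[i] = dt * (seg.sum() - 0.5 * (seg[0] + seg[-1]))  # trapezoid rule
    x_new = h(t) + integral
    if np.max(np.abs(x_new - x)) < 1e-12:
        break
    x = x_new

print("x(1) ≈", x[-1])
```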

17.
The primary purpose of parallel computation is the fast execution of computational tasks that are too slow to perform sequentially. However, it was shown recently that a second, equally important motivation for using parallel computers exists: within the paradigm of real-time computation, some classes of problems have the property that a solution to a problem in the class computed in parallel is better than the one obtained on a sequential computer. What represents a better solution depends on the problem under consideration. Thus, for optimization problems, better means closer to optimal. Similarly, for numerical problems, a solution is better than another one if it is more accurate. The present paper continues this line of inquiry by exploring another class enjoying the aforementioned property, namely, cryptographic problems in a real-time setting. In this class, better means more secure. A real-time cryptographic problem is presented for which the parallel solution is provably, considerably, and consistently better than a sequential one. It is important to note that the purpose of this paper is not to demonstrate merely that a parallel computer can obtain a better solution to a computational problem than one derived sequentially. The latter is an interesting (and often surprising) observation in its own right, but we wish to go further. It is shown here that the improvement in quality can be arbitrarily high (and certainly superlinear in the number of processors used by the parallel computer). This result is akin to superlinear speedup, a phenomenon itself originally thought to be impossible.

18.
We present a new definition of optimality intervals for the parametric right-hand side linear programming (parametric RHS LP) problem φ(λ) = min{cᵗx | Ax = b + λb̄, x ≥ 0}. We then show that an optimality interval consists either of a breakpoint or of the open interval between two consecutive breakpoints of the continuous piecewise linear convex function φ(λ). As a consequence, the optimality intervals form a partition of the closed interval {λ : |φ(λ)| < ∞}. Based on these optimality intervals, we also introduce an algorithm for solving the parametric RHS LP problem which requires an LP solver as a subroutine. If a polynomial-time LP solver is used to implement this subroutine, we obtain a substantial improvement on the complexity of those parametric RHS LP instances which exhibit degeneracy. When the number of breakpoints of φ(λ) is polynomial in terms of the size of the parametric problem, we show that the latter can be solved in polynomial time. This research was partially funded by the United States Navy-Office of Naval Research under Contract N00014-87-K-0202; its financial support is gratefully acknowledged.
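To see what the breakpoints of φ(λ) look like, here is a small exploratory sketch (hypothetical data, and a grid scan rather than the paper's exact breakpoint algorithm) that evaluates φ with an off-the-shelf LP solver used as a subroutine.

```python
# Exploratory sketch (not the paper's algorithm): evaluate the value function
# phi(lambda) = min{ c^T x : A x = b + lambda * b_bar, x >= 0 } on a grid and
# observe its piecewise linear, convex shape. The data are hypothetical.
import numpy as np
from scipy.optimize import linprog

c = np.array([1.0, 2.0])
A = np.array([[1.0, -1.0]])
b, b_bar = np.array([0.0]), np.array([1.0])

def phi(lam):
    res = linprog(c, A_eq=A, b_eq=b + lam * b_bar,
                  bounds=[(0, None)] * len(c), method="highs")
    return res.fun if res.status == 0 else float("inf")   # inf where infeasible

for lam in np.linspace(-2.0, 2.0, 9):
    print(f"lambda = {lam:+.2f}   phi = {phi(lam):.2f}")
# For this data phi(lambda) = -2*lambda for lambda < 0 and lambda for lambda >= 0,
# so the single breakpoint at lambda = 0 separates two optimality intervals.
```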

19.
In order to cope with changing health needs in the community, a holistic approach to AIDS prevention and control, with particular reference to essential quality, was introduced at an educational seminar at Hebei Medical University in China in 1996. We have identified three major points in the present study through the learning and research process: 1. the importance of cultural norms for the unification of science and technology is identified for the community approach; 2. community care emphasising human quality provides unity in diversity for the educational program; and 3. community control emphasising quality assurance demonstrates its effectiveness for program analysis from the viewpoint of human-centred systems.

20.
It is shown that the translation of an open default into a modal formula ∀x(Lα(x) ∧ LMβ₁(x) ∧ … ∧ LMβₘ(x) → w(x)) gives rise to an embedding of open default systems into non-monotonic logics.

