Similar Documents
 20 similar documents found.
1.
Deep Blue     
《Artificial Intelligence》2002,134(1-2):57-83
Deep Blue is the chess machine that defeated then-reigning World Chess Champion Garry Kasparov in a six-game match in 1997. There were a number of factors that contributed to this success, including:
  • a single-chip chess search engine,
  • a massively parallel system with multiple levels of parallelism,
  • a strong emphasis on search extensions,
  • a complex evaluation function, and
  • effective use of a Grandmaster game database.
This paper describes the Deep Blue system, and gives some of the rationale that went into the design decisions behind Deep Blue.
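The abstract lists search extensions among the contributing factors without showing how they fit into the search. Purely as a hedged illustration, and not Deep Blue's actual hardware search, the minimal negamax/alpha-beta sketch below searches positions that give check one ply deeper; the GameState interface and the extension budget are hypothetical.

```python
# Minimal negamax / alpha-beta search with a check-extension heuristic.
# The GameState interface is hypothetical; Deep Blue's hardware search was
# far more elaborate (quiescence search, singular extensions, parallel split, ...).

from typing import List, Protocol

class GameState(Protocol):
    def legal_moves(self) -> List["GameState"]: ...   # successor positions
    def in_check(self) -> bool: ...
    def evaluate(self) -> float: ...                   # static score, mover's viewpoint

def search(state: GameState, depth: int, alpha: float, beta: float,
           ext_budget: int = 3) -> float:
    children = state.legal_moves()
    if depth <= 0 or not children:
        return state.evaluate()
    for child in children:
        # Search extension: forcing replies (here, giving check) are searched
        # one ply deeper rather than being cut off at the nominal horizon.
        # The budget bounds total extension so the search always terminates.
        extension = 1 if child.in_check() and ext_budget > 0 else 0
        score = -search(child, depth - 1 + extension, -beta, -alpha,
                        ext_budget - extension)
        if score >= beta:
            return beta                                # fail-hard beta cutoff
        alpha = max(alpha, score)
    return alpha
```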

2.
Many engineering analysis problems that have the greatest value to customers fall outside the scope of existing off-the-shelf analysis tools. Typical reasons why existing tools fail to solve an engineering problem include:
  • not being able to model the physics of the problem;
  • a coupled multidisciplinary problem;
  • assembly-level physics;
  • a need to control the solution strategy;
  • a need to integrate disparate existing applications.
Enter CFEtools. CFEtools is a tool for the development of CAE applications and the enhancement of existing CAE applications. CFEtools enables the development of CAE capabilities two orders of magnitude faster than would be possible with traditional tools. For the first time, CFEtools makes custom-tailored analysis solutions economically feasible. Implemented as a C++ object-oriented library, CFEtools offers all the advantages of high-level, plug-and-play models with the flexibility of a modern programming language.
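The abstract characterizes CFEtools only by its goals, so the sketch below is purely hypothetical: it illustrates, under invented class and method names, what a "plug and play" composition of physics models into a custom coupled analysis might look like. None of these names are CFEtools' API, and the sketch is in Python rather than C++ for brevity.

```python
# Hypothetical sketch of composing reusable physics "models" into a custom
# assembly-level CAE analysis; all names are invented, not taken from CFEtools.

from abc import ABC, abstractmethod

class Model(ABC):
    """A pluggable piece of physics contributing residual terms to a coupled system."""
    @abstractmethod
    def residual(self, field: dict) -> dict: ...

class HeatConduction(Model):
    def __init__(self, conductivity: float):
        self.k = conductivity
    def residual(self, field: dict) -> dict:
        # Toy 1-D steady conduction residual on a uniform grid of nodal temperatures.
        T = field["temperature"]
        return {"temperature": [self.k * (T[i - 1] - 2 * T[i] + T[i + 1])
                                for i in range(1, len(T) - 1)]}

class CoupledAnalysis:
    """Assembly-level driver: the user keeps control of the solution strategy."""
    def __init__(self, models: list):
        self.models = models
    def residual(self, field: dict) -> dict:
        out: dict = {}
        for m in self.models:
            for name, r in m.residual(field).items():
                out.setdefault(name, [0.0] * len(r))
                out[name] = [a + b for a, b in zip(out[name], r)]
        return out

analysis = CoupledAnalysis([HeatConduction(conductivity=45.0)])
print(analysis.residual({"temperature": [300.0, 320.0, 340.0, 360.0]}))
```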

3.
QAPL'01

This volume contains the Post-Proceedings of the ACM Workshop on Quantitative Aspects of Programming Languages (QAPL'01). The workshop was held in Firenze, Italy on Friday 7th September 2001, as a satellite event to Principles, Logics, and Implementations of high-level programming languages, PLI'01.

The majority of approaches in program semantics and analysis are arguably concentrated on qualitative investigations of various computational properties. As a result, some aspects of computation that are of a quantitative nature are neglected. Such aspects are nevertheless important, and sometimes essential, in determining the behaviour of systems. As an example, issues related to resource consumption (storage, time, bandwidth, etc.) cannot be ignored when systems of interacting, competing or cooperating processes are considered.

The aim of this workshop was to discuss appropriate models of programming languages which are able to capture quantitative aspects of computation. Such models are good candidates to provide the basis of new approaches in semantics and program analysis. Their investigation will hopefully not just allow for a better understanding of program behaviour (e.g. how agents compete for a limited resource), but also help to establish better connections with related complexity-theoretic questions.

Probabilistic languages and real-time languages constitute two important prototypical examples of quantitative models, but we encouraged contributions covering other and/or more general quantitative aspects in the design, analysis and implementation of programming languages, with particular emphasis on the functional and declarative paradigms.

The papers presented at the workshop were selected by the program committee consisting of
  • Luca de Alfaro, University of California at Berkeley, USA
  • Frank de Boer, University of Utrecht, The Netherlands
  • Alessandra Di Pierro, University of Pisa, Italy
  • Maurizio Gabbrielli, University of Udine, Italy
  • Marta Z. Kwiatkowska, University of Birmingham, UK
  • Herbert Wiklicky, Imperial College, UK.
At the workshop there was one invited talk by
  • David Sands: Asymptotic Reasoning Meets Programming Language Semantics
as well as eight contributed talks:
  • Keye Martin: Powerdomains and Zero Finding
  • Thom Frühwirth: As Time Goes By: Complexity Analysis of Simplification Rule
  • Antonio Brogi, Alessandra Di Pierro and Herbert Wiklicky: On Language Expressiveness and Dimension
  • David Clark, Sebastian Hunt and Pasquale Malacaria: Quantitative Analysis of the Leakage of Confidential Data
  • Marius C. Bujorianu and Manuela L. Bujorianu: On the Hilbert Machine Quantitative Computational Model
  • Frank S. de Boer, Maurizio Gabbrielli and Maria Chiara Meo: Proving Correctness of Timed Concurrent Constraint Programs
  • Nicos Angelopoulos and David R. Gilbert: A Statistical View of Probabilistic Finite Domains
  • Richard Kieburtz: Real-time Reactive Programming for Embedded Controllers
These Post-Proceedings contain the reviewed and revised versions of the first four of these presentations.

October 2002
Alessandra Di Pierro and Herbert Wiklicky

4.
We study modular properties in strongly convergent infinitary term rewriting. In particular, we show that:
  • Confluence is not preserved across direct sum of a finite number of systems, even when these are non-collapsing.
  • Confluence modulo equality of hypercollapsing subterms is not preserved across direct sum of a finite number of systems.
  • Normalization is not preserved across direct sum of an infinite number of left-linear systems.
  • Unique normalization with respect to reduction is not preserved across direct sum of a finite number of left-linear systems.
Together, these facts constitute a radical departure from the situation in finitary term rewriting. Positive results are:
  • Confluence is preserved under the direct sum of an infinite number of left-linear systems iff at most one system contains a collapsing rule.
  • Confluence is preserved under the direct sum of a finite number of non-collapsing systems if only terms of finite rank are considered.
  • Top-termination is preserved under the direct sum of a finite number of left-linear systems.
  • Normalization is preserved under the direct sum of a finite number of left-linear systems.
All of the negative results above hold in the setting of weakly convergent rewriting as well, as do the positive results concerning modularity of top-termination and normalization for left-linear systems.
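For readers less familiar with the terminology used in these results, the small sketch below illustrates, in the finitary setting only, what a collapsing rule and the direct sum of two systems over disjoint signatures are. The rule encoding is invented for illustration; it is not the paper's formalism.

```python
# Illustration of "collapsing rule" and "direct sum" for term rewriting systems.
# Terms are tuples (function symbol, args...); variables are plain strings.

def is_collapsing(rule):
    """A rule l -> r is collapsing when its right-hand side is a bare variable."""
    lhs, rhs = rule
    return isinstance(rhs, str)

# System R1 over signature {f}:      f(x) -> x     (collapsing)
R1 = [(("f", "x"), "x")]
# System R2 over disjoint signature {g, a}:  g(x) -> a   (non-collapsing)
R2 = [(("g", "x"), ("a",))]

# The direct sum R1 (+) R2 is simply the union of the two rule sets, which is
# well defined because the signatures share no function symbols.
direct_sum = R1 + R2

print([is_collapsing(r) for r in direct_sum])   # [True, False]
```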

5.
《Information Fusion》2002,3(2):103-117
The paper deals with the problem of segmentation of MRI sequences of vertebrae, in the form of images of their multiple slices, using the Dempster–Shafer theory. This leads to the study of the 3-D deformations of scoliosis. The motivation comes from the inadequacy of existing techniques based on X-ray image analysis. Such analysis cannot deal with, on the one hand, the complex anatomical structures (the “scoliotic rachis”) and the spongy peri-rachidian tissue, and, on the other hand, the choice of slices and the residual irradiation present in each examination. The main contributions of the paper are:
  • A new architecture for the fusion of MRI data sets.
  • A novel method to exploit the information contained in the MRI sequence.
  • A model for knowledge representation adapted to the specificity of the available information (Dempster–Shafer theory).
  • The choice of the discriminating parameters for the statistical expertise.
  • The construction of the belief functions.
  • The choice of the decision criterion.
Starting from segmentation by active contour (snake) [Deformable contour: modelling, extraction, detection and classification, Ph.D. Thesis, Wisconsin–Madison University, 1994; Proceedings of the 15th International Conference on Pattern Recognition, vol. 4, Barcelona, 2000, p. 17; Int. J. Comput. Vision 1 (3) (1987) 211], we upgrade it in an attempt to present the doctor with a degree of belief concerning membership of the contour of the vertebra. We illustrate the proposed fusion architecture by application to actual MRI sequences of the vertebrae, and include perhaps the first example of 3-D reconstruction of the lumbar rachis starting from the results obtained during fusion.
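The abstract relies on Dempster–Shafer theory without restating the combination step. The sketch below implements the classical Dempster rule of combination on a toy two-hypothesis frame; the mass values are invented and are not the paper's belief functions.

```python
# Dempster's rule of combination for two mass functions over the same frame.
# Focal elements are frozensets of hypotheses; each mass function sums to 1.

from itertools import product

def combine(m1: dict, m2: dict) -> dict:
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb              # product mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Toy example: does a pixel belong to the vertebra contour or to the background?
frame = frozenset({"contour", "background"})
m_gradient = {frozenset({"contour"}): 0.6, frame: 0.4}        # edge-strength source
m_region   = {frozenset({"background"}): 0.3, frame: 0.7}     # region-statistics source
print(combine(m_gradient, m_region))
```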

6.
Anaerobic digestion provides an effective way of disposing of organic material in wastewater. The EU-funded TELEMAC project aims at improving the reliability and efficiency of monitoring and control of this type of wastewater treatment plant. One of its special features is the idea of a telecontrol centre which monitors multiple, geographically distributed plants remotely, acts as a centre of expertise, and brings together the expertise of a network of remote experts. Data mining has been identified as a potentially useful contributing technology. Sensor data is now becoming available for some pilot, laboratory-scale, and industrial-sized digesters.

This paper presents the directions of work and emerging results of data mining. Particular themes considered here include:
  • experience gained in the data mining exercise;
  • the use of confidence and prediction intervals;
  • prospects for generalisation over different sizes and types of anaerobic digester;
  • relationship to the overall supervision system developed in the project.
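One of the themes above is the use of confidence and prediction intervals. As background only, here is a minimal sketch of a 95% prediction interval under simple linear regression; the data are synthetic, not TELEMAC sensor readings.

```python
# Prediction interval for a new observation under simple linear regression.
# Data are synthetic; in the project these would be digester sensor readings.

import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 11.9])

n = len(x)
slope, intercept = np.polyfit(x, y, 1)
resid = y - (intercept + slope * x)
s = np.sqrt(np.sum(resid**2) / (n - 2))          # residual standard error

x_new = 7.0
y_hat = intercept + slope * x_new
# Standard error of a *prediction* (wider than the interval for the mean response).
se_pred = s * np.sqrt(1 + 1/n + (x_new - x.mean())**2 / np.sum((x - x.mean())**2))
t = stats.t.ppf(0.975, df=n - 2)
print(f"95% prediction interval: {y_hat - t*se_pred:.2f} .. {y_hat + t*se_pred:.2f}")
```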

7.
A PERT-COST-type project with random activity durations is considered. The project comprises several essential parameters which in practice define the quality of the project as a whole:
  • the budget assigned to the project;
  • the project's due date;
  • the project's reliability, i.e. the probability of meeting the project's due date.
To establish the quality of the project, the concept of the project's utility is introduced. In order to maximize the project's utility, a three-parameter harmonization model is developed.
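To make the reliability parameter concrete: the probability of meeting the due date can be estimated by Monte Carlo simulation of the activity network with random durations. The sketch below does this for an invented toy network; it is not the paper's harmonization model.

```python
# Monte Carlo estimate of a project's reliability: the probability that the
# random project duration does not exceed the due date.  Toy network only.

import random

# activity -> (predecessors, (min, mode, max) triangular duration in days)
activities = {
    "A": ([],         (2, 4, 6)),
    "B": (["A"],      (3, 5, 9)),
    "C": (["A"],      (1, 2, 4)),
    "D": (["B", "C"], (2, 3, 5)),
}

def project_duration() -> float:
    finish = {}
    for name, (preds, (lo, mode, hi)) in activities.items():   # insertion order respects precedence
        start = max((finish[p] for p in preds), default=0.0)
        finish[name] = start + random.triangular(lo, hi, mode)
    return max(finish.values())

due_date, trials = 14.0, 20_000
reliability = sum(project_duration() <= due_date for _ in range(trials)) / trials
print(f"Estimated probability of meeting the due date: {reliability:.3f}")
```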

8.
《Computer Networks》1999,31(8):831-860
Secure coprocessors enable secure distributed applications by providing safe havens where an application program can execute (and accumulate state), free of observation and interference by an adversary with direct physical access to the device. However, for these coprocessors to be effective, participants in such applications must be able to verify that they are interacting with an authentic program on an authentic, untampered device. Furthermore, secure coprocessors that support general-purpose computation and will be manufactured and distributed as commercial products must provide these core sanctuary and authentication properties while also meeting many additional challenges, including:
  • the applications, operating system, and underlying security management may all come from different, mutually suspicious authorities;
  • configuration and maintenance must occur in a hostile environment, while minimizing disruption of operations;
  • the device must be able to recover from the vulnerabilities that inevitably emerge in complex software;
  • physical security dictates that the device itself can never be opened and examined; and
  • ever-evolving cryptographic requirements dictate that hardware accelerators be supported by reloadable on-card software.
This paper summarizes the hardware, software, and cryptographic architecture we developed to address these problems. Furthermore, with our colleagues, we have implemented this solution in a commercially available product.
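As a generic illustration of the verification step described above (participants confirming they are interacting with an authentic program on an untampered device), the sketch below checks a signed attestation of an application digest against a device key. The message format and key handling are invented and do not reflect the commercial product's actual protocol.

```python
# Generic sketch of verifying a device attestation: the coprocessor signs a
# digest of the loaded application together with a verifier-supplied nonce.
# Message format and key handling are invented for illustration only.

import hashlib
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.exceptions import InvalidSignature

def verify_attestation(device_pubkey, app_image: bytes, nonce: bytes, signature: bytes) -> bool:
    """Check that the device signed (hash of application || our fresh nonce)."""
    message = hashlib.sha256(app_image).digest() + nonce
    try:
        device_pubkey.verify(signature, message, ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

# Self-contained demo with a locally generated key standing in for the device key.
device_key = ec.generate_private_key(ec.SECP256R1())
app, nonce = b"trusted application image", b"fresh-nonce-123"
sig = device_key.sign(hashlib.sha256(app).digest() + nonce, ec.ECDSA(hashes.SHA256()))
print(verify_attestation(device_key.public_key(), app, nonce, sig))   # True
```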

9.
《Computers & Structures》2006,84(19-20):1264-1274
The aim of this work is to analyze the geometrically nonlinear mechanical behaviour of multilayered structures by a high-order plate/shell finite element in order to predict displacements and stresses of such composite structures for design applications. Based on a conforming finite element method, a C1 triangular six-node finite element is developed using trigonometric functions for the transverse shear stresses. The geometric nonlinearity is based on von Kármán assumptions and only five generalized displacements are used to ensure:
  • a cosine distribution for the transverse shear stresses with respect to the thickness co-ordinate, avoiding shear correction factors;
  • the continuity conditions between layers of the laminate for both displacements and transverse shear stresses;
  • the satisfaction of the boundary conditions at the top and bottom surfaces of the shells.
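To make the first bullet concrete: trigonometric shear-deformation models typically use a sine shear function of the thickness coordinate, whose derivative is a cosine that vanishes at the top and bottom faces, which is what removes the need for shear correction factors. The quick numerical check below uses a generic textbook form of that function, not necessarily the element's exact kinematics.

```python
# Cosine distribution of transverse shear through the thickness:
# with shear function f(z) = (h/pi) * sin(pi*z/h), its derivative f'(z) = cos(pi*z/h)
# equals 0 at z = +/- h/2, so the stress-free face condition holds without
# any shear correction factor.

import math

h = 1.0                                         # plate thickness (arbitrary units)
f  = lambda z: (h / math.pi) * math.sin(math.pi * z / h)
fp = lambda z: math.cos(math.pi * z / h)        # d f / d z, shape of the shear stress

for z in (-h / 2, -h / 4, 0.0, h / 4, h / 2):
    print(f"z = {z:+.2f}  shear shape f'(z) = {fp(z):+.3f}")
# f'(+/- h/2) = cos(+/- pi/2) = 0: zero transverse shear at the faces is built in.
```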

10.
FMS is traditionally used for production problems for which a high level of flexibility is required. These traditional FMS systems are frequently controlled by a hierarchical control system implemented on a central computer. An implementation of a control system on one (big) computer has some disadvantages:
  • The system is vulnerable. If the central control system breaks down, the whole system will become inoperative.
  • It is difficult to adapt the capacity of the central computer system when the FMS is extended. It will be difficult to add control power in small steps on an ‘as needed’ basis.
Distributed FMS (DFMS) is a system architecture that can be implemented on a number of small control units: station managers. A complex central computer system is not needed. Advantages of the DFMS concept include:
  • reliability
  • simplicity
  • extensibility.
The DFMS concept can be used for production problems for which a limited flexibility is required (‘limited’ flexibility as opposed to the virtually unlimited flexibility aimed at by designers of traditional FMS).

Station Managers can be implemented on programmable CNCs (Computer Numerical Controls). These programmable CNCs offer the user, apart from a set of defined standard functions, the possibility to add user-written code. The programmable part of the CNC will be used to implement functions such as pallet control, tool control and data communication. Furthermore, specific station manager tasks will be implemented. Programmable CNCs will enable system integrators to implement FMS control systems to user specification.

An alternative implementation uses small computer systems such as IBM PCs to implement the Station Manager functionality. Each PC is connected through a serial link to the CNC it controls.
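The station manager is described above only in prose. Purely as a hypothetical sketch of how such a unit might dispatch its pallet-control, tool-control and data-communication duties, consider the following; the message types and handler names are invented, and a real implementation would live in the user-programmable part of the CNC or on an attached PC.

```python
# Hypothetical station-manager dispatch loop for a distributed FMS cell.
# Message names and handlers are invented for illustration.

import queue

class StationManager:
    def __init__(self, station_id: str):
        self.station_id = station_id
        self.inbox: "queue.Queue[dict]" = queue.Queue()
        self.handlers = {
            "PALLET_ARRIVED": self.handle_pallet,   # pallet control
            "TOOL_REQUEST":   self.handle_tool,     # tool control
            "STATUS_QUERY":   self.handle_status,   # data communication
        }

    def handle_pallet(self, msg):  return f"{self.station_id}: clamping pallet {msg['pallet']}"
    def handle_tool(self, msg):    return f"{self.station_id}: loading tool {msg['tool']}"
    def handle_status(self, msg):  return f"{self.station_id}: idle"

    def run_once(self):
        """Process one message; unknown messages are reported, not fatal."""
        msg = self.inbox.get()
        handler = self.handlers.get(msg["type"])
        return handler(msg) if handler else f"{self.station_id}: unknown message {msg['type']}"

sm = StationManager("MILL-01")
sm.inbox.put({"type": "PALLET_ARRIVED", "pallet": 7})
print(sm.run_once())
```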

11.
《Computers & Structures》2002,80(5-6):371-390
A three-dimensional finite volume, unstructured mesh (FV-UM) method for dynamic fluid–structure interaction (DFSI) is described. Fluid–structure interaction, as applied to flexible structures, has wide application in diverse areas such as flutter in aircraft, wind response of buildings, and flows in elastic pipes and blood vessels. It involves the coupling of fluid flow and structural mechanics, two fields that are conventionally modelled using two dissimilar methods, thus a single comprehensive computational model of both phenomena is a considerable challenge. Until recently, work in this area focused on one phenomenon and represented the behaviour of the other more simply. More recently, strategies for solving the full coupling between the fluid and solid mechanics behaviour have been developed. A key contribution has been made by Farhat et al. [Int. J. Numer. Meth. Fluids 21 (1995) 807], employing FV-UM methods for solving the Euler flow equations, a conventional finite element method for the elastic solid mechanics, and the spring-based mesh procedure of Batina [AIAA paper 0115, 1989] for mesh movement.

In this paper, we describe an approach which broadly exploits the three-field strategy described by Farhat for fluid flow, structural dynamics and mesh movement but, in the context of DFSI, contains a number of novel features:
  • a single mesh covering the entire domain,
  • a Navier–Stokes flow,
  • a single FV-UM discretisation approach for both the flow and solid mechanics procedures,
  • an implicit predictor–corrector version of the Newmark algorithm,
  • a single code embedding the whole strategy.
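The Newmark predictor–corrector step named in the list is standard; for reference, the single-degree-of-freedom form is sketched below. This is the textbook scheme, not the paper's coupled FV-UM implementation.

```python
# One step of the implicit Newmark-beta scheme for m*a + c*v + k*u = f(t),
# written in predictor-corrector form for a single degree of freedom.

def newmark_step(u, v, a, dt, m, c, k, f_next, beta=0.25, gamma=0.5):
    # Predictor: displacement and velocity extrapolated with the old acceleration.
    u_pred = u + dt * v + dt**2 * (0.5 - beta) * a
    v_pred = v + dt * (1.0 - gamma) * a
    # Implicit solve of the (scalar, linear) balance at t + dt for the new acceleration.
    a_new = (f_next - c * v_pred - k * u_pred) / (m + c * gamma * dt + k * beta * dt**2)
    # Corrector: add the contribution of the new acceleration.
    u_new = u_pred + beta * dt**2 * a_new
    v_new = v_pred + gamma * dt * a_new
    return u_new, v_new, a_new

# Free vibration of a unit oscillator started from u = 1 (exact solution: cos(t)).
u, v, a = 1.0, 0.0, -1.0      # a(0) = (f - c*v - k*u) / m = -1
dt = 0.01
for _ in range(int(1.0 / dt)):
    u, v, a = newmark_step(u, v, a, dt, m=1.0, c=0.0, k=1.0, f_next=0.0)
print(u)                       # close to cos(1.0) ~= 0.5403
```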

12.
《Robotics and Computer》1994,11(3):121-136
This survey of expert systems activities and trends in Yugoslavia offers some results already obtained in the domain of manufacturing science and technology. In the scope of a long-term research project “Intelligent Manufacturing Systems (IMS)—Theory and Application”, a Designer® Intelligent Expert System for manufacturing engineering has been proposed and partially developed. Designer® IES is based on a newly developed knowledge automata theory enhanced with the cellular automata concept. Induction learning by analogy and quasimorphism knowledge mapping from the real world to the model world are used to generate a reasoning structure. The Intelligent Expert System is divided into three main subsystems, with a very large knowledge base:
  • Product Designer,
  • Process Designer, and
  • Production Planning and Control Designer.
All these segments were developed as pilot versions of expert systems for specific groups of activities inside each of these three domains.

13.
The design of data parallel algorithms for fine-grain machines is a fundamental domain in today's computer science. High standards in the specification and resolution of problems have been achieved in the sequential case. It seems reasonable to apply the same level of quality to data parallel programs. It appears that most data parallel problems can be specified in terms of pre- and postconditions. These conditions characterize the overall state of the fine-grain processors in the initial and final states. In this paper:
  • We present an axiomatic system to prove correctness of data parallel algorithms on single-instruction multiple-data (SIMD) machines.
  • We specify some data parallel problems like tree sum, root finding, radix sorting, and dynamic memory allocation.
  • With this set of axioms we prove the correctness of programs solving the problems above.
It seems that the framework for data parallel problems is quite different from that for problems of parallelism with multiple threads of control, like those solvable in Communicating Sequential Processes (CSP).
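Tree sum is the simplest of the problems listed. A sequential emulation with the step invariant written as an assertion may convey the flavour of the pre/postcondition style, though this is plain Python rather than the paper's SIMD notation and axioms.

```python
# Data-parallel tree sum, emulated sequentially.  After the step with window
# size w, element i holds the sum of the original elements i-w+1 .. i
# (clipped at the left edge).
# Precondition: x is a list of numbers.  Postcondition: out[-1] == sum(x).

def tree_sum(x):
    out = list(x)
    n = len(out)
    stride = 1
    while stride < n:
        # On a SIMD machine every processor i >= stride does this simultaneously;
        # reading the old values first emulates the synchronous step.
        old = list(out)
        for i in range(stride, n):
            out[i] = old[i] + old[i - stride]
        stride *= 2
        # Invariant (window size == stride after doubling):
        assert all(out[i] == sum(x[max(0, i - stride + 1): i + 1]) for i in range(n))
    return out

data = [3, 1, 4, 1, 5, 9, 2, 6]
result = tree_sum(data)
print(result[-1], sum(data))   # both 31
```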

14.
Although many new software design techniques have emerged in the past 15 years, there have been few changes to the procedures for reviewing the designs produced using these techniques. This report describes an improved technique, based on the following ideas, for reviewing designs.
  1. The efforts of each reviewer should be focused on those aspects of the design that suit his or her experience and expertise.
  2. The characteristics of the reviewers needed should be explicitly specified before reviewers are selected.
  3. Reviewers should be asked to make positive assertions about the design rather than simply allowed to point out defects.
  4. The designers pose questions to the reviewers, rather than vice versa. These questions are posed on a set of questionnaires that requires careful study of some aspect of the design.
  5. Interaction between designers and reviewers occurs in small meetings involving two to four people rather than meetings of large groups.
Illustrations of these ideas are drawn from the application of active design reviews to the Naval Research Laboratory's Software Cost Reduction project.

15.
A knowledge-based system for automatically dimensioning a given 3D CAD model is described. It was found that:
  1. dimensioning should preferably be based upon manufacturing features, which are sets of surfaces in specified 3D relationships;
  2. several different kinds of rules are needed:
     (a) for each feature, rules for generation of dimensions onto sets of surfaces comprising it;
     (b) for each surface, the generation of linear dimension forms;
     (c) for each such form, the instantiation of the dimension onto an explicit 3D linear element;
     (d) the selection of 3D linear elements to express in 2D views, and the selection of views to use;
     (e) the layout of the 2D dimensions generated;
  3. a large amount of redundancy occurred, which had to be managed and retained until final decisions on the selection and placement of 2D dimensions were made.
The system works by a sequence of applications of rules, which construct a set of prioritized goals connected by a network of redundancy relationships. Rules of types (a), (b) and (c) are successively applied, starting with the 3D model, to produce linear dimensioning goals, with redundancy relations among them. View selection and 2D layout involve rules of types (d) and (e), and a complex evaluation of each individual 2D dimensioning choice for each 3D linear dimensioning goal.

The system was written in Prolog, using the UCLA CADLOG intelligent CAD system.

16.
17.
A fundamental question that must be addressed in software agents for knowledge management is coordination in multi-agent systems. The coordination problem is ubiquitous in knowledge management, such as in manufacturing, supply chains, negotiation, and agent-mediated auctions. This paper summarizes several multi-agent systems for knowledge management that have been developed recently by the author and his collaborators to highlight new research directions for multi-agent knowledge management systems. In particular, the paper focuses on three areas of research:
  • Coordination mechanisms in agent-based supply chains. How do we design mechanisms for coordination, information and knowledge sharing in supply chains with self-interested agents? What would be a good coordination mechanism when we have a non-linear structure of the supply chain, such as a pyramid structure? What are the desirable properties for the optimal structure of efficient supply chains in terms of information and knowledge sharing? Will DNA computing be a viable tool for the analysis of agent-based supply chains?
  • Coordination mechanisms in agent-mediated auctions. How do we induce cooperation and coordination among various self-interested agents in agent-mediated auctions? What are the fundamental principles to promote agent cooperation behavior? How do we train agents to learn to cooperate rather than program agents to cooperate? What are the principles of trust building in agent systems?
  • Multi-agent enterprise knowledge management, performance impact and human aspects. Will people use agent-based systems? If so, how do we coordinate agent-based systems with human beings? What would be the impact of agent systems in knowledge management in an information economy?

18.
YSE and BAe Defence have recently completed an unclassified study for MoD (PE) specifically concerned with high integrity software development process and tool issues. The overall objective of the project was to produce guidance for MoD project managers on the sorts of processes and toolsets which their suppliers should be expected to deploy on projects being conducted in the context of Interim Defence Standard (IDS) 00-55 and IDS 00-56. A wide-ranging survey of industrial practice underpinned the project.

This paper reports the main areas of work of the project and its results. The topics covered include:
  • ideal high integrity software development including tool support requirements;
  • a review of current high integrity software development practice;
  • available tool support (COTS and proprietary);
  • a minimum acceptable process and associated tool support;
  • process and toolset hazard analysis;
  • process and toolset safety cases;
  • building better toolsets;
  • process and toolset assessment.

19.
In brief     
《Card Technology Today》2003,15(7-8):3-7
  • Schlumberger Smart Cards and Terminals
  • National Institute of Standards and Technology
  • Communications Security Establishment
  • Orga Kartensysteme
  • Giesecke & Devrient
  • INSIDE Contactless
  • Epecom Technology & Distribution
  • PT Adiwira Sembada
  • Saflink
  • European Telecommunications Standards Institute’s
  • Sagem
  • Gemplus
  • Ingenico
  • Korea Information & Communications Co
  • TDK Semiconductor Corporation
  • All American Semiconductor
  • SchlumbergerSema
  • Schlumberger Ltd
  • Tribute Ltd
  • Giesecke & Devrient
  • INSIDE Contactless
  • Beacon Technologies
  • US Navy
  • Transport for London
  • Smart Frog Holdings
  • Red Alert Group
  • NEXT
  • Trintech Group
  • Smart Card Forum of China
  • Beijing Commercial Administration
  • Schlumberger
  • Precise Biometrics
  • Microexpert
  • Visa International
  • Applied DNA Sciences
  • Applied DNA Sciences Europe Ltd
  • Biowell Technology

20.
In brief     
  • Datacard Group
  • GlobalPlatform
  • Atmel Corporation
  • Giesecke & Devrient
  • Siemens Business Services
  • Macao Special Administrative Region
  • Unibanka
  • Austria Card
  • ViVOtech
  • Global Payments
  • Schlumberger Smart Cards and Terminals
  • Federation of Small Businesses
  • Datacard Group
  • Sony’s
  • SSP Solutions
  • SSP-Litronic
  • Department of Defense
  • Visa International
  • Datacard Group
  • Nissan Motor Co
  • SchlumbergerSema
  • Legic Identsystems
  • Airbus
  • Financial Supervisory Committee
  • Societe Generale
  • Schlumberger Smart Cards & Terminals
  • Visa
  • MasterCard
  • Wal-Mart, The Limited, Sears Roebuck, Safeway, Circuit City
  • Schlumberger Smart Cards and Terminals
  • Eurosmart
  • Microelectrónica Española
  • Target Corporation
  • Insight Consulting
  • ActivCard
  • Radio Frequency Investigation
  • Bell ID
  • Giesecke & Devrient
  • OTI
