Similar Documents
20 similar documents found.
1.
This paper presents the Expert Operator's Associate (EOA) project, which studies the applicability of Expert Systems to day-to-day space operations. A prototype Expert System is developed, which operates on-line with an existing spacecraft control system at the European Space Operations Centre and functions as an “operator's assistant” in controlling satellites. The prototype is demonstrated using an existing real-time simulation model of the MARECS-B2 telecommunication satellite. The prototype is used to examine the extent to which the reliability and effectiveness of operations can be enhanced by AI-based support. In addition, the study examines the questions of acquisition and representation of the “knowledge” for such systems, and the feasibility of “migration” of some (currently) ground-based functions into future space-borne autonomous systems.

2.
3.
This paper examines the need for complex, adaptive solutions to certain types of complex problems typified by the Strategic Defense System and NASA's Space Station and Mars Rover. Since natural systems have evolved with capabilities of intelligent behavior in complex, dynamic situations, it is proposed that biological principles be identified and abstracted for application to certain problems now facing industry, defense, and space exploration. Two classes of artificial neural networks are presented — a nonadaptive network used as a genetically determined “retina,” and a frequency-coded network used as an adaptive “brain.” The role of a specific environment coupled with a system of artificial neural networks having simulated sensors and effectors is seen as an ecosystem. Evolution of synthetic organisms within this ecosystem provides a powerful optimization methodology for creating intelligent systems able to function successfully in any desired environment. A complex software system is presented, comprising a simulation of an environment and a program designed to cope with that environment. Reliance on adaptive systems, as found in nature, is only part of the proposed answer, though an essential one. The second part of the proposed method makes use of an additional biological metaphor—that of natural selection—to solve the dynamic optimization problems every intelligent system eventually faces. A third area of concern in developing an adaptive, intelligent system is that of real-time computing. It is recognized that many of the problems now being explored in this area have their parallels in biological organisms, and many of the performance issues facing artificial neural networks may find resolution in the methodology of real-time computing.
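As a minimal illustration of the natural-selection optimization the authors propose (not their ecosystem simulation), here is a sketch of a generational genetic algorithm in Python evolving bit-strings toward a simple fitness function; the fitness function, population size, and mutation rate are arbitrary choices for the example.

```python
import random

# Toy genetic algorithm: evolve bit-strings to maximize a stand-in fitness.
# Every parameter below is an illustrative assumption, not from the paper.
GENOME_LEN, POP_SIZE, GENERATIONS, MUTATION_RATE = 32, 60, 80, 0.02
random.seed(0)

def fitness(genome):
    # Stand-in for "success in the simulated environment": count of 1-bits.
    return sum(genome)

def crossover(a, b):
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

def mutate(genome):
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in genome]

def tournament(population, k=3):
    # Selection pressure: the fittest of k random individuals reproduces.
    return max(random.sample(population, k), key=fitness)

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population = [mutate(crossover(tournament(population), tournament(population)))
                  for _ in range(POP_SIZE)]

best = max(population, key=fitness)
print("best fitness:", fitness(best), "out of", GENOME_LEN)
```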

4.
S. Meeran, A. Share. Mechatronics, 1997, 7(8): 737-756
Whether in improving quality or productivity, the impact of mechatronic systems such as robots in industry is unquestionable. One aspect of interest in robotics is planning the optimum path for a mobile robot, or the optimum trajectory for the link movements of a stationary robot, in order to increase efficiency. However, for a given set of points, complete enumeration of all possible paths to establish an optimal one is not feasible, as the search space grows exponentially (explodes combinatorially) with the number of points. This problem, traditionally known as the “Traveling Salesman Problem” (TSP), has attracted a great deal of attention for a long time. Exact enumerative techniques such as “branch and bound”, “cutting planes”, and “dynamic programming”, as well as approximation methods such as the “nearest neighbour algorithm”, “tabu search”, the “greedy algorithm”, “simulated annealing”, and the “genetic algorithm”, have had only limited success in solving this problem. Recently the “convex hull”, a minimum-area and minimum-perimeter enclosing shape, has been used as an initial sub-tour along with enumerative techniques such as minimising insertion costs to solve the TSP. We present a system which uses heuristic rules to augment the convex hull initial sub-tour created by the Graham scan algorithm. The system is able to provide a solution in polynomial time.
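A minimal Python sketch of the convex-hull-plus-insertion idea described in this abstract: it builds the hull with a monotone-chain scan (standing in for the paper's Graham scan) and then adds the remaining points by cheapest insertion; the paper's additional heuristic rules are not reproduced here.

```python
import math

def convex_hull(points):
    # Andrew's monotone chain; returns the hull vertices in order.
    pts = sorted(set(points))
    if len(pts) <= 2:
        return list(pts)
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def hull_insertion_tour(points):
    tour = convex_hull(points)
    remaining = [p for p in points if p not in tour]
    while remaining:
        # Choose the point/edge pair with the cheapest insertion cost.
        best = None
        for p in remaining:
            for i in range(len(tour)):
                a, b = tour[i], tour[(i + 1) % len(tour)]
                cost = dist(a, p) + dist(p, b) - dist(a, b)
                if best is None or cost < best[0]:
                    best = (cost, p, i + 1)
        _, p, pos = best
        tour.insert(pos, p)
        remaining.remove(p)
    return tour

pts = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 1), (1, 2), (3, 2)]
tour = hull_insertion_tour(pts)
print(tour, round(sum(dist(tour[i], tour[(i + 1) % len(tour)]) for i in range(len(tour))), 2))
```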

5.
This paper focuses on the design of distributed control for distributed mechanical systems. The sensors and actuators are assumed to be numerous and periodically distributed. The problem addressed in this paper is: “Can we find a way to approximate an optimal control law with a distributed electronic circuit?” Solutions to this problem are proposed in the framework of vibration control using piezoelectric actuators and sensors.

6.
7.
The purpose of this investigation is to demonstrate the capability of using the DUV resist XP9493 from Shipley for E-beam applications. The main parameters checked are a high resist sensitivity (to reduce exposure time), sub-micron resolution, and a sufficient process window. The softbake condition which optimizes the trade-off between dose (best throughput) and stability (process window) is 100°C/60s. Varying the post-exposure bake shows that the dose to clear (D0) decreases with increasing temperature; however, “resist loss” becomes a problem above 125°C. The contrast is adequate for an E-beam application (γ9). The linearity measured on contacts is good in the range of 0.8μm to 2.0μm. The profile is adequate (i.e. vertical) in a 1.4μm-thick resist for a dose of 6μC/cm2; a higher dose would generate an “uncontrolled size of contact”, while a lower dose could generate “closed contacts”. The throughput gain should be 30% for the referenced implantation levels.

The resist XP9493 from Shipley seems to be a good candidate for implantation- and contact-level production applications. This is the second deep-UV positive resist from Shipley tested on the AEBLE 150s (20 kV column); the first was the XP9402. The results to date are the most promising ever obtained at ES2 with a positive-tone resist for E-beam applications.


8.
Recent incorporation of computer graphics into feature-length films — Return of the Jedi, Star Trek II: The Wrath of Khan, and Tron, to name a few — has inspired a belief that entire films might soon be generated by computer. It is shown below that even so-called “supercomputers” of today fall quite short of the power required for this goal. True supercomputers are needed with capabilities just now being conceived and with cost commensurate with typical filmmaking practice.

9.
10.
Artificial intelligence (AI) ideas and techniques are critical to the development of intelligent information systems that will be used to collect, manipulate, and retrieve the vast amounts of space data produced by “Missions to Planet Earth.” Natural language processing, inference, and expert systems are at the core of this space application of AI. This article presents logic programming as an AI tool that can support inference (the ability to draw conclusions from a set of complicated and interrelated facts). It reports on the use of logic programming in the study of metadata specifications for a small problem domain of airborne sensors, and the dataset characteristics and pointers that are needed for data access.
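The article itself works with logic programming; purely as an illustration of the kind of rule-based inference involved, here is a sketch in Python of forward chaining over metadata facts. The sensor names, attributes, and the single rule are invented for the example, not taken from the paper.

```python
# Minimal forward-chaining inference over metadata facts (illustrative only).
facts = {
    ("sensor", "AVIRIS", "airborne"),
    ("measures", "AVIRIS", "reflectance"),
    ("platform", "AVIRIS", "ER-2"),
}

# A rule is (premise patterns, conclusion); "?x"-style tokens are variables.
rules = [
    ({("sensor", "?s", "airborne"), ("measures", "?s", "reflectance")},
     ("needs_pointer", "?s", "flight_line")),
]

def match(pattern, fact, bindings):
    new = dict(bindings)
    for p, f in zip(pattern, fact):
        if p.startswith("?"):
            if p in new and new[p] != f:
                return None
            new[p] = f
        elif p != f:
            return None
    return new

def forward_chain(facts, rules):
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            bindings_list = [{}]
            for prem in premises:
                bindings_list = [b2 for b in bindings_list for f in derived
                                 if (b2 := match(prem, f, b)) is not None]
            for b in bindings_list:
                new_fact = tuple(b.get(t, t) for t in conclusion)
                if new_fact not in derived:
                    derived.add(new_fact)
                    changed = True
    return derived

for fact in sorted(forward_chain(facts, rules) - facts):
    print(fact)  # ('needs_pointer', 'AVIRIS', 'flight_line')
```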

11.
The current-voltage characteristics of resonant tunneling devices with well widths between 12 and 180 nm are studied. The voltage interval between the resonant peaks in the current is measured as a function of well width. For the wide wells the amplitude of the peaks in the differential conductance is modulated by an “over the barrier” interference effect involving the collector barrier. Space charge buildup and intrinsic bistability effects for a particular resonant state are found to depend critically on its energy difference from the top of the collector barrier and from lower lying standing wave states of the quantum well.

12.
This paper describes a behavioural competency level concerned with emergent scheduling of spacecraft payload operations. The level is part of a multilevel subsumption architecture model for autonomous spacecraft, and functions as an action selection system for processing spacecraft commands that can be considered “plans-as-communication.” Several versions of the selection mechanism are described and their robustness is qualitatively compared.
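The abstract does not give the paper's selection mechanisms, so the following Python sketch only illustrates the general subsumption idea it builds on: behaviour layers are consulted from highest to lowest priority, and a higher layer suppresses the ones below it whenever it has something to do. The layer names, state fields, and commands are hypothetical.

```python
# Illustrative subsumption-style action selection (layers and state fields are
# invented for the example, not the paper's model).

def safe_mode_layer(state):
    if state.get("fault"):
        return "enter_safe_mode"
    return None

def scheduled_command_layer(state):
    queue = state.get("command_queue", [])
    if queue:
        return "execute:" + queue[0]
    return None

def idle_layer(state):
    return "maintain_attitude"

# Higher layers subsume (take precedence over) lower layers.
LAYERS = [safe_mode_layer, scheduled_command_layer, idle_layer]

def select_action(state):
    for layer in LAYERS:
        action = layer(state)
        if action is not None:
            return action
    return "no_op"

print(select_action({"command_queue": ["point_camera"]}))                  # execute:point_camera
print(select_action({"fault": True, "command_queue": ["point_camera"]}))   # enter_safe_mode
print(select_action({}))                                                   # maintain_attitude
```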

13.
A theoretical model for prediction of the component drift failure rate as a function of time from component parameter drift rates is described. The model assumes statistical independence of the initial value and the drift of the parameter. To use the model it is necessary to know the distribution of the initial value of the component parameter, the component parameter drift function, and the distribution of the functional parameters. Further, the concept of “component working lifetimes” is discussed. Two different definitions are suggested, both based on the assumptions of a Weibull distribution for the wear-out lifetimes and an exponential distribution for the earlier “constant” failure rate.
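As a rough illustration of this kind of drift model (with assumed distributions and limits, not the paper's formulation), the Python sketch below draws an initial parameter value and an independent drift rate for each simulated component, declares a drift failure once the parameter crosses a functional limit, and estimates the cumulative fraction failed at several times.

```python
import random

# Drift-failure Monte Carlo: parameter(t) = initial + rate * t, with failure
# when the parameter exceeds LIMIT. All numbers are illustrative assumptions.
N = 100_000
INITIAL_MEAN, INITIAL_SD = 100.0, 2.0   # distribution of the initial parameter value
RATE_MEAN, RATE_SD = 0.05, 0.02         # drift per 1000 h, independent of the initial value
LIMIT = 110.0                           # functional (specification) limit

random.seed(0)
samples = [(random.gauss(INITIAL_MEAN, INITIAL_SD), random.gauss(RATE_MEAN, RATE_SD))
           for _ in range(N)]

def fraction_failed(t_khours):
    failed = sum(1 for x0, r in samples if x0 + r * t_khours > LIMIT)
    return failed / N

for t in (10, 50, 100, 200):  # time in thousands of hours
    print("t =", t, "kh  cumulative drift-failure fraction =", round(fraction_failed(t), 4))
```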

14.
For general memoryless systems, the existing information-theoretic solutions have a “single-letter” form. This reflects the fact that optimum performance can be approached by a random code (or a random binning scheme), generated using independent and identically distributed copies of some scalar distribution. Is this the form of the solution to every (information-theoretic) problem? In fact, some counterexamples are known. The most famous one is the “two-help-one” problem: Körner and Marton showed that if we want to decode the modulo-two sum of two correlated binary sources from their independent encodings, then linear coding is better than random coding. In this paper we provide another counterexample, the “doubly-dirty” multiple-access channel (MAC). Like the Körner–Marton problem, this is a multiterminal scenario where side information is distributed among several terminals; each transmitter knows part of the channel interference while the receiver only observes the channel output. We give an explicit solution for the capacity region of the binary doubly-dirty MAC, demonstrate how this region can be approached using a linear coding scheme, and prove that the “best known single-letter region” is strictly contained in it. We also state a conjecture regarding the capacity loss of single-letter characterization in the Gaussian case.
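To make the Körner–Marton idea concrete, here is a toy Python sketch (with an arbitrarily chosen Hamming-code parity-check matrix, not anything from this paper): both encoders apply the same linear map H to their blocks, and because H(x ^ y) = Hx ^ Hy, the receiver can XOR the two syndromes and recover the modulo-two sum by minimum-weight syndrome decoding, provided x ^ y is sparse enough for the code.

```python
import itertools
import random

# Toy linear-coding illustration of the Körner–Marton "two-help-one" setup.
n = 7
H = [  # parity-check matrix of the (7,4) Hamming code; columns are 1..7 in binary
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def syndrome(v):
    return tuple(sum(h * b for h, b in zip(row, v)) % 2 for row in H)

def min_weight_decode(s, max_weight=1):
    # Brute-force search for the lowest-weight z with Hz = s (toy sizes only).
    for w in range(max_weight + 1):
        for support in itertools.combinations(range(n), w):
            z = [1 if i in support else 0 for i in range(n)]
            if syndrome(z) == s:
                return z
    return None

random.seed(0)
x = [random.randint(0, 1) for _ in range(n)]
y = x[:]
y[4] ^= 1                                   # correlated sources: they differ in one bit

sx, sy = syndrome(x), syndrome(y)           # each encoder sends only its syndrome
s = tuple(a ^ b for a, b in zip(sx, sy))    # receiver: Hx ^ Hy = H(x ^ y)
print("true  x^y:", [a ^ b for a, b in zip(x, y)])
print("decoded  :", min_weight_decode(s))
```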

15.
The globalization of telecommunicative ties between nations is studied from a heterogenization perspective. A theoretical model inspired by Appadurai’s “disjuncture hypothesis,” which stipulates that global flows of communication are multidimensional and reinforce regional/local identities, is tested empirically on an international voice traffic dataset. Spatial-statistical measures (global and local versions of Moran’s I) indicate that countries that share the same linguistic (English, Spanish, or French) or civilizational (Catholic, Protestant, and Buddhist–Hindu) background are more likely to be each other’s “telecommunicative neighbors” and that this tendency has increased over time (1989–1999).
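For reference, the global Moran's I used in the study is I = (n / sum_ij w_ij) * sum_ij w_ij (x_i - xbar)(x_j - xbar) / sum_i (x_i - xbar)^2; positive values indicate that neighbouring units carry similar values. A small Python sketch with an invented binary "shares a language" weight matrix standing in for the study's actual linguistic and civilizational neighbourhoods:

```python
# Global Moran's I over a toy adjacency matrix; the five units, their weights,
# and the traffic values are invented for illustration.
x = [5.0, 7.0, 6.5, 1.0, 1.5]   # e.g., (log) international voice traffic per unit
w = [                            # w[i][j] = 1 if units i and j are "neighbours"
    [0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 0, 0, 0],
    [0, 0, 0, 0, 1],
    [0, 0, 0, 1, 0],
]

def morans_i(x, w):
    n = len(x)
    mean = sum(x) / n
    dev = [xi - mean for xi in x]
    w_sum = sum(sum(row) for row in w)
    num = sum(w[i][j] * dev[i] * dev[j] for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_sum) * (num / den)

print(round(morans_i(x, w), 3))  # positive here, since "neighbours" have similar values
```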

16.
The present practice in space electronics is to use design margins called “derating” and to predict the components' end-of-life behaviour so that designs can accommodate the worst cases of part performance. The European Space Agency “Part Standard Specification-01-301” (the derating and end-of-life performance prediction document) is being updated using consistent rationales based on recognized acceleration laws and activation energies (Ea), and on physical models. Data reviews are made to confirm the calculations of ageing drifts. Optimisation of the present rules leads to a more rigorous design and reliability tool.
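As an example of the kind of acceleration law the update relies on (the standard Arrhenius model with activation energy Ea, not a formula quoted from this specification), the acceleration factor between a use temperature and a stress temperature is AF = exp[(Ea/k)(1/T_use - 1/T_stress)] with temperatures in kelvin; a short Python sketch with assumed values:

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_acceleration_factor(ea_ev, t_use_c, t_stress_c):
    """Arrhenius acceleration factor between a use and a stress temperature."""
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_stress))

# Assumed example: Ea = 0.7 eV, 40 degC use condition vs 125 degC accelerated test.
af = arrhenius_acceleration_factor(0.7, 40.0, 125.0)
print("acceleration factor =", round(af))
# An ageing drift observed over t hours at 125 degC then corresponds to roughly
# af * t hours of operation at the 40 degC use condition.
```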

17.
A critical issue for space operations is how to develop and apply advanced automation techniques to reduce the cost and complexity of working in space. In this context, it is important to examine how recent advances in self-processing networks can be applied to planning and scheduling tasks. For this reason, we are currently exploring the feasibility of applying self-processing network models to a variety of planning and control problems relevant to spacecraft activities. Our goals are to demonstrate both that self-processing methods are applicable to these problems and that MIRRORS/II, a general-purpose software environment for implementing self-processing models, is sufficiently robust to support development of a wide range of application prototypes. Using MIRRORS/II and marker-passing modelling techniques, we implemented a model of the execution of a “Spaceworld” plan, a simplified model of the Voyager spacecraft that photographed Jupiter, Saturn, and their satellites. This study demonstrates that plan execution, a task usually solved using traditional AI techniques, can be accomplished using a self-processing network. The fact that self-processing networks have been applied to other space-related tasks in addition to the one discussed here demonstrates the general applicability of this approach to planning and control problems relevant to spacecraft activities. This work also demonstrates that MIRRORS/II is a powerful environment for the development and evaluation of self-processing systems.

18.
This research effort has developed a mathematical model for bathtub-shaped hazards (failure rates) for operating systems with uncensored data. The model will be used to predict the reliability of systems with such hazards. Early in the lifetime of a system, there may be a relatively large number of failures due to initial weaknesses or defects in materials and manufacturing processes. This period is called the “infant mortality” period. During the middle period of an operating system's life, fewer failures occur; they are caused when the environmental stresses exceed the design strength of the system. It is difficult to predict the environmental stress amplitudes or the system strengths as deterministic functions of time, thus the middle-life failures are often called “random failures.” As the system ages, it deteriorates and more failures occur. This region of failure is called the “wearout” period. Graphing these failure rates together results in a bathtub-shaped curve. The model developed for this bathtub pattern of failure takes into account all three failure regions simultaneously. The model has been validated for accuracy against Halley's mortality table and is used to predict reliability with both least-squares and maximum-likelihood estimators.
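The abstract does not give the model's functional form; as a generic illustration of a bathtub-shaped hazard, the Python sketch below adds a decreasing Weibull hazard (infant mortality), a constant hazard (random failures), and an increasing Weibull hazard (wear-out), and computes reliability as R(t) = exp(-integral of h from 0 to t) numerically. All parameter values are assumptions chosen only to make the three regions visible.

```python
import math

def weibull_hazard(t, shape, scale):
    # h(t) = (shape/scale) * (t/scale)**(shape - 1); decreasing for shape < 1, increasing for shape > 1.
    return (shape / scale) * (t / scale) ** (shape - 1)

def bathtub_hazard(t):
    return (weibull_hazard(t, 0.5, 2.0e6)    # infant mortality region (shape < 1)
            + 1.0e-5                          # constant "random failure" region
            + weibull_hazard(t, 3.0, 1.0e5))  # wear-out region (shape > 1)

def reliability(t, steps=2000):
    # R(t) = exp(-cumulative hazard); midpoint rule avoids evaluating h at t = 0.
    if t == 0:
        return 1.0
    ts = [t * (i + 0.5) / steps for i in range(steps)]
    cumulative = sum(bathtub_hazard(u) for u in ts) * (t / steps)
    return math.exp(-cumulative)

for hours in (100, 1_000, 10_000, 40_000, 80_000):
    print(f"t = {hours:>6} h   h(t) = {bathtub_hazard(hours):.2e}/h   R(t) = {reliability(hours):.3f}")
```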

19.
The Data Encryption Standard (DES) is a cipher that is still used in a broad range of applications, from smartcards, where it is often implemented as a tamper-resistant embedded co-processor, to PCs, where it is implemented in software (for instance, to compute crypt(3) on UNIX platforms). To the authors’ knowledge, implementations of DES published so far are based on the straightforward application of the NIST standard. This article describes an innovative architecture that features a speed increase for both hardware and software implementations, compared to the state of the art. For example, the proposed architecture, at constant size, is about twice as fast as the state of the art for 3DES-CBC. The first contribution of this article is a hardware architecture that minimizes the computation time overhead caused by key and message loading. The second contribution is an optimal chaining of computations, typically required when “operation modes” are used. The optimization is made possible by a novel computation paradigm, called “IP representation”.
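The chaining that the second contribution targets arises from block-cipher modes of operation such as CBC, where each block's encryption depends on the previous ciphertext block. The Python sketch below shows that serial dependency with a toy placeholder in place of DES; it is not the article's architecture or its "IP representation", which the abstract does not detail.

```python
# CBC-mode chaining skeleton: block i cannot be processed before block i-1,
# which is the serial dependency that makes chained computation worth optimizing
# in a DES/3DES implementation. The block function here is a toy placeholder, NOT DES.
BLOCK = 8  # DES block size in bytes

def toy_block_encrypt(block: bytes, key: bytes) -> bytes:
    # Keyed byte-wise scrambling, purely illustrative and insecure.
    return bytes(((b + key[i % len(key)]) % 256) ^ 0x5A for i, b in enumerate(block))

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def cbc_encrypt(plaintext: bytes, key: bytes, iv: bytes) -> bytes:
    assert len(plaintext) % BLOCK == 0 and len(iv) == BLOCK
    ciphertext = b""
    previous = iv
    for i in range(0, len(plaintext), BLOCK):
        # Each block is XORed with the previous ciphertext block before "encryption",
        # so the loop cannot be parallelized across blocks.
        block = xor_bytes(plaintext[i:i + BLOCK], previous)
        previous = toy_block_encrypt(block, key)
        ciphertext += previous
    return ciphertext

print(cbc_encrypt(b"EIGHTBYTEIGHTBYT", key=b"k" * 8, iv=b"\x00" * 8).hex())
```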

20.
The Jet Propulsion Laboratory's (JPL) Resource Allocation Process incorporated the decision-making software system RALPH into the planning process four years ago. The principal task of the Resource Allocation Process is the planning and apportionment of JPL's Ground Data System, composed of the Deep Space Network and the Mission Control and Computing Center facilities. The addition of the data-driven, rule-based planning system RALPH has expanded the planning horizon from eight weeks to 10 years and has resulted in significant labor savings. Use of the system has also resulted in important improvements in science return through enhanced resource utilization. In addition, RALPH has been instrumental in supporting rapid turnaround for an increased volume of special “what if” studies.

This paper reviews the status of RALPH and focuses on important lessons learned from the creation of a highly functional design team, through an evolutionary design and implementation period, and through the fundamental changes to the process that spawned the tool kit. Principal topics include proper integration of software tools within the planning environment, transition from prototype to delivered software, changes in the planning methodology as a result of evolving software capabilities, and creation of the ability to develop and process generic requirements to allow planning flexibility.

