1.
With the growth of factory automation, the need for off-line robot programming is increasing rapidly. Off-line programming requires a robot simulator, which motivated the development of TIPS/GS (Geometric Simulator), accompanied by a robot simulator. TIPS/GS has been developed as a project in the TIPS Research Association, whose goal is to extend the functions and applications of the solid modeler TIPS-1. Four simulators (the assembly simulator, engineering simulator, NC simulator and robot simulator) have been developed for these extended applications.

The robot simulator described in this paper has the following special features:

• When a robot motion is prescribed in the VAL-G language, the result can be seen on a CRT display in several patterns.

• High-speed dynamic display that can almost keep up with real-time movements.

• Shaded as well as wire-frame pictures are used for the high-speed display mentioned above.

• Supported by the solid modeler, any robot and environment can be used with this system.

• A precise interference checker based on analytical methods.

This paper reports on the development of the robot simulator.
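The precise interference checker is analytical; in practice such a checker is usually preceded by a cheap broad-phase filter. A minimal sketch of that filtering step, using axis-aligned bounding boxes (all names and data are invented, not the paper's method):

```python
# Coarse broad-phase interference pre-check with axis-aligned bounding
# boxes (AABBs); a precise analytical checker would only be invoked for
# the pairs that pass this cheap test.

def aabb_overlap(a, b):
    # Each box is ((xmin, ymin, zmin), (xmax, ymax, zmax)).
    return all(a[0][i] <= b[1][i] and b[0][i] <= a[1][i] for i in range(3))

def candidate_pairs(links, obstacles):
    # Return (link index, obstacle index) pairs whose boxes overlap.
    return [(i, j)
            for i, link in enumerate(links)
            for j, obstacle in enumerate(obstacles)
            if aabb_overlap(link, obstacle)]
```

The broad phase keeps the expensive analytical test off the vast majority of link/environment pairs at each simulated time step.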


2.
《Robotics and Computer》1988,4(3-4):317-333
This paper discusses the initial development of a machine tool and its structure (concept, calculation, design) and the verification of the prototype. Two issues are studied: static rigidity and dynamic stability. For static rigidity, several experiments and modelling studies using the finite element method have been carried out in order to identify the model parameters. In this way, differences between models of bolted joints, slideways and the cross-sections of the structural elements have been determined. The model is built from design documentation and later verified through experiments on the prototype of the machine. The approach is different for dynamic stability: the model is not made on the basis of design documentation or static calculations, but from experiments performed on the prototype. This model is an oriented transfer function whose parameters are determined by fitting experimental transfer function curves. With this model, stability is analyzed under different machining conditions. Specific features of this methodology are as follows:

• The finite element method is used for qualitative comparison of different machine tool structure concepts during the conceptual and design stages. Only after completion of the prototype may the model parameters be adjusted for the purpose of obtaining quantitative indicators.

• Dynamics are analyzed by parameter identification of the oriented transfer function model. The dominant degree of freedom is selected by experiment rather than from hypotheses about structural behavior obtained through mathematical manipulations such as expansion of the model according to the finite element method. If necessary, another machine tool structure may be modelled; in this way hypotheses are drawn about the stability of the reconstructed prototype.

Such a procedure has been applied and verified on the machine tool structure of a horizontal machining center. Results for static rigidity and dynamic stability have been obtained from the model and experiments performed on the prototype. The following techniques have been used:

• finite element method for qualitative identification of static behavior,

• self-excitation of the machine,

• FFT-based digital signal processing,

• curve smoothing and digital filtering,

• function fitting of the transfer function (modal analysis),

• coefficient calculus and the oriented transfer function,

• stability assessment of the fitted model under different machining conditions, and

• modelling of the regenerative machining effect by cutting.

The necessary tests were performed with the instrumentation required by the above techniques.

Such a combined static-dynamic procedure for structuring a machine tool enables efficient follow-up of all results and facilitates future expansion, the use of universal equipment, the combination of modelling and experiments, and the synthesis of simple models whose behavior matches that of the examined machine. Well-known theories of machining-system dynamic stability are applied to such models.
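The final stability-assessment step can be illustrated with the classical single-mode regenerative-chatter limit; this is a generic textbook calculation, not the paper's model, and all numerical values are invented:

```python
import math

# Chatter-stability sketch on a fitted single-mode transfer function:
# the classical limit depth of cut is b_lim = -1 / (2 * Kf * min Re{G(iw)}).
# Modal parameters k (stiffness), zeta (damping ratio), wn (natural
# frequency) and cutting coefficient Kf are illustrative values only.

def frf_real(omega, k, zeta, wn):
    # Real part of G(iw) for a single structural mode.
    r = omega / wn
    return (1 - r**2) / (k * ((1 - r**2)**2 + (2 * zeta * r)**2))

def critical_depth(Kf, k, zeta, wn):
    # Scan frequencies just above resonance, where Re{G} is most negative.
    omegas = [wn * (1 + i / 1000.0) for i in range(1, 2000)]
    g_min = min(frf_real(w, k, zeta, wn) for w in omegas)
    return -1.0 / (2.0 * Kf * g_min)

# Example: k = 2e7 N/m, zeta = 0.03, fn = 120 Hz, Kf = 1.5e9 N/m^2
b_lim = critical_depth(1.5e9, 2e7, 0.03, 2 * math.pi * 120)
```

For these invented parameters the limit depth comes out below one millimetre; on the real machine, the fitted oriented transfer function replaces the single-mode model.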


3.
Methodologies already exist for information systems analysis and design (e.g. SSADM, JSP, Merise), as do supporting tools, namely CASE (Computer Aided Software Engineering), RDBMS (Relational Database Management System) and/or 4GL tools. All of these tools have a data dictionary at their core.

In the underlying research and in this paper, the following questions need to be addressed:

• How can the capability of a recently available data dictionary be enhanced with knowledge-based modules?

• What would be the architecture of such a system, based on the data dictionary of some CASE tools?

• How can informal and formal modelling approaches to information system design be combined?

• What sort of knowledge-representation techniques would be suitable for the different tasks during the analysis and design of the system?

The system outlined here would work as an intelligent assistant and workbench supporting the developer, but not as an automatic programming environment.


4.
Employing an updated C2MOS (clocked CMOS) technique, two types of speech synthesizer LSI circuits, based on the Parcor (partial correlation) and ADM (adaptive delta modulation) methods, and a recording watch system are introduced and described.

These LSI circuits and systems have the following features:

• The new Parcor LSI circuit contains the circuits needed by the Parcor synthesis algorithm, including a 64 kbit speech-data ROM, an output low-pass filter and a preamplifier. Using only this LSI circuit, 30 s to 60 s of speech can be synthesized.

• The new ADM LSI circuit contains encoding and decoding circuits, a 64 kbit speech-data ROM and a RAM control circuit. A record-and-synthesis system can easily be constructed with this LSI circuit and RAM.

• The recording watch system consists of the watch LSI circuit with the ADM system and the analogue LSI circuit.

In the Parcor system, various high-quality, low data-rate speech outputs are obtainable. The ADM system is applied for recording and synthesizing. By applying these systems to meet market needs, good cost performance can be achieved in a simple system.


5.
A pilot study has been conducted to prepare for an epidemiological study into the effects of vibration on tractor drivers. The reason for the study is that, despite a large volume of existing data on health and vibration (over 43,000 exposed workers having been studied) there is still no good basis for estimating dose-response relationships between vibration and health. There have been three main difficulties:

• Many studies have not measured vibration, or have not measured it thoroughly enough.

• Confounding variables such as posture have not been entirely controlled for, because well-matched control groups do not exist.

• Health data have tended to be binary incidence data rather than continuous-scale data, reducing the sensitivity of the studies.

The pilot study has produced procedures to overcome these difficulties.


6.
The aim of FDBS (federated database system) development is to integrate existing CIM components using a bottom-up development process. The components used in this paper do not support any kind of database management. The integration of these components into a federation may be done using two general approaches [3]:

• migrate the files to a DBMS;

• extend the file system to support DBMS-like features.

Both migration and extension of the file system are costly solutions, and their feasibility depends on the existing capabilities of the components. Problems may occur when the federated schema becomes too large; the schema might then be split up into smaller federated schemas (a loosely coupled FDBS).


7.
Stabilizing the visual system is a crucial issue for any sighted mobile creature, whether natural or artificial. The more immune the gaze of an animal or a robot is to disturbances (e.g., those created by body or head movements when walking or flying), the easier it is for the visual system to carry out its many information-processing tasks. The gaze control system described in this paper takes a lesson from the Vestibulo-Ocular Reflex (VOR), which is known to contribute to stabilizing the human gaze and keeping the retinal image steady. The gaze control system owes its originality and its high performance to the combination of two sensory modalities, as follows:
• a visual sensor called the Optical Sensor for the Control of Autonomous Robots (OSCAR), which delivers a retinal angular position signal. A new, miniature (10 g), piezo-based version of this visual sensor is presented here;

• an inertial sensor which delivers an angular head velocity signal.

We built a miniature (30 g), one-degree-of-freedom oculomotor mechanism equipped with a micro rate gyro and the new version of the OSCAR visual sensor. The gaze controller combines a feedback control system based on the retinal position error measurement with a feedforward control system based on the angular head velocity measurement. The feedforward control system triggers a high-speed “vestibulo-ocular reflex” that efficiently and rapidly compensates for any rotational disturbance of the head. We show that a fast rotational step perturbation (3° in 40 ms) applied to the head is almost completely (90%) rejected within a very short time (70 ms). Sinusoidal head perturbations are also rapidly compensated for, keeping the gaze stabilized on its target (an edge) within an angular range 10 times smaller than the perturbing head rotations, which were applied at frequencies of up to 6 Hz and amplitudes of up to 6°. This standard of performance in rejecting head rotational disturbances is comparable to that of the human vestibulo-oculomotor system.
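The feedback-plus-feedforward structure can be sketched as a discrete-time simulation; the gains, time step and integrator eye model below are illustrative assumptions, not the authors' values:

```python
def simulate_gaze(head_profile, target=0.0, kp=30.0, dt=0.001):
    # gaze = head + eye (angles in degrees). The feedforward path
    # subtracts the measured head velocity (rate-gyro signal); the
    # feedback path acts on the retinal position error (visual sensor).
    # kp and the pure-integrator eye model are invented for the sketch.
    eye = 0.0
    prev_head = head_profile[0]
    trace = []
    for head in head_profile:
        head_rate = (head - prev_head) / dt           # inertial measurement
        retinal_error = target - (head + eye)         # visual measurement
        eye += (kp * retinal_error - head_rate) * dt  # eye motor command
        trace.append(head + eye)
        prev_head = head
    return trace
```

With perfect feedforward, a 3° head step barely deflects the simulated gaze, mirroring the rejection behaviour reported above; the feedback path mops up the residual error.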


8.
There are numerous methods available to measure the slip-resistance of different floor-coverings. The INRS has developed two distinct methods for the evaluation of the slip resistance of a given surface within the framework of its studies on the prevention of slips:

• One method that can be used to compare new surfaces. It uses a static device developed at the INRS and is based on the evaluation of a coefficient of dynamic friction between a sample of a new oiled surface and an elastomer. This method is well adapted to the needs of standardisation work;

• Another method that can be used to evaluate slippage in the field, where surfaces are often worn and polluted with a specific product. It uses a portable device developed in Sweden and is based on the continuous evaluation of a coefficient of dynamic friction, over a variable distance, between the surface to be tested and an elastomer.

These two methods, which give well-correlated results, are described in this publication, and their distinctly different uses are underlined.
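Both devices reduce to the same computation: divide the measured tangential force by the normal force at each sampling point along the sliding distance. A minimal sketch with invented readings:

```python
def friction_profile(tangential, normal):
    # Pointwise coefficient of dynamic friction mu = F_t / F_n,
    # one value per sampling point along the sliding distance.
    return [ft / fn for ft, fn in zip(tangential, normal)]

def mean_friction(tangential, normal):
    # Single summary coefficient averaged over the distance.
    mus = friction_profile(tangential, normal)
    return sum(mus) / len(mus)
```

The portable device effectively reports `friction_profile` over a variable distance, while a standardisation test would keep only a summary figure such as `mean_friction`.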


9.
Naira is a compiler for Haskell, written in Glasgow parallel Haskell. It exhibits modest, but irregular, parallelism that is determined by properties of the program being compiled, e.g. the complexity of the types and of the pattern matching. We report four experiments into Naira's parallel behaviour using a set of realistic inputs, namely the 18 Haskell modules of Naira itself. The issues investigated are:

• Does increasing input size improve sequential efficiency and speedup?

• To what extent do high communications latencies reduce average parallelism and speedup?

• Does migrating running threads between processors improve average parallelism and speedup at all latencies?

 

Corresponding author; email: sahalu@ccse.kfupm.edu.sa
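The quantities behind these questions are the standard parallel-performance metrics; a small sketch (all timings invented):

```python
def speedup(t_seq, t_par):
    # Ratio of one-processor time to parallel time.
    return t_seq / t_par

def efficiency(t_seq, t_par, n_proc):
    # Speedup normalized by processor count (1.0 = ideal).
    return speedup(t_seq, t_par) / n_proc

def average_parallelism(total_work, critical_path):
    # Work divided by span; an upper bound on achievable speedup.
    return total_work / critical_path
```

Questions such as the effect of latency or thread migration are then answered by comparing these metrics across runs with the relevant parameter varied.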


10.
Applications of power series in computational geometry
A number of algorithms are presented for obtaining power series expansions of curves and surfaces at a point. Some results on the radius of convergence are given. Two applications of series are given:

1. for curve tracing algorithms, where a truncated series is used to approximate the curve of intersection of two surfaces;

2. to define nth-degree geometric continuity for arbitrary n.

Author Keywords: power series; curve; surface; intersection problems; curve tracing; geometric continuity
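The curve-tracing application can be sketched with a first-order truncated series step along an implicit curve f(x, y) = 0; the paper's algorithms carry higher-order terms of the expansion, which this sketch omits:

```python
def trace_curve(fx, fy, x0, y0, h, steps):
    # First-order truncated series: on f(x, y) = 0, implicit
    # differentiation gives y'(x) = -fx/fy, so y(x + h) ~ y(x) + y'(x)*h.
    # Higher-order terms come from differentiating the relation again.
    pts = [(x0, y0)]
    x, y = x0, y0
    for _ in range(steps):
        y += -fx(x, y) / fy(x, y) * h
        x += h
        pts.append((x, y))
    return pts

# Trace the unit circle x^2 + y^2 - 1 = 0 starting from (0, 1).
pts = trace_curve(lambda x, y: 2 * x, lambda x, y: 2 * y, 0.0, 1.0, 0.01, 10)
```

The traced points drift off the curve by O(h) per step; keeping more series terms, as the paper does, shrinks that error and lengthens the usable step.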


11.
Certification of avionics software is an increasingly important subject, since more and more avionics systems in future aircraft will be software-equipped. The DO-178B standard provides guidelines for software certification. Re-use of software is emerging, partly enabled by the integrated modular avionics concept and driven by the need to reduce life-cycle costs. Re-use, however, requires re-certification or certification of software that was not developed according to DO-178B.

The DO-178B standard was specially developed to provide a certification basis for avionics software without going into details of the software development process. Other standards focus on software engineering aspects. We have used DO-178B as a common basis for comparison with DOD-STD-2167A (military), ESA PSS-05-0 (space), and IEC65A(Secretariat)122 (industry). Comparison topics include:

• life cycles;
• prescribed documentation;
• configuration management;
• verification and validation;
• quality assurance.
All standards prescribe the software development process, emphasizing specific aspects in a certain area of interest. The results of our investigation will assist in understanding the rationale behind several standards, and can be used for:
• certification according to DO-178B of software that was developed using another standard;
• certification of software using DO-178B in concert with another standard.

12.
This article describes the advantages and disadvantages of a finite element programming system, i.e. blocks of thoroughly tested routines that a programmer assembles into a finite element program. The resulting program may be tailor-made to fit a special problem, or a general-purpose finite element program.

The programming system used as an example in this article consists of:

1. NORSAM — finite element programming system

2. DASA — pre- and postprocessors

3. ELLIB — element library

Together they form a complete set of subroutines, from data generation through the necessary routines for matrix manipulation to the presentation of results, including the multilevel superelement technique.

References to finite element programs applying the programming system concept are given at the end of the article. Among others, programs for buckling, elasto-plastic analysis of three-dimensional membranes and solids, nonlinear pipeline problems, acoustic field problems and transient heat conduction in solids have been developed. The multilevel superelement technique has been applied in several of these application programs.

The programming system concept undoubtedly gives a large saving of time and resources, and has proved more reliable than conventional methods for developing finite element programs.


13.
The ARIANE launcher post-mission analysis is done at ARIANESPACE. This activity is called the ‘level 0 post flight analysis’ (PFA) and is carried out after each launch by about 60 engineers working together under the leadership of ARIANESPACE.

The PFA is one of the most critical of ARIANE operations, for several reasons:

• The launch rate (eight a year for ARIANE 4) leaves very little time to carry out all the verification work. Moreover, the PFA is a mandatory step before the next launch is authorized.
• The complexity of the ARIANE launcher places very high demands on the PFA engineers. Moreover, there are problems with the availability of people with relevant expert knowledge (characterized by substantial staff turn-over during the 10-year life of ARIANE 4), which could potentially result in errors or omissions.

It is very important to be able to take into account the experience of the preceding flights and to record the results and the knowledge accumulated for each launch.

• The quality and reliability of the PFA depend mainly on the accessibility of data and on the methodology used.

Because the PFA is still largely done manually, and does not benefit from improved methodologies and advanced technologies providing computerized support for data processing and diagnosis, ARIANESPACE has sponsored MATRA ESPACE to develop a knowledge-based system, called ARIANEXPERT, to support the PFA activity. This system combines AI techniques and numerical analysis techniques with advanced graphical capabilities.

A prototype was delivered in April 1990 and has been used for six months by ARIANESPACE during real PFAs. Several lessons have been drawn from this operational experience and are described in this paper. They concern:

• The utility and justification of the use of AI techniques, mostly arising from the explanation capabilities and the emphasis placed on capturing expert knowledge.
• The difficulties associated with integrating such systems into the exploitation of ARIANE, owing to the introduction of very new tasks.
• The user point of view, which evolved from reluctant to convinced.

14.
Artificial Intelligence is generally recognised as one of the key technologies for future spaceflight, and a number of ambitious applications for on-board use have been proposed already. Such applications still require a good deal of basic research and development, but on-ground applications could make an impact already in the medium term, and will for some time represent the major part of AI use for space missions.

ESA has started the development of a future integrated, mission-independent spacecraft control data processing system called the Advanced Technology Operations System (ATOS) at the European Space Operations Centre. ATOS will employ artificial intelligence techniques to support the operations staff during all mission preparation and implementation phases, in order to cope reliably with complex mission operations and to achieve optimal efficiency in the use of human resources.

ATOS will consist of a number of knowledge based software modules, such as

• Automated mission planning
• Automated operations preparation
• Computer assisted operations
• Advanced operator training,

centred around a Mission Information Base configured for the particular satellite mission, the common data repository for all information required to conduct the mission and operate the spacecraft.

The Mission Information Base will, in addition to numerical data presently found in conventional spacecraft control systems, contain a large amount of ‘knowledge’ about the spacecraft and its mission, which is currently available only in paper documents or embedded in software. It will be implemented as a physically and logically distributed set of databases, each representing a particular field of mission information, such that the knowledge can be dynamically shared between different intelligent spacecraft control applications.


15.
16.
The use of computers in actual system applications is increasing with the availability of intelligent terminals on the shop floor. These terminals can be used by management as tools in the decision-making process of planning shop floor operations. This paper discusses a pilot simulation study in the use of conventional Fortran-based simulation programs by shop floor management to:

1. Participate in the evaluation of proposed FMS systems,

2. Assess the impact of FMS acquisition on existing facilities,

3. Assist in the identification of operational alternatives in “bottleneck” situations.

The pilot study employs a batch-oriented MRP system to provide daily updates of outstanding production center loadings over a monthly planning horizon. Two intelligent terminals are used to access a minicomputer facility that executes the simulation models. The terminals have AT-compatible capabilities and are also used as data acquisition devices that support the numerically controlled operations within each work center.

The simulation models represent the 13 work centers of the firm and provide information about the average utilization of each work center, the number of parts in each queue and the average delay of parts in the queues. Future extensions of the models are planned to utilize the terminals' graphic animation capabilities to display the flow of production orders through the manufacturing facility.
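The per-work-center statistics the models report (utilization, queue delay) can be reproduced with a minimal single-server FIFO sketch; the arrival times and deterministic service time below are invented:

```python
def simulate_work_center(arrivals, service_time):
    # Single-server FIFO queue for one work center.
    # Returns (utilization, mean delay of parts in the queue).
    free_at = 0.0
    busy = 0.0
    delays = []
    for t in arrivals:              # arrival times, sorted
        start = max(t, free_at)     # wait if the machine is busy
        delays.append(start - t)
        free_at = start + service_time
        busy += service_time
    makespan = free_at - arrivals[0]
    return busy / makespan, sum(delays) / len(delays)
```

A shop-floor model would run one such queue per work center, with routing between them and sampled (rather than fixed) service times.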


17.
18.
The synthesis methodology developed by Kimura (1985), based on the design theory of output regulators essentially due to Wonham (1974), has been applied successfully to the flatness control system for a 6-high cold rolling mill. The system has the following remarkable features.

(1) The structure of the controller is simple. This makes the control system easy to tune.

(2) The controller copes well with the detection time delay, so high performance is obtained even at low rolling speed.

(3) The flatness error caused by rolling-force variation during mill acceleration and deceleration is kept to a minimum by a function that adjusts the roll bending force using the rolling-force signal.

Author Keywords: Multivariable systems; Flatness control; Rolling mills; Observers


19.
This paper describes the steps followed in implementing a temperature controller and a supervisory controller in a Supervisory Control And Data Acquisition (SCADA) system to control 60 l and 160 l reactors in a pharmaceutical factory. These reactors are used to produce the initial quantities of drugs needed for clinical tests. The chemical reactions involved change often (several times a week). The different steps of the project are presented:

• design of a new heating–cooling system,
• design of the predictive and Supervisory Control Algorithm (SCA),
• connection of the SCA to the SCADA system,
• experimental validation according to Good Manufacturing Practice (GMP).
This SCA has been in daily use as the standard control system for more than 2 years.
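As an illustration of the kind of temperature loop involved (a plain PI controller on a first-order reactor thermal model, not the authors' predictive SCA; all parameters are invented):

```python
def run_batch(setpoint, steps=5000, dt=0.1, kp=2.0, ki=0.05):
    # PI temperature control of a first-order reactor thermal model:
    # dT/dt = u - (T - ambient) / tau, with u the jacket heating command.
    # Gains kp, ki and plant parameters are invented for this sketch.
    temp, integral, ambient, tau = 20.0, 0.0, 20.0, 50.0
    for _ in range(steps):
        error = setpoint - temp
        integral += error * dt
        u = kp * error + ki * integral   # jacket command
        temp += (u - (temp - ambient) / tau) * dt
    return temp
```

With these invented gains the simulated batch settles on the setpoint; the real SCA adds prediction to cope with the heating–cooling system dynamics and the frequently changing reactions.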

20.
The combinatorial structure of visibility is probably one of the most fascinating and interesting areas of engineering and computer science. The usefulness of visibility graphs in computational geometry and robotic navigation problems such as motion planning, unknown-terrain learning and shortest-path planning cannot be overstressed. The visibility graph, apart from being an important data structure for storing and updating geometric information, is a valuable mathematical tool for probing and understanding the shapes of polygonal and polyhedral objects. In this research we initially focus on a fundamental class of geometric objects that may be looked upon as building blocks for more complex geometric objects and that offer an ideal balance between complexity and simplicity: simple polygons.

A major theme of this paper is the investigation of the combinatorial structure of the visibility graph. More specifically, the goals of this paper are:

(i) To characterize the visibility graphs of simple polygons by obtaining necessary and sufficient conditions a graph must satisfy to qualify as the visibility graph of a simple polygon.

(ii) To obtain hierarchical relationships between visibility graphs of simple polygons with a given number of vertices by treating them as representing simple polygons that are deformations of one another.

(iii) To exploit the potential of complete graphs as natural coordinate systems for addressing the problem of reconstructing a simple polygon from its visibility graph.

We intend to achieve this by defining appropriate “betweenness” relationships on points with respect to the edges of the complete graphs.
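For intuition, a brute-force O(n³) construction of the visibility graph of a simple polygon (the paper characterizes such graphs rather than computing them; vertices are assumed in general position):

```python
def cross(o, a, b):
    # 2D cross product of vectors o->a and o->b.
    return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

def properly_intersect(p1, p2, p3, p4):
    # Segments p1p2 and p3p4 cross at an interior point of both.
    d1, d2 = cross(p3, p4, p1), cross(p3, p4, p2)
    d3, d4 = cross(p1, p2, p3), cross(p1, p2, p4)
    return (d1 > 0) != (d2 > 0) and (d3 > 0) != (d4 > 0)

def point_in_polygon(pt, poly):
    # Standard ray-casting test.
    x, y = pt
    inside, n = False, len(poly)
    for i in range(n):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            if x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

def visibility_graph(poly):
    # Vertices i, j are visible if the open segment ij stays inside the
    # polygon and crosses no non-incident boundary edge.
    n = len(poly)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if j == i + 1 or (i == 0 and j == n - 1):
                edges.add((i, j))      # polygon edges are always visible
                continue
            blocked = any(
                properly_intersect(poly[i], poly[j], poly[k], poly[(k + 1) % n])
                for k in range(n)
                if k not in (i, j) and (k + 1) % n not in (i, j))
            mid = ((poly[i][0] + poly[j][0]) / 2, (poly[i][1] + poly[j][1]) / 2)
            if not blocked and point_in_polygon(mid, poly):
                edges.add((i, j))
    return edges
```

The midpoint test distinguishes diagonals that run through the interior from chords that pass outside a reflex notch, which is exactly the structure the characterization goals above must capture.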

