The early history of applying electronic computers to the task of translating natural languages is chronicled, from the first suggestions by Warren Weaver in March 1947 to the first demonstration of a working, if limited, program in January 1954.
Design and development solutions can be enhanced by consulting appropriate guidelines. Although a wide range of guidelines exists, guideline-users frequently do not realize their full benefits because of the costs associated with their use. Guideline-users are people who use guidelines to support purposeful activity. The major cost drivers for guideline-users are the processes of 'selecting' appropriate guidelines and subsequently 'translating' them to an applied setting, both of which can be prohibitively expensive. A strategy for producing guidelines that minimizes these costs is proposed, illustrated by a case study concerned with developing guidelines to assist in the production of management and administrative tools that support project managers engaged in Human Factors Acceptance Testing. A process to support the assessment of guidelines is also proposed.
Databases are a critical element of virtually all conventional and e-business applications. How does an organization know whether the information derived from a database is any good? To ensure a quality database application, should the emphasis during model development be on the application of quality-assurance metrics (designing it right)? A large number of database applications fail or are unusable. A quality process does not necessarily lead to a usable database product: a database application can be ‘well-formed’, with high data quality, yet lack semantic or cognitive fidelity (the right design). This paper expands on the growing body of literature on data quality by proposing additions to a hierarchy of database quality dimensions that includes model and behavioral factors in addition to process and data factors.
The performance of conjugate gradient (CG) algorithms for the solution of the system of linear equations that results from finite-differencing the neutron diffusion equation was analyzed on SIMD, MIMD, and mixed-mode parallel machines. A block preconditioner based on the incomplete Cholesky factorization was used to accelerate the conjugate gradient search. The issues involved in mapping both the unpreconditioned and preconditioned conjugate gradient algorithms onto the mixed-mode PASM prototype, the SIMD MasPar MP-1, and the MIMD Intel Paragon XP/S are discussed. On PASM, the mixed-mode implementation outperformed either SIMD or MIMD alone. Theoretical performance predictions were analyzed and compared with the experimental results on the MasPar MP-1 and the Paragon XP/S. Other issues addressed include the impact of the number of processors on execution time, the effect of the interprocessor communication network on performance, and the relationship between the number of processors and the quality of the preconditioning. Application studies such as this are necessary in the development of software tools for mapping algorithms onto either a single parallel machine or a heterogeneous suite of parallel machines.
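The preconditioned CG iteration analyzed above can be sketched serially as follows. This is a minimal illustration only: a 1-D finite-difference stencil stands in for the neutron diffusion system, and a diagonal (Jacobi) preconditioner stands in for the paper's block incomplete-Cholesky preconditioner; the matrix size, tolerance, and function names are assumptions for the example.

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, max_iter=200):
    """Preconditioned conjugate gradient for a symmetric positive-definite A.
    M_inv(r) applies the preconditioner inverse to a residual vector."""
    x = np.zeros_like(b)
    r = b - A @ x                  # initial residual
    z = M_inv(r)                   # preconditioned residual
    p = z.copy()                   # initial search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)      # step length along p
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p  # conjugate direction update
        rz = rz_new
    return x

# 1-D finite-difference Laplacian stencil: tridiagonal SPD system
n = 64
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
jacobi = lambda r: r / np.diag(A)  # diagonal stand-in preconditioner
x = pcg(A, b, jacobi)
```

A stronger preconditioner such as incomplete Cholesky reduces the iteration count further, at the cost of a triangular solve per iteration whose parallel mapping is exactly the issue the study examines.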
Optimizing compilers for data-parallel languages such as High Performance Fortran perform a complex sequence of transformations. However, the effects of many transformations are not independent, which makes it challenging to generate high-quality code. In particular, some transformations introduce conditional control flow, while others make some conditionals unnecessary by refining program context. Eliminating unnecessary conditional control flow during compilation can reduce code size and remove a source of overhead in the generated code. This paper describes algorithms to compute symbolic constraints on the values of expressions used in control predicates and to use these constraints to identify and remove unnecessary conditional control flow. These algorithms have been implemented in the Rice dHPF compiler, and we show that they are effective in reducing both the number of conditionals and the overall size of generated code. Finally, we describe a synergy between control-flow simplification and data-parallel code generation based on loop splitting, which achieves the effects of narrower data-parallel compiler optimizations such as vector message pipelining and the use of overlap areas.
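The loop-splitting idea behind this simplification can be sketched as follows. This is an illustrative Python analogy, not the dHPF algorithm itself: the function names and the predicate `i < k` are invented for the example.

```python
# Original loop: a guard is tested on every iteration.
def guarded(a, k):
    out = []
    for i in range(len(a)):
        if i < k:                  # conditional evaluated n times
            out.append(a[i] * 2)
        else:
            out.append(a[i])
    return out

# After splitting the iteration space at i == k, each loop's context
# implies the predicate symbolically, so the conditional disappears.
def split(a, k):
    out = [a[i] * 2 for i in range(k)]           # here i < k always holds
    out += [a[i] for i in range(k, len(a))]      # here i >= k always holds
    return out
```

A compiler performs the same reasoning on symbolic bounds: once the constraint `i < k` is known to hold throughout a split loop, the branch is provably dead and can be removed, which is also what makes optimizations like overlap areas fall out of the same mechanism.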
The paper describes the fabrication of a novel miniature sensor for electrical tomography. The sensor comprises a number of copper electrodes fabricated around a small hole etched through a silicon wafer. The copper electrodes are electroplated to fill channels formed in thick photo-resist on top of the silicon wafer. Electrodes with a thickness of 60 μm, surrounding a hole of diameter 300 μm, have been realised. Initial measurements have been made using a commercial LCR meter applied to an eight-electrode sensor, and images of an 80 μm diameter wire have been obtained. Future work will consider the integration of measurement circuitry alongside the electrodes in order to reduce parasitic capacitances.
The modern enterprise has become increasingly dependent on data and its value-added forms of information and knowledge to remain competitive in the face of global competition and constant change. Data architecture provides the framework necessary to use and share data more effectively, and to improve the flow of data between systems within the enterprise and between the enterprise's systems and those of its customers, suppliers, and business partners. Data architecture standards constitute the foundation of an effective data architecture.