Similar Documents
20 similar documents found (search time: 10 ms)
1.
Woodward  P.R. 《Computer》1996,29(10):99-111
I am fortunate to have had access to supercomputers for the last 28 years. Over this time I have used them to simulate time-dependent fluid flows in the compressible regime. Strong shocks and unstable multifluid boundaries, along with the phenomenon of fluid turbulence, have provided the simulation complexity that demands supercomputer power. The supercomputers I have used (the CDC 6600, 7600, and Star-100; the Cray-1, Cray-XMP, Cray-2, and Cray C-90; the Connection Machines CM-2 and CM-5; the Cray T3D; and the Silicon Graphics Challenge Array and Power Challenge Array) span three revolutions in supercomputer design: the introduction of vector supercomputing, parallel supercomputing on multiple CPUs, and supercomputing on hierarchically organized clusters of microprocessors with cache memories. The last revolution is still in progress, so its outcome is somewhat uncertain. I view these design revolutions through the prism of my specialty and through applications of the supercomputers I have used. Also, because these supercomputer design changes have driven equally important changes in numerical algorithms and the programs that implement them, I describe the three revolutions from this perspective.

2.
The Communications Assistance for Law Enforcement Act (CALEA) requires telecommunications providers, including VoIP and broadband ISPs, to provide wiretapping capabilities with their services. Law enforcement and the telecommunications industry must work together to set CALEA-compliant standards.

3.
Model predictive control: Review of the three decades of development
Three decades have passed since milestone publications by several industrialists spawned a flurry of research and industrial/commercial activities on model predictive control (MPC). This article reviews major developments and achievements during the three decades and attempts to put them in perspective. The first decade is characterized by the fast-growing industrial adoption of the technology, primarily in the refining and petrochemical sectors, which sparked much interest and also confusion among academicians. The second decade saw a number of significant advances in understanding MPC from a control theoretician’s viewpoint, including state-space interpretations/formulations and stability proofs. These theoretical triumphs contributed to the making of the second generation of commercial software, which was significantly enhanced in generality and rigor. The third decade’s main focus has been on the development of “fast MPC,” a term chosen to collectively describe the various efforts to bring orders-of-magnitude improvement in the efficiency of the on-line computation so that the technology can be applied to systems requiring very fast sampling rates. Throughout the three decades of development, theory and practice supported each other quite effectively, a primary reason for the technology’s fast and steady rise.
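For readers unfamiliar with the receding-horizon idea at the core of MPC, the sketch below shows one step of unconstrained linear MPC in Python: predictions over a finite horizon are stacked, a quadratic cost is minimized in closed form, and only the first input is applied before the problem is re-solved at the next sample. The double-integrator model, weights, and horizon length are illustrative assumptions, not taken from any of the commercial packages discussed above.

```python
# Minimal receding-horizon (MPC) sketch for an unconstrained discrete-time
# linear system x[k+1] = A x[k] + B u[k] with a quadratic cost.
# All matrices, weights, and the horizon below are illustrative assumptions.
import numpy as np

def mpc_step(A, B, Q, R, x0, x_ref, N):
    """Solve the N-step unconstrained MPC problem and return the first input."""
    n, m = B.shape
    # Stacked prediction model: X = Phi x0 + Gamma U
    Phi = np.vstack([np.linalg.matrix_power(A, k + 1) for k in range(N)])
    Gamma = np.zeros((N * n, N * m))
    for i in range(N):
        for j in range(i + 1):
            Gamma[i*n:(i+1)*n, j*m:(j+1)*m] = np.linalg.matrix_power(A, i - j) @ B
    Qbar = np.kron(np.eye(N), Q)              # stacked state weights
    Rbar = np.kron(np.eye(N), R)              # stacked input weights
    Xref = np.tile(x_ref, N)
    # Minimize (Phi x0 + Gamma U - Xref)' Qbar (.) + U' Rbar U over U
    H = Gamma.T @ Qbar @ Gamma + Rbar
    g = Gamma.T @ Qbar @ (Xref - Phi @ x0)
    U = np.linalg.solve(H, g)
    return U[:m]                              # receding horizon: apply only u[0]

# Toy double-integrator regulation example (hypothetical numbers).
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.005], [0.1]])
Q, R = np.diag([10.0, 1.0]), np.array([[0.1]])
x = np.array([1.0, 0.0])
for _ in range(50):
    u = mpc_step(A, B, Q, R, x, np.zeros(2), N=20)
    x = A @ x + B @ u
print("final state:", x)
```

Constrained, industrial-scale MPC replaces the closed-form solve with a quadratic-programming solver, which is exactly where the "fast MPC" work described above concentrates its effort.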

4.

We present a comprehensive review of the evolutionary design of neural network architectures. This work is motivated by the fact that the success of an Artificial Neural Network (ANN) depends heavily on its architecture, and among many approaches Evolutionary Computation, a set of global-search methods inspired by biological evolution, has proved to be an efficient approach for optimizing neural network structures. Initial attempts to automate architecture design with evolutionary approaches date to the late 1980s and have attracted significant interest ever since. In this context, we examine the historical progress and analyze all relevant scientific papers, with a special emphasis on how evolutionary computation techniques were adopted and on the various encoding strategies proposed. We summarize key aspects of methodology, discuss common challenges, and survey the works in chronological order by dividing the entire timeframe into three periods. The first period covers early works focusing on the optimization of simple ANN architectures, with a variety of solutions proposed for chromosome representation. The second period saw the rise of more powerful methods and hybrid approaches. In parallel with recent advances, the last period covers the Deep Learning era, in which the research direction shifted towards configuring advanced deep neural network models. Finally, we propose open problems for future research in the field of neural architecture search and provide insights for fully automated machine learning. Our aim is to provide a complete reference of works on this subject and guide researchers towards promising directions.
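As a toy illustration of the direct-encoding idea that dominates the first period above, the sketch below evolves the hidden-layer sizes of a small multilayer perceptron with a mutation-plus-truncation-selection loop, using scikit-learn's MLPClassifier for fitness evaluation on a synthetic dataset. The dataset, population size, and mutation operators are illustrative assumptions, not a reconstruction of any surveyed method.

```python
# Toy neuroevolution sketch: a chromosome is a tuple of hidden-layer sizes,
# and fitness is the validation accuracy of an MLP trained with that
# architecture. Everything below is illustrative, not from a surveyed paper.
import random
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=600, noise=0.25, random_state=0)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)

def fitness(layers):
    """Train an MLP with the encoded architecture; return validation accuracy."""
    clf = MLPClassifier(hidden_layer_sizes=layers, max_iter=1000, random_state=0)
    clf.fit(X_tr, y_tr)
    return clf.score(X_va, y_va)

def mutate(layers):
    """Randomly resize, add, or drop a hidden layer."""
    layers = list(layers)
    op = random.choice(["resize", "add", "drop"])
    if op == "resize":
        i = random.randrange(len(layers))
        layers[i] = max(2, layers[i] + random.choice([-4, 4]))
    elif op == "add" and len(layers) < 4:
        layers.append(random.choice([4, 8, 16]))
    elif op == "drop" and len(layers) > 1:
        layers.pop(random.randrange(len(layers)))
    return tuple(layers)

random.seed(0)
population = [(random.choice([4, 8, 16]),) for _ in range(6)]
for gen in range(5):
    ranked = sorted(population, key=fitness, reverse=True)
    parents = ranked[:3]                       # truncation selection
    population = parents + [mutate(random.choice(parents)) for _ in range(3)]
    print(f"gen {gen}: best {parents[0]}, accuracy {fitness(parents[0]):.3f}")
```

Modern neural architecture search replaces this direct encoding with far richer search spaces and surrogate-assisted evaluation, but the evolutionary loop itself is essentially unchanged.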


5.
Predetermined motion time systems (PMTS) are widely used in industry for setting production standards. In recent years, the popularity of PMTS has grown considerably. This is largely due to the release of the United States Air Force Standard MIL-STD-1567A, the development and widespread use of microcomputers, and the subjectivity of time studies. This paper reviews and evaluates various PMTS. Recommendations for future research are also made.

6.

The ever-increasing complexity of numerical models and the associated computational demands have challenged classical reliability analysis methods. Surrogate-model-based reliability analysis techniques, and in particular those using kriging meta-models, have recently gained considerable attention for their ability to achieve high accuracy and computational efficiency. However, existing stopping criteria, which are used to terminate the training of surrogate models, do not directly relate to the error in the estimated failure probabilities. This limitation can lead to high computational demands because of unnecessary calls to costly performance functions (e.g., involving finite element models) or to potentially inaccurate estimates of the failure probability due to premature termination of the training process. Here, we propose the error-based stopping criterion (ESC) to address these limitations. First, it is shown that the total number of wrong-sign estimates of the performance function for candidate design samples by kriging, S, follows a Poisson binomial distribution. This finding is then used to estimate lower and upper bounds on S, at a given confidence level, for the sets of candidate design samples classified by kriging as safe and unsafe. An upper bound on the error of the estimated failure probability is subsequently derived from the probabilistic properties of the Poisson binomial distribution. The proposed upper bound is implemented in the kriging-based reliability analysis method as the stopping criterion. The efficiency and robustness of ESC are investigated here using five benchmark reliability analysis problems. Results indicate that the proposed method achieves the set accuracy target and substantially reduces the computational demand, in some cases by over 50%.
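Under simplifying assumptions, the sketch below shows how such an error bound can be evaluated: the kriging predictive mean and standard deviation give each candidate sample a wrong-sign probability, the number of misclassified samples is then Poisson binomial, and a confidence-level quantile of that count bounds the error of the estimated failure probability. The synthetic kriging outputs and the normal approximation to the Poisson binomial quantile are this sketch's assumptions; the paper works with the exact distribution.

```python
# Sketch of an error-based stopping criterion (ESC) for kriging-based
# reliability analysis. Kriging predictions below are synthetic stand-ins,
# and the Poisson binomial quantile is replaced by a normal approximation.
import numpy as np
from scipy.stats import norm

def misclassification_probs(mu, sigma):
    """P(true performance-function sign differs from the kriging-mean sign)."""
    return norm.cdf(-np.abs(mu) / sigma)

def failure_probability_with_bound(mu, sigma, confidence=0.95):
    """Estimated Pf and an (approximate) upper bound on its error."""
    n = len(mu)
    unsafe = mu <= 0.0                        # samples kriging classifies as failures
    pf_hat = unsafe.mean()
    z = norm.ppf(confidence)
    bound = 0.0
    for mask in (unsafe, ~unsafe):            # wrong signs possible in each class
        p = misclassification_probs(mu[mask], sigma[mask])
        mean, var = p.sum(), (p * (1.0 - p)).sum()
        bound += mean + z * np.sqrt(var)      # approx. upper quantile of the count
    return pf_hat, bound / n

# Synthetic kriging output for 100,000 candidate samples (hypothetical numbers).
rng = np.random.default_rng(0)
mu = rng.normal(2.0, 1.0, 100_000)            # predictive means of g(x)
sigma = np.full_like(mu, 0.3)                 # predictive standard deviations
pf_hat, err = failure_probability_with_bound(mu, sigma)
print(f"Pf estimate: {pf_hat:.4f}, error bound: {err:.5f}")
# Training would stop once err is small relative to pf_hat (e.g. err < 0.05*pf_hat);
# otherwise the most uncertain sample is added to the design of experiments.
```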


7.
Standards have become a basis of global competition among countries. Although there are many studies of standards and standardization, little is known about how international standards are set. Even less is known about how this occurs in the unprecedented case in which a developing country is actively involved in the process. In the past few years, China, leveraging the huge size of its domestic markets, has attempted to influence international technology standard setting. Standardization, especially at the international level, often revolves around building an alliance surrounding a particular technology. Actor-network theory (ANT) helps analyze the ways in which actors form alliances and enroll other actors to strengthen such alliances surrounding a technology. Therefore, we see a fit between the study of standard setting and ANT. In this paper, we use ANT to investigate the process of mobile standard setting in an international context where firms, industry consortia, and governments collaborate and compete in complex ways. We find that China’s attempt to set WAPI as a national standard failed to enroll other actors mainly because WAPI was too closed a standard, even for a de jure one: China did not release the WAPI security algorithm to the scrutiny of the international community.

8.
In engineering practice, the need to modify preset parameters arises frequently. This paper describes a method, applied to SIEMENS SIMATIC S7 programmable controllers, for carrying out flexible parameter setting from the supervisory computer's screens through the HMI software WINCC while operation is being monitored. The method is accurate, reliable, and easy to apply.

9.
S2S: structural-to-syntactic matching similar documents
Management of large collections of replicated data in centralized or distributed environments is important for many systems that provide data mining, mirroring, storage, and content distribution. In its simplest form, the documents are generated, duplicated, and updated by emails and web pages. Although redundancy may increase reliability to a degree, uncontrolled redundancy degrades retrieval performance and may be useless if the returned documents are obsolete. Document similarity matching algorithms do not provide information on the differences between documents, and file synchronization algorithms are usually inefficient and ignore the structural and syntactic organization of documents. In this paper, we propose the S2S matching approach. S2S matching is composed of structural and syntactic phases to compare documents. First, in the structural phase, documents are decomposed into components according to their syntax and compared at a coarse level. The structural mapping processes the decomposed documents based on their syntax without actually mapping at the word level, and it can be applied hierarchically based on the structural organization of a document. Second, the syntactic matching algorithm uses a heuristic look-ahead algorithm for matching consecutive tokens, with a verification patch. Our two-phase S2S matching approach provides faster results than currently available string matching algorithms.
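A rough sketch of this two-phase idea is given below: a structural pass compares hashed paragraphs without any word-level mapping, and a syntactic pass matches tokens with a bounded look-ahead confirmed by a small verification window. The paragraph-based notion of structure, the window sizes, and the difference bookkeeping are this sketch's assumptions rather than the paper's actual algorithm.

```python
# Sketch of a two-phase, S2S-style comparison: a coarse structural pass over
# hashed paragraphs, then a token-level look-ahead match with a small
# verification window. Everything below is illustrative.
import hashlib

def structural_phase(doc_a, doc_b):
    """Compare aligned paragraphs by hash; return indices that need token matching."""
    return [i for i, (a, b) in enumerate(zip(doc_a, doc_b))
            if hashlib.md5(a.encode()).digest() != hashlib.md5(b.encode()).digest()]

def syntactic_phase(tokens_a, tokens_b, lookahead=5, verify=2):
    """Greedy look-ahead token matcher; returns (matched, changed) token counts."""
    i = j = matched = changed = 0
    while i < len(tokens_a) and j < len(tokens_b):
        if tokens_a[i] == tokens_b[j]:
            matched += 1
            i, j = i + 1, j + 1
            continue
        # Look ahead for a re-synchronization point, confirmed by `verify` tokens.
        found = False
        for di in range(lookahead):
            for dj in range(lookahead):
                patch_a = tokens_a[i + di:i + di + verify]
                if patch_a and patch_a == tokens_b[j + dj:j + dj + verify]:
                    changed += max(di, dj)
                    i, j = i + di, j + dj
                    found = True
                    break
            if found:
                break
        if not found:                          # no nearby match: skip one token in each
            changed += 1
            i, j = i + 1, j + 1
    return matched, changed + (len(tokens_a) - i) + (len(tokens_b) - j)

doc_a = ["the quick brown fox jumps over the lazy dog", "same paragraph here"]
doc_b = ["the quick red fox leaps over the lazy dog", "same paragraph here"]
for idx in structural_phase(doc_a, doc_b):
    m, c = syntactic_phase(doc_a[idx].split(), doc_b[idx].split())
    print(f"paragraph {idx}: {m} matched tokens, {c} changed")
```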
Ramazan S. Aygün

10.
D2S: Document-to-sentence framework for novelty detection
Novelty detection aims at identifying novel information in an incoming stream of documents. In this paper, we propose a new framework for document-level novelty detection using document-to-sentence (D2S) annotations and discuss the applicability of this method. D2S first segments a document into sentences, determines the novelty of each sentence, and then computes the document-level novelty score, which is compared against a fixed threshold. Experimental results on APWSJ data show that D2S outperforms standard document-level novelty detection in terms of redundancy-precision (RP) and redundancy-recall (RR). We applied D2S to the document-level data from the TREC 2004 and TREC 2003 Novelty Tracks and found that D2S is useful in detecting novel information in data with a high percentage of novel documents. However, D2S shows a strong capability to detect redundant information regardless of the percentage of novel documents. D2S has been successfully integrated in a real-world novelty detection system.
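A simplified sketch of the D2S idea follows: each incoming document is split into sentences, each sentence is scored against everything seen so far with a bag-of-words cosine similarity, and the document is declared novel when the fraction of novel sentences clears a threshold. The similarity measure and both threshold values are illustrative assumptions, not the settings used with the APWSJ or TREC data.

```python
# Simplified document-to-sentence (D2S) sketch: split a document into sentences,
# score each sentence against previously seen sentences, and call the document
# novel if enough of its sentences are novel. Thresholds below are illustrative.
import re
from collections import Counter
from math import sqrt

def sentences(doc):
    return [s.strip() for s in re.split(r"[.!?]+", doc) if s.strip()]

def cosine(a, b):
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[t] * cb[t] for t in ca)
    na = sqrt(sum(v * v for v in ca.values()))
    nb = sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def d2s_is_novel(doc, history, sent_threshold=0.8, doc_threshold=0.4):
    """Return (is_novel, score), where score is the fraction of novel sentences."""
    sents = sentences(doc)
    novel = [s for s in sents
             if all(cosine(s, h) < sent_threshold for h in history)]
    score = len(novel) / len(sents) if sents else 0.0
    history.extend(sents)                     # remember sentences, not documents
    return score >= doc_threshold, score

history = []
stream = [
    "The river flooded the valley. Hundreds of homes were damaged.",
    "Hundreds of homes were damaged. The river flooded the valley again.",
]
for doc in stream:
    print(d2s_is_novel(doc, history))
```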

11.
Smith SA  Norris BJ 《Ergonomics》2004,47(11):1195-1207
The major sources of published anthropometric data on children are now over two decades old. Owing to concern about the continued validity of such data, changes in the body sizes of the UK child population over the past three decades have been considered. Comparisons were also made between the size of the current UK child population and the current US child population, and with the most comprehensive source of measured data on US children (which is now over 20 years old). The growth of children in the UK and US over the past three decades was assessed for an indication of secular growth trends. Stature increases were generally found to be smaller (as a percentage) than body-weight increases at the 5th percentile, mean, and 95th percentile levels for UK children, and UK children were found to be closer in size to US children now than they were 30 years ago.

12.
《Ergonomics》2012,55(11):1195-1207

13.
Software engineering standards often rest on different underpinning metamodels and ontologies, which sometimes differ between standards. For better adoption by industry, harmonization of these standards through the use of a domain ontology has been advocated. In this paper we apply this approach in a proof-of-concept project. We recommend the creation of a single underpinning abstract domain ontology, built from existing ISO/IEC standards, including ISO/IEC 24744 and 24765, and supplemented by any other sources authorized by SC7 as appropriate and useful. Adopting a single ontology will permit existing International Standards such as 12207, 15288 and 33061 to be re-engineered as refinements of this domain ontology, so that these variously focussed standards can all interoperate.

14.
15.
《Data Processing》1983,25(10):21-24
Teletex allows the transfer of letter-quality documents on an international basis between terminals supplied by different manufacturers. The paper describes the international agreements covering the standardization of teletex transmission, discusses the costs and advantages of teletex, and reviews likely future developments in this field.

16.
《Data Processing》1985,27(4):37-39
Many countries now operate videotex services and, as is the case with any new technology, there is a danger that many incompatible standards will evolve. A unified videotex standard can be developed to make videotex a truly international electronic information service.

17.
18.
SUSY 2
This package deals with supersymmetric functions and with the algebra of supersymmetric operators in extended N = 2 as well as in nonextended N = 1 supersymmetry. It allows us to construct a realization of the SuSy algebra of differential operators, to compute the gradients of given SuSy Hamiltonians, and to obtain the SuSy version of soliton equations using the SuSy Lax approach. Many additional procedures encountered in the SuSy soliton approach are also included, for example the conjugation of a given SuSy operator, the computation of a general form of SuSy Hamiltonians (up to SuSy-divergence equivalence), and the checking of the validity of the Jacobi identity for some SuSy Hamiltonian operators.

19.
《Computers and Standards》1983,2(2-3):143-146
The status of possible ‘candidate’ languages for format standardisation, particularly at international level, is reviewed, with special reference to developments during the period January 1980 to May 1983.

20.