Similar Literature
20 similar documents found.
1.
Facing the challenge that the complexity of high-performance computing systems multiplies as their performance grows, this work studies the optimization of complex scientific computing applications and proposes CPTF, a performance optimization framework designed around software and hardware characteristics. Based on runtime profiling results, combined with the software characteristics of the application and the hardware characteristics of the platform, the framework analyzes system performance bottlenecks and their types globally and offers source-level optimization suggestions. For loop optimization, a common class of problems, an improved loop-fusion algorithm is proposed. Using CPTF to optimize a material point method particle simulation application achieved a performance improvement of nearly 20%.
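The abstract does not give CPTF's improved loop-fusion algorithm; as a rough illustration of the transformation it builds on, the sketch below merges two loops over the same index range into one (the example computation is an assumption, not taken from the paper):

```python
# Loop fusion in its simplest form: two passes over the same index range
# are merged into one, so a[i] and the freshly computed b[i] are reused
# while still hot in cache.
n = 100_000
a = [float(i) for i in range(n)]

# Before fusion: two separate traversals.
b = [2.0 * a[i] for i in range(n)]
c = [b[i] + a[i] for i in range(n)]

# After fusion: one traversal computing both results.
b2, c2 = [], []
for i in range(n):
    bi = 2.0 * a[i]
    b2.append(bi)
    c2.append(bi + a[i])

assert b == b2 and c == c2
```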

2.
A study is made of process simulation in inverse situations. Some problems arising in this approach are discussed, and a study is made of the choice of solution form, as well as of solution technique.
Notation: L_a, model; a, unknown parameter vector; u, function describing the process; u, observed values; true state; measurement error; norm of uncertainty; f, effect; U, H, F, metric spaces; T, upper limit of measurement; u_am, ambient temperature.
Translated from Inzhenerno-Fizicheskii Zhurnal, Vol. 39, No. 2, pp. 236–241, August, 1980.
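The notation suggests the standard ill-posed parameter-identification setting; a plausible formulation consistent with it (an assumption, not quoted from the paper) is:

```latex
% Plausible formulation assumed from the notation above, not verbatim
% from the paper. The model L_a relates the unknown parameter vector a
% and the effect f to the process u, in the metric spaces U, H, F:
\[
  L_a u = f, \qquad u \in U,\ a \in H,\ f \in F, \qquad t \in [0, T].
\]
% Only noisy observations \tilde{u} of the true state are available, and
% the inverse problem is to choose the parameter vector whose simulated
% process best matches them:
\[
  \hat{a} = \arg\min_{a \in H} \bigl\| u(a) - \tilde{u} \bigr\|_U .
\]
```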

3.
Because of its high efficiency, energy conservation, simple operation, wide application range, and small size, the high-speed universal pulverizer has been well received by customers. However, its electric motor can overheat during operation, which hinders continuous use of the pulverizer. In this study, a series of efforts were made to address this problem. First, a detailed analysis of the working principle of the pulverizer was conducted and an optimization plan was proposed, consisting of punching ventilation holes in the surface of the original pulverizer. Simulations of the pulverizer flow field before and after optimization were performed. The hydrodynamic simulation results were used to conduct a steady-state thermal analysis of the pulverizer, investigating the influence of the flow field on heat transfer. Additionally, experimental investigations were conducted on the pulverizer before and after optimization in order to measure and compare the parameters (motor working temperature, wind speed and temperature of the motor cooling system, vibration, noise, and pulverizing degree of the material) influencing the performance of the pulverizer. The numerical simulation results showed an increase in heat transfer caused by the increase in air flow volume and velocity when air was injected into the pulverizer through bottom and side holes. Experimental results showed that the pulverizer with air injection through holes had the best performance when temperature, vibration, and refinement effect were considered as performance indicators. The full text can be downloaded at https://link.springer.com/article/10.1007/s40436-017-0208-3

4.
Maxville V., Armarego J., Lam C.P. Software, IET, 2009, 3(5): 369-380
With the increasing use of component-based development (CBD), the process for selecting software from repositories is a critical concern for quality systems development. As support for developers blending in-house and third-party software, the context-driven component evaluation (CdCE) process provides a three-phase approach to software selection: filtering to a shortlist, functional evaluation, and ranking. The process was developed through iterative experimentation on real-world data. CdCE has tool support to generate classifier models, shortlists and test cases as artefacts that provide for a repeatable, transparent process that can be reused as the system evolves. Although developed for software component selection, the CdCE process framework can be easily modified for other selection tasks by substituting templates, tools, evaluation criteria and/or repositories. In this article the authors describe the CdCE process and its development, present the CdCE framework as a reusable pattern for software selection, and provide a case study where the process is applied.
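A minimal sketch of a CdCE-style three-phase pipeline (filter, functional evaluation, rank); the function names, data shapes and scoring scheme are illustrative assumptions, not the published tooling:

```python
# Phase 1 shortlists components on hard criteria, phase 2 runs functional
# tests against the shortlist, phase 3 ranks by the resulting score.
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    attributes: dict       # metadata used for filtering
    test_score: float = 0.0

def filter_phase(repo, required):
    """Phase 1: keep components whose metadata meets every hard criterion."""
    return [c for c in repo
            if all(c.attributes.get(k) == v for k, v in required.items())]

def evaluate_phase(shortlist, test_suite):
    """Phase 2: run functional tests; record the pass rate per component."""
    for c in shortlist:
        results = [test(c) for test in test_suite]   # each test -> 0 or 1
        c.test_score = sum(results) / len(results)
    return shortlist

def rank_phase(shortlist):
    """Phase 3: rank candidates by functional evaluation score."""
    return sorted(shortlist, key=lambda c: c.test_score, reverse=True)
```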

5.
Research on the optimization of stochastic systems via simulation often centers on the development of algorithms for which global convergence can be guaranteed. On the other hand, commercial software applications that perform optimization via simulation typically employ search heuristics that have been successful in deterministic settings. Such search heuristics give up on global convergence in order to be more generally applicable and to yield rapid progress toward good solutions. Unfortunately, commercial applications do not always formally account for the randomness in simulation responses, meaning that their progress may be no better than a random search if the variability of the outputs is high. In addition, they do not provide statistical guarantees about the "goodness" of the final results. In practice, simulation studies often rely heavily on engineers who, in addition to developing the simulation model and generating the alternatives to be compared, must also perform the statistical analyses off-line. This is a time- and labor-consuming process. In this paper, we report on the work we have done to implement statistical error control within a heuristic search procedure, and on our automated procedure to deliver a statistical guarantee after the search procedure is finished. We describe how we implemented these techniques in software developed for JGC Corporation of Japan.
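One common way to deliver the kind of post-search guarantee described above is a clean-up selection phase; the sketch below shows the generic idea (assumed details, not the JGC implementation): re-simulate the few best solutions with enough replications that the reported winner carries a confidence statement.

```python
# After the heuristic search ends, replicate the top candidates and pick
# the one with the best estimated mean, reporting a confidence half-width.
import math
import statistics

def cleanup_selection(candidates, simulate, n_reps=50, z=1.96):
    """candidates: list of final solutions; simulate(x) -> one noisy output.
    Returns the apparent best (minimization) plus per-candidate statistics."""
    summary = []
    for x in candidates:
        ys = [simulate(x) for _ in range(n_reps)]
        mean = statistics.fmean(ys)
        half = z * statistics.stdev(ys) / math.sqrt(n_reps)  # ~95% half-width
        summary.append((x, mean, half))
    best = min(summary, key=lambda t: t[1])
    return best, summary
```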

6.
Sotirios K. Goudos. Materials & Design, 2007, 28(10): 2585-2595
A computer-aided design (CAD) tool for the design of planar multi-layer coatings with high absorption for a desired frequency and angle range is presented. The tool uses deterministic and evolutionary optimization design methods. Both single- and multi-objective design algorithms can be used, and either a single absorber design or the Pareto front can be found accordingly. A novel design technique utilizing particle swarm optimization (PSO) is also presented. A user-defined or a pre-defined design case can be selected interchangeably, and materials can be selected from a pre-defined database. The tool can be useful for both educational and research purposes. The efficiency of the tool is demonstrated through several design cases that are in agreement with existing literature data.

7.
A generic constraint-handling framework for use with any swarm-based optimization algorithm is presented. For swarm optimizers to solve constrained optimization problems effectively, modifications have to be made to the optimizers to handle the constraints; however, such constraint-handling frameworks are often not universally applicable to all swarm algorithms. A constraint-handling framework is therefore presented in this paper that is compatible with any swarm optimizer, such that a user can wrap it around a chosen swarm algorithm and perform constrained optimization. The method, called separation-sub-swarm, works by dividing the population based on the feasibility of individual agents. This allows all feasible agents to move by the existing swarm optimizer's rules, hence preserving the good performance and convergence characteristics of the individual swarm algorithms. The framework is tested on a suite of analytical test functions and a number of engineering benchmark problems, and compared to other generic constraint-handling frameworks using four different swarm optimizers: particle swarm, gravitational search, a hybrid algorithm and differential evolution. It is shown that the new framework produces superior results compared to the established frameworks for all four swarm algorithms tested. Finally, the framework is applied to an aerodynamic shape optimization design problem where a shock-free solution is obtained.
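A minimal sketch of the separation idea described above: the population is split by feasibility, feasible agents move by the unmodified base optimizer, and infeasible agents work only on reducing violation. The update rule for infeasible agents here is an illustrative assumption, not the paper's exact rule.

```python
import numpy as np

def constrained_step(positions, velocities, violation_fn, swarm_update):
    """positions, velocities: (n, d) arrays; violation_fn(x) -> max
    constraint violation (<= 0 means feasible); swarm_update: one move of
    the chosen, unmodified swarm optimizer."""
    violation = np.array([violation_fn(x) for x in positions])
    feasible = violation <= 0.0

    # Feasible sub-swarm: move by the base swarm algorithm, so its
    # convergence behaviour on the objective is preserved.
    if feasible.any():
        positions[feasible], velocities[feasible] = swarm_update(
            positions[feasible], velocities[feasible])

        # Infeasible sub-swarm: ignore the objective and drift toward the
        # least-violating feasible agent to regain feasibility.
        target = positions[feasible][np.argmin(violation[feasible])]
        positions[~feasible] += 0.5 * (target - positions[~feasible])
    return positions, velocities
```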

8.
Manoj Kumar, Arun Sharma. Sadhana, 2017, 42(9): 1481-1493
Nowadays, the number of software vulnerability incidents and the losses due to them are growing exponentially. Existing security strategies and vulnerability detection and remediation approaches are not intelligent, automated, or self-managed, and are not competent to combat vulnerabilities and security threats or to provide a secure, self-managed software environment to organizations. Hence, there is a strong need for an intelligent and automated approach to optimize security and to prevent or mitigate vulnerabilities. Autonomic computing is a nature-inspired, self-management-based computational model. In this paper, an autonomic-computing-based integrated framework is proposed to detect, raise alarms for, assess, classify, prioritize, mitigate and manage software vulnerabilities automatically. The proposed framework uses a knowledge base and inference engine, which automatically takes remediating actions on future occurrences of software security vulnerabilities through self-configuration, self-healing, self-prevention and self-optimization, as needed. The framework is beneficial to industry and society in various aspects because it is an integrated, cross-concern and intelligent framework that provides a more secure, self-managed environment to organizations. It reduces security risks and threats, as well as monetary and reputational losses. It can be embedded easily in existing software or incorporated as a built-in integral component of new software during development.
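A generic monitor/analyze/plan/execute loop of the kind such an autonomic framework describes; every name and the remediation flow below are illustrative assumptions rather than the paper's actual components.

```python
import time

def alert(vuln):
    print("unresolved vulnerability, escalating:", vuln)

def autonomic_loop(scan, knowledge_base, remediate, interval=60):
    """scan() -> list of findings, each with a .severity attribute;
    knowledge_base.lookup(v) -> a known remediation action or None."""
    while True:
        findings = scan()                                    # detect
        for v in sorted(findings, key=lambda v: v.severity,
                        reverse=True):                       # prioritize
            action = knowledge_base.lookup(v)                # infer remedy
            if action is not None:
                remediate(v, action)                         # self-healing
            else:
                alert(v)                                     # escalate
                knowledge_base.record(v)                     # learn for next time
        time.sleep(interval)
```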

9.
Swarm algorithms such as particle swarm optimization (PSO) are non-gradient probabilistic optimization algorithms that have been successfully applied to global searches in complex problems such as multi-peak problems. However, applying these algorithms to structural and mechanical optimization problems remains difficult, since their local optimization capability is inferior to that of general numerical optimization methods. This article discusses new swarm metaphors that incorporate design sensitivities concerning objective and constraint functions and are applicable to structural and mechanical design optimization problems. Single- and multi-objective optimization techniques using swarm algorithms are combined with a gradient-based method. In the proposed techniques, swarm optimization algorithms and a sequential linear programming (SLP) method are conducted simultaneously. Finally, truss structure design optimization problems are solved by the proposed hybrid method to verify its optimization efficiency.
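A simplified hybrid of the kind described: a global swarm step alternated with gradient-based local refinement of the best agent. The paper couples the swarm with SLP; here a plain gradient step stands in for that subproblem, as an illustrative assumption.

```python
import numpy as np

def hybrid_optimize(f, grad_f, swarm_step, x0_pop, iters=100, lr=1e-2):
    """f, grad_f: objective and its gradient; swarm_step(pop, f) -> moved
    population; x0_pop: initial (n, d) population."""
    pop = np.array(x0_pop, dtype=float)
    for _ in range(iters):
        pop = swarm_step(pop, f)                # global, gradient-free move
        i = np.argmin([f(x) for x in pop])      # current best agent
        pop[i] = pop[i] - lr * grad_f(pop[i])   # local gradient refinement
    return min(pop, key=f)
```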

10.
Computerized ionospheric tomography (CIT) is one of the most recent developments in the area of remote sensing of the ionosphere. This system is a special case of limited-angle tomography in which not only are projection angles limited, but the number of samples per projection varies. This article presents an orthogonal decomposition framework for unifying CIT algorithms, including both generalized classical algorithms, such as the algebraic reconstruction technique, the direct Fourier method, and filtered backprojection, and algorithms using basis functions from a priori information. The article discusses the orthogonality of the basis functions associated with the classical techniques and presents simulations comparing the use of a priori information in filtered backprojection and orthogonal decomposition. © 1994 John Wiley & Sons, Inc.
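As a concrete reference point for one of the classical algorithms named above, here is a minimal sketch of the algebraic reconstruction technique (ART): Kaczmarz-style sweeps that project the current estimate onto each ray equation in turn. The discretization and relaxation value are illustrative.

```python
import numpy as np

def art(A, b, n_sweeps=20, relax=0.5):
    """A: (m, n) projection matrix (ray i vs. pixel j); b: (m,) measured
    projections. Returns the reconstructed image estimate x of length n."""
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(n_sweeps):
        for i in range(m):
            ai = A[i]
            denom = ai @ ai
            if denom > 0.0:
                # Move x toward the hyperplane ai . x = b[i].
                x += relax * (b[i] - ai @ x) / denom * ai
    return x
```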

11.
Nutrient monitoring is very important for the food–energy–water nexus. A sensor network for nutrient monitoring requires dynamic sensing, where the positions of the sensors change with time. In this work, we propose a methodology to optimize a dynamic sensor network that can address the spatiotemporal aspect of nutrient movement in a watershed. This is the first paper in a series, and it proposes an algorithmic and methodological framework for the spatiotemporal sensor placement problem. Dynamic sensing is widely used in wireless sensors, and current approaches to this problem are data intensive; here we instead introduce an efficient stochastic optimization approach to dynamic sensing. The framework is based on a novel stochastic optimization algorithm called Better Optimization of Nonlinear Uncertain Systems (BONUS). A small case study of the dynamic sensor placement problem is presented to illustrate the approach. In the second paper of this series, we will present a detailed case study of nutrient monitoring in a watershed.
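A minimal sketch of the sample-reweighting idea behind BONUS-style stochastic optimization (illustrative assumptions throughout; see the paper for the actual algorithm): the expensive model is sampled once under a base input distribution, and objective estimates for new decision variables are obtained by reweighting those stored samples with density ratios instead of re-running the model.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
base_mu, base_sigma = 0.0, 2.0
xs = rng.normal(base_mu, base_sigma, 2000)   # one-time base sample
ys = np.sin(xs) + 0.1 * xs**2                # stand-in "expensive" model, run once

def estimate_objective(mu, sigma):
    """Estimated E[y] if the input were N(mu, sigma), without re-running
    the model: importance weights from the ratio of densities."""
    w = norm.pdf(xs, mu, sigma) / norm.pdf(xs, base_mu, base_sigma)
    return np.sum(w * ys) / np.sum(w)

# The decision variables (mu, sigma) can now be optimized cheaply over
# estimate_objective, which reuses the single stored sample set.
```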

12.
Comparing, or benchmarking, optimization algorithms is a complicated task that involves many subtle considerations to yield a fair and unbiased evaluation. In this paper, we systematically review the benchmarking process for optimization algorithms and discuss the challenges of fair comparison. We provide suggestions for each step of the comparison process and highlight the pitfalls to avoid when evaluating the performance of optimization algorithms. We also discuss various methods of reporting the benchmarking results. Finally, some suggestions for future research are presented to improve the current benchmarking process.

13.
This paper proposes a generalized bi-level decentralized framework to model collaborative design problems over autonomous stakeholders, each having different objectives. At the system level, a system solution derived from the Pareto concept is created. A facilitator agent is introduced to search for Pareto-optimal solutions based on a memetic algorithm (MA). At the design-discipline level, design agents representing design teams are introduced to optimize their own objectives. The proposed framework guides the collaborative designers to converge to Pareto-optimal solutions given any form of design utility function. The only information exchanged between the two levels is numerical values rather than utility functions, so sensitive (private) design information can be protected. Three comparison experiments are conducted to evaluate the solution quality and explore the applicability of the proposed framework to collaborative design problems.
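A minimal sketch of the numeric exchange described above: the facilitator sees only objective-value vectors returned by the design agents and keeps the non-dominated (Pareto) set, never the agents' utility functions. Names and the minimization convention are illustrative.

```python
def dominates(u, v):
    """True if objective vector u dominates v (minimization convention)."""
    return all(a <= b for a, b in zip(u, v)) and any(a < b for a, b in zip(u, v))

def pareto_front(points):
    """points: list of objective-value tuples reported by design agents."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Example: two objectives reported for four candidate designs.
print(pareto_front([(1, 5), (2, 3), (3, 4), (4, 1)]))
# -> [(1, 5), (2, 3), (4, 1)]   ((3, 4) is dominated by (2, 3))
```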

14.
The application of genetic algorithms (GAs) to the optimization of piecewise linear discriminants is described. Piecewise linear discriminant analysis (PLDA) is a supervised pattern recognition technique employed in this work for the automated classification of Fourier transform infrared (FTIR) remote sensing data. PLDA employs multiple linear discriminants to approximate a nonlinear separating surface between data categories defined in a vector space. The key to the successful implementation of PLDA is the positioning of the individual discriminants that comprise the piecewise linear discriminant. For the remote sensing application, the discriminant optimization is challenging due to the large number of input variables required and the corresponding tendency for local optima to occur on the response surface of the optimization. In this work, three implementations of GAs are configured and evaluated: a binary-coded GA (GAB), a real-coded GA (GAR), and a Simplex-GA hybrid (SGA). GA configurations are developed through experimental design studies, and piecewise linear discriminants for acetone, methanol, and sulfur hexafluoride are optimized (trained). The training and prediction classification results indicate that GAs are a viable approach for discriminant optimization. On average, the best piecewise linear discriminant optimized by a GA is observed to classify 11% more analyte-active patterns correctly in prediction than an unoptimized piecewise linear discriminant. Discriminant optimization problems not used in the experimental design study are employed to test the stability of the GA configurations. For these cases, the best piecewise linear discriminant optimized by SGA is shown to classify 19% more analyte-active patterns correctly in prediction than an unoptimized discriminant. These results also demonstrate that the two real-number-coded GAs (GAR and SGA) perform better than the GAB. Real-number-coded GAs are also observed to execute faster and are simpler to implement.
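A minimal sketch of how a GA chromosome can encode the discriminant positions being optimized (the "pattern is active iff it lies on the positive side of every discriminant" convention is an illustrative assumption, not necessarily the paper's exact formulation):

```python
import numpy as np

def classify(x, W, b):
    """W: (k, d) weights of k linear discriminants; b: (k,) offsets.
    Call the pattern analyte-active iff every discriminant score is positive."""
    scores = W @ x + b
    return scores.min() > 0.0

def fitness(chromosome, X, y, k, d):
    """Decode a flat GA chromosome of length k*d + k into (W, b) and
    score classification accuracy on labelled patterns (X, y)."""
    W = chromosome[:k * d].reshape(k, d)
    b = chromosome[k * d:k * d + k]
    preds = np.array([classify(x, W, b) for x in X])
    return (preds == y).mean()   # the GA maximizes this accuracy
```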

15.
Imaging using synthetic aperture techniques is a mature technique with a host of different reconstruction algorithms available. Often the same basic algorithm has a different name depending on where the particular algorithm is used, since it may have originated in the medical, nondestructive testing, geological, or remote sensing fields. All this adds to confusion for the nonspecialist. This article gives a short historical précis of active synthetic aperture imaging as it applies to airborne, spaceborne, and underwater remote sensing systems using either radar or sonar, then defines some generic imaging geometry and places all the usable synthetic aperture reconstruction algorithms in a unified framework. This is done by the introduction of mapping operators, which simplify the mapping or reformatting of data from one sampling grid to another. Using these operators, readers can see how strip-map synthetic aperture systems (both radar- and sonar-based) differ from spotlight synthetic aperture systems, how the various algorithms fit together, and how the chirp-scaling algorithm is likely to be the reconstruction algorithm of choice for most future strip-map systems, and just why that should be so. Multilook processing and methods to deal with undersampled apertures using postdetection digital spotlighting are put into the same unified framework, as both of these techniques are frequent adjuncts to synthetic aperture imaging. © 1997 John Wiley & Sons, Inc. Int J Imaging Syst Technol, 8, 343–358, 1997

16.
We introduce MISO, the mixed-integer surrogate optimization framework. MISO aims at solving computationally expensive black-box optimization problems with mixed-integer variables. This type of optimization problem is encountered in many applications for which time-consuming simulation codes must be run in order to obtain an objective function value; examples include optimal reliability design and structural optimization. A single objective function evaluation may take from several minutes to hours or even days, so only very few objective function evaluations are allowable during the optimization. The development of algorithms for this type of optimization problem has, however, rarely been addressed in the literature. Because the objective function is black-box, derivatives are not available, and numerically approximating the derivatives would require a prohibitively large number of function evaluations. Therefore, we use computationally cheap surrogate models to approximate the expensive objective function and to decide at which points in the variable domain the expensive objective function should be evaluated. We develop a general surrogate model framework and show how the sampling strategies of well-known surrogate model algorithms for continuous optimization can be modified for mixed-integer variables. We introduce two new algorithms that combine different sampling strategies and local search to obtain high-accuracy solutions. We compare MISO in numerical experiments to a genetic algorithm, NOMAD version 3.6.2, and SO-MI. The results show that MISO is in general more efficient than NOMAD and the genetic algorithm with respect to finding improved solutions within a limited budget of allowable evaluations. The performance of MISO depends on the chosen sampling strategy; the MISO algorithm that combines a coordinate perturbation search with a target value strategy and a local search performs best among all algorithms.
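A minimal sketch of the surrogate loop such frameworks build on (an illustrative simplification; MISO's actual sampling strategies are richer): fit a cheap surrogate to the evaluated points, generate candidate points with integer variables rounded, and spend the expensive evaluation only on the most promising candidate.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def surrogate_optimize(expensive_f, lb, ub, int_mask, n_init=10, budget=50):
    """lb, ub: variable bounds (length d); int_mask: boolean array marking
    integer variables; expensive_f(x) -> objective value (minimized)."""
    rng = np.random.default_rng(1)
    d = len(lb)
    X = rng.uniform(lb, ub, size=(n_init, d))
    X[:, int_mask] = np.round(X[:, int_mask])        # enforce integrality
    y = np.array([expensive_f(x) for x in X])
    while len(y) < budget:
        model = RBFInterpolator(X, y, smoothing=1e-9)  # cheap surrogate fit
        cand = rng.uniform(lb, ub, size=(500, d))      # candidate sampling
        cand[:, int_mask] = np.round(cand[:, int_mask])
        x_new = cand[np.argmin(model(cand))]           # best predicted point
        X = np.vstack([X, x_new])
        y = np.append(y, expensive_f(x_new))           # one expensive call
    return X[np.argmin(y)], y.min()
```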

17.
This article introduces Hessian approximation algorithms to estimate the search direction of quasi-Newton methods for solving optimization problems with continuous parameters. The proposed algorithms are quite different from other well-known quasi-Newton methods, such as symmetric rank-one, Davidon–Fletcher–Powell, and Broyden–Fletcher–Goldfarb–Shanno, in that the Hessian matrix is not calculated from the gradient information but rather directly from the function values. The proposed algorithms are designed for a class of hybrid algorithms that combine evolutionary search with gradient-based methods of quasi-Newton type. The function values calculated for the evolutionary search are used for estimation of the Hessian matrix (or its inverse) as well as the gradient vector. Since the estimation process for the Hessian matrix is independent of that for the gradient vector, more reliable Hessian estimation with a small population is possible compared with previous methods based upon the classical quasi-Newton updates. Numerical experiments show that the proposed algorithms are very competitive with state-of-the-art evolutionary algorithms for continuous optimization problems.
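One straightforward way to estimate a Hessian directly from function values, sketched below as an assumed illustration (not the paper's algorithm): fit the quadratic model f(x) ≈ c + gᵀx + ½xᵀHx to the population's evaluated points by least squares, then read g and H off the coefficients.

```python
import numpy as np

def estimate_gradient_hessian(X, y):
    """X: (m, d) evaluated points, y: (m,) function values.
    Needs m >= 1 + d + d*(d+1)/2 points for a determined fit."""
    m, d = X.shape
    pairs = [(i, j) for i in range(d) for j in range(i, d)]
    cols = [np.ones(m)]
    cols += [X[:, i] for i in range(d)]
    cols += [X[:, i] * X[:, j] for i, j in pairs]
    coef, *_ = np.linalg.lstsq(np.column_stack(cols), y, rcond=None)
    g = coef[1:1 + d]
    H = np.zeros((d, d))
    for (i, j), q in zip(pairs, coef[1 + d:]):
        if i == j:
            H[i, i] = 2.0 * q        # coefficient of x_i^2 is H_ii / 2
        else:
            H[i, j] = H[j, i] = q    # coefficient of x_i x_j is H_ij
    return g, H
```

Because the quadratic is fitted once to the whole population, the gradient and Hessian estimates come from the same inexpensive solve, independent of any finite-difference gradient scheme.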

18.
Reusability of test software is an important problem that the testing community urgently needs to solve. Using object-oriented design methods and layered-architecture theory, and taking a servo actuator test system as the model, a three-layer structure for general-purpose test software is proposed. In the implementation, following object-oriented design principles, UML is used for modeling, and design patterns together with the C# reflection mechanism are used to improve software reusability, achieving the goal of testing a whole series of products in the servo test domain with a single software suite.
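The reuse mechanism above relies on C# reflection to bind product-specific test drivers at runtime; the sketch below shows the analogous idea in Python (dynamic loading by name), with module and class names as illustrative assumptions.

```python
import importlib

def load_test_driver(module_name, class_name, **config):
    """Locate a product-specific driver class at runtime and instantiate
    it, so one test framework can serve a whole product series without
    recompilation."""
    module = importlib.import_module(module_name)   # e.g. "drivers.servo_a1"
    driver_cls = getattr(module, class_name)        # e.g. "ServoA1Driver"
    return driver_cls(**config)

# Usage: the product under test is chosen from configuration, not code.
# driver = load_test_driver("drivers.servo_a1", "ServoA1Driver", port="COM3")
```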

19.
Excessive changes in MRP system schedules, commonly referred to as 'nervousness', are frequently an obstacle to implementing an effective MRP-based manufacturing planning and control system. This paper is concerned with the design of methods for freezing the master production schedule (MPS) as a way of controlling MPS stability under rolling planning conditions for make-to-stock products. It presents a framework for the design of MPS freezing methods and compares their performance as the design parameters of these methods are varied. Simulation experiments are reported that demonstrate important differences in performance under criteria involving both the cost of lot-sizing the MPS and the stability of the master production schedule.
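A minimal sketch of MPS freezing under rolling planning (parameter values and the replanning interface are illustrative assumptions): each cycle the horizon rolls one period forward, quantities inside the frozen window are carried over unchanged, and only the free portion is re-optimized.

```python
def roll_schedule(previous_mps, replan, horizon=12, freeze=4):
    """previous_mps: planned quantities per period from the last cycle;
    replan(start) -> new quantities for periods [start, horizon)."""
    # Period 0 has been executed; the next `freeze` periods stay frozen,
    # which is what damps schedule nervousness.
    frozen = previous_mps[1:1 + freeze]
    free = replan(start=1 + freeze)          # re-optimize only the free part
    return frozen + free[:horizon - freeze]
```

A longer frozen window yields more stability but less responsiveness to demand changes, which is exactly the trade-off the paper's simulation experiments quantify.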

20.
This paper discusses the use of genetic algorithms (GAs) within the area of reliability, availability, maintainability and safety (RAMS) optimization. First, the multi-objective optimization problem is formulated in general terms and two alternative approaches to its solution are illustrated. Then, the theory behind the operation of GAs is presented. The steps of the algorithm are sketched in some detail, for both the traditional breeding procedure and more sophisticated breeding procedures. The necessity of affinely transforming the fitness function, the object of the optimization, is discussed in detail, together with the transformation itself. In addition, handling constraints by the penalization approach is illustrated. Finally, specific metrics for measuring the performance of a genetic algorithm are introduced.
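A minimal sketch of the two GA devices named above, with illustrative conventions (not the paper's exact transformation): an affine fitness transformation that keeps all fitnesses non-negative for selection, and a penalized fitness for constraint handling.

```python
def affine_fitness(raw_values, a=1.0):
    """Affine transformation F = a*f + b with b = -a*min(f), so the worst
    individual gets fitness 0 and all values are non-negative, as needed
    for roulette-wheel selection."""
    b = -a * min(raw_values)
    return [a * f + b for f in raw_values]

def penalized(objective, violations, weight=1e3):
    """Constraint handling by penalization (maximization convention):
    infeasible solutions lose fitness in proportion to total violation."""
    return objective - weight * sum(max(0.0, g) for g in violations)
```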

