Similar Literature
20 similar documents found.
1.
Metamodel: A key to intelligent CAD systems
We introduce the metamodel as a new modeling framework for design objects based on General Design Theory, a mathematical model of design. Using General Design Theory, the metamodel concept can serve three functions: (1) as a central modeling mechanism to integrate models, (2) as a mechanism for modeling physical phenomena, and (3) as a tool for describing evolving design objects. Modeling with multiple points of view is realized by representing physical phenomena that occur in the design object and by constructing models with knowledge of physics and design from the metamodel. We illustrate the first and second functions of metamodels with an example based on naive physics, and we illustrate the third function of the metamodel through design experiments. Finally, we present two systems to illustrate how the metamodel mechanism can be implemented.

2.
Chong Chen, Huili Yu, Hui Zhao. Engineering Optimization, 2013, 45(10): 1761-1776
In engineering design optimization, the use of hybrid metamodels (HMs) can take full advantage of the individual metamodels and improve the robustness of predictions by reducing the impact of a poor metamodel. When there are many candidates, it is difficult to decide which metamodels to choose before building an HM. The decision should simultaneously take into account the number, accuracy and diversity of the selected metamodels. To address this problem, this research developed an efficient decision-making framework based on partial least squares for metamodel screening. A new significance index is first derived from the viewpoint of fitting error in a regression model. A desirable metamodel combination, consisting of only the significant candidates, is then configured for constructing the final HM. The effectiveness of the proposed framework is demonstrated through several benchmark problems.
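The hybrid-metamodel idea above can be illustrated with a minimal sketch: build an HM as a weighted average of candidate fits, with weights derived from leave-one-out cross-validation error so that poor candidates contribute less. The polynomial candidates and the inverse-error weighting are illustrative assumptions, not the paper's PLS-based significance index:

```python
import numpy as np

def true_fn(x):
    """Stand-in for an expensive simulation response."""
    return np.sin(3.0 * x) + 0.5 * x

x = np.linspace(0.0, 2.0, 12)      # training runs
y = true_fn(x)
degrees = (1, 3, 5)                # three candidate polynomial metamodels

def fit_poly(deg):
    c = np.polyfit(x, y, deg)
    return lambda t: np.polyval(c, t)

def loo_error(deg):
    """Leave-one-out cross-validation error of one candidate."""
    errs = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        c = np.polyfit(x[mask], y[mask], deg)
        errs.append((np.polyval(c, x[i]) - y[i]) ** 2)
    return np.mean(errs)

candidates = [fit_poly(d) for d in degrees]
errors = np.array([loo_error(d) for d in degrees])
weights = (1.0 / errors) / np.sum(1.0 / errors)   # down-weight poor candidates

def hybrid(t):
    return sum(w * m(t) for w, m in zip(weights, candidates))
```

On this oscillatory response, the linear candidate earns a much smaller weight than the degree-5 candidate, which is the screening effect the abstract describes.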

3.
Artificial neural networks are often proposed as an alternative approach for formalizing various quantitative and qualitative aspects of complex systems. This paper examines the robustness of using neural networks as a simulation metamodel to estimate manufacturing system performance. Simulation models of a job shop system are developed for various configurations to train neural network metamodels. Extensive computational tests are carried out with the proposed models at various factor levels (study horizon, system load, initial system status, stochasticity, system size and error assessment methods) to assess metamodel accuracy. The results indicate that simulation metamodels based on neural networks can be effectively used to estimate system performance.

4.
Metamodels are models of simulation models. Metamodels are able to estimate the simulation responses corresponding to a given combination of input variables. A simulation metamodel is easier to manage and provides more insights than simulation alone. Traditionally, multiple regression analysis is utilized to develop the metamodel from a set of simulation experiments. Simulation can consequently benefit from metamodelling in post-simulation analysis. A backpropagation (BP) neural network is a proven tool for providing excellent response predictions in many application areas, and it outperforms regression analysis for a wide array of applications. In this paper, a BP neural network is used to generate metamodels for simulated manufacturing systems. For the purpose of optimal manufacturing systems design, mathematical models can be formulated by using the mapping functions generated from the neural network metamodels. The optimization model is then solved by a stochastic local search approach, simulated annealing (SA), to obtain an optimal configuration with respect to the objective of the systems design. Instead of triggering the detailed simulation programs, the SA-based optimization procedure evaluates the simulation outputs using the neural network metamodels. By using the SA-based optimization algorithm, the solution space of the studied problem is extensively exploited to escape the entrapment of local optima while the number of time-consuming simulation runs is reduced. The proposed methodology is shown to be both effective and efficient in solving a manufacturing systems design problem through an example.
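The optimization stage described above can be sketched as follows: simulated annealing minimizes a cheap stand-in metamodel instead of calling the simulation. The quadratic surrogate, the linear cooling schedule and the step size are all assumptions for illustration, not the paper's manufacturing metamodel:

```python
import numpy as np

rng = np.random.default_rng(1)

def metamodel(x):
    """Stand-in for a neural-network metamodel of a simulated response."""
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2 + 3.0

def simulated_annealing(f, x0, n_iter=3000, t0=1.0, step=0.3):
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    best_x, best_f = x.copy(), fx
    for k in range(n_iter):
        temp = t0 * (1.0 - k / n_iter) + 1e-9   # linear cooling schedule
        cand = x + rng.normal(scale=step, size=x.size)
        fc = f(cand)
        # Always accept improvements; accept uphill moves with Boltzmann probability
        if fc < fx or rng.random() < np.exp(-(fc - fx) / temp):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x.copy(), fx
    return best_x, best_f

best_x, best_f = simulated_annealing(metamodel, [4.0, 4.0])
```

Because every evaluation here is a cheap function call rather than a simulation run, thousands of SA iterations cost almost nothing, which is the point of the metamodel-in-the-loop design.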

5.
Whereas an optimal Pseudo-Random Number (PRN) assignment strategy for simulation experiments involving the estimation of linear metamodels currently exists, no such optimal assignment strategy for quadratic metamodels has been proposed. This situation is now rectified by the introduction of a PRN assignment strategy for a quadratic metamodel for 3^k factorial designs. In addition to extending the theory from linear to quadratic metamodels, the proposed PRN strategy is shown to be superior to a number of existing and competing strategies in terms of various variance measures.
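The variance-reduction principle behind PRN assignment can be sketched with common random numbers at two design points: sharing one PRN stream makes the responses positively correlated, so their difference has much smaller variance. The toy response function is an assumption for illustration; this is the underlying idea, not the article's 3^k assignment strategy itself:

```python
import numpy as np

rng = np.random.default_rng(9)

def sim_response(level, u):
    """Toy stochastic simulation: u is the underlying PRN stream."""
    return 2.0 * level + (1.0 + 0.5 * level) * u

n = 10_000
u1 = rng.standard_normal(n)
u2 = rng.standard_normal(n)

# Common random numbers: the two design points share one PRN stream
diff_crn = sim_response(1.0, u1) - sim_response(0.0, u1)
# Independent assignment: each design point gets its own stream
diff_ind = sim_response(1.0, u1) - sim_response(0.0, u2)

var_crn = diff_crn.var(ddof=1)
var_ind = diff_ind.var(ddof=1)
```

Both estimators target the same mean effect (2.0 here), but the common-stream difference has far smaller variance, which is what a PRN assignment strategy exploits when estimating metamodel coefficients.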

6.
Junqi Yang, Kai Zheng, Jie Hu, Ling Zheng. Engineering Optimization, 2016, 48(12): 2026-2045
Metamodels are becoming increasingly popular for handling large-scale optimization problems in product development. Metamodel-based reliability-based design optimization (RBDO) helps to improve the computational efficiency and reliability of optimal design. However, a metamodel in engineering applications is an approximation of a high-fidelity computer-aided engineering model and it frequently suffers from a significant loss of predictive accuracy. This issue must be appropriately addressed before the metamodels are ready to be applied in RBDO. In this article, an enhanced strategy with metamodel selection and bias correction is proposed to improve the predictive capability of metamodels. A similarity-based assessment for metamodel selection (SAMS) is derived from the cross-validation and similarity theories. The selected metamodel is then improved by Bayesian inference-based bias correction. The proposed strategy is illustrated through an analytical example and further demonstrated with a lightweight vehicle design problem. The results show its potential in handling real-world engineering problems.

7.
This article proposes a new method for hybrid reliability-based design optimization under random and interval uncertainties (HRBDO-RI). In this method, Monte Carlo simulation (MCS) is employed to estimate the upper bound of the failure probability, and stochastic sensitivity analysis (SSA) is extended to calculate the sensitivity information of the failure probability in HRBDO-RI. Due to the large number of samples involved in MCS and SSA, Kriging metamodels are constructed to substitute for the true constraints. To avoid unnecessary computational cost in Kriging metamodel construction, a new screening criterion based on the coefficient of variation of the failure probability is developed to judge active constraints in HRBDO-RI. A projection-outline-based active learning Kriging approach is then achieved by sequentially selecting update points around the projection outlines on the limit-state surfaces of active constraints. Furthermore, the prediction uncertainty of the Kriging metamodel is quantified and considered in the termination of the Kriging update. Several examples, including a piezoelectric energy harvester design, are presented to test the accuracy and efficiency of the proposed method for HRBDO-RI.
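The MCS failure-probability estimate and its coefficient of variation (the quantity the screening criterion above is built on) can be sketched directly. The linear limit state and standard-normal inputs below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

def g(x):
    """Limit-state function: g(x) <= 0 denotes failure (linear, for illustration)."""
    return 3.0 - x[:, 0] - x[:, 1]

n = 200_000
x = rng.standard_normal((n, 2))        # x ~ N(0, I): the random inputs
fail = g(x) <= 0.0
pf = fail.mean()                       # MCS estimate of the failure probability
# Coefficient of variation of the estimator; a large value flags an
# unreliable estimate (and, in the article, helps judge active constraints)
cov_pf = np.sqrt((1.0 - pf) / (n * pf))
```

For this limit state the exact value is 1 - Phi(3/sqrt(2)), roughly 0.017, which the sample estimate reproduces; note that a small pf forces n to be large to keep cov_pf down, which is exactly why the article replaces the true constraints with Kriging metamodels.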

8.
In the field of engineering design and optimization, metamodels are widely used to replace expensive simulation models in order to reduce computing costs. To improve the accuracy of metamodels effectively and efficiently, sequential sampling designs have been developed. In this article, a sequential sampling design using the Monte Carlo method and space reduction strategy (MCSR) is implemented and discussed in detail. The space reduction strategy not only maintains good sampling properties but also improves the efficiency of the sampling process. Furthermore, a local boundary search (LBS) algorithm is proposed to efficiently improve the performance of MCSR, which is called LBS-MCSR. Comparative results with several sequential sampling approaches from low to high dimensions indicate that the space reduction strategy generates samples with better sampling properties (and thus better metamodel accuracy) in less computing time.

9.
To analyze a simulation (response surface) metamodel that involves a variance-stabilizing transformation of the original simulation-generated response, we present two techniques. In the first technique we compute an approximate percentile-type confidence interval for the mean of the original response at a selected factor-level combination (design point) as follows: we compute the usual confidence interval for the mean of the transformed response at that design point, and then we untransform the corresponding endpoints to obtain the desired confidence interval for the untransformed metamodel. In the second technique we compute the Maximum Likelihood Estimator (MLE) for the mean of the untransformed response based on standard distributional properties of the transformed metamodel; then, using the delta method to approximate the MLE's variance, we construct for the untransformed metamodel an asymptotically exact confidence interval centered on the MLE. We illustrate these techniques in a case study on manufacturing cell design, comparing them with a more conventional approach for analyzing transformation-based simulation metamodels. A Monte Carlo performance evaluation shows that significantly better confidence-interval coverage is maintained with the second proposed technique (called the "MLE-delta method") over a wide range of values for the residual variance of the transformed metamodel.
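The second technique can be sketched for the common log transformation: if ln Y ~ N(mu, sigma^2), the MLE of E[Y] is exp(m + s2/2), and the delta method propagates the sampling variances of m and s2 through that map. The lognormal toy data and the particular variance approximations below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated responses whose logarithm is normal
y = rng.lognormal(mean=1.0, sigma=0.4, size=5000)

z = np.log(y)                 # variance-stabilizing transformation
n = len(z)
m, s2 = z.mean(), z.var(ddof=1)

# MLE of E[Y] under the lognormal model: exp(m + s2/2)
mle = np.exp(m + s2 / 2.0)

# Delta method: Var(m) = s2/n, Var(s2) ~ 2*s2**2/(n-1); the partial
# derivatives of exp(m + s2/2) are mle (w.r.t. m) and mle/2 (w.r.t. s2)
var_mle = mle ** 2 * (s2 / n) + (mle / 2.0) ** 2 * (2.0 * s2 ** 2 / (n - 1))
half = 1.96 * np.sqrt(var_mle)
ci = (mle - half, mle + half)
```

Centering the interval on the MLE of the untransformed mean, rather than naively back-transforming the endpoints, is what gives the second technique its better coverage in the abstract's Monte Carlo evaluation.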

10.
In the course of designing structural assemblies, performing a full optimization is very expensive in terms of computation time. In order to reduce this cost, we propose a multilevel model optimization approach. This paper lays the foundations of this strategy by presenting a method for constructing an approximation of an objective function. This approach consists in coupling a multiparametric mechanical strategy based on the LATIN method with a gradient-based metamodel called a cokriging metamodel. The main difficulty is to build an accurate approximation while keeping the computation cost low. Following an introduction to multiparametric and cokriging strategies, the performance of kriging and cokriging models is studied using one- and two-dimensional analytical functions; then, the performance of metamodels built from mechanical responses provided by the multiparametric strategy is analyzed based on two mechanical test examples.

11.
Metamodel-based global optimization methods have been extensively studied for their great potential in solving expensive problems. In this work, a design space management strategy is proposed to improve the accuracy and efficiency of metamodel-based optimization methods. In this strategy, the whole design space is divided into two parts: the important region constructed using several expensive points and the other region. Combined with a previously developed hybrid metamodel strategy, a hybrid metamodel-based design space management method (HMDSM) is developed. In this method, three representative metamodels are used simultaneously in the search of the global optimum in both the important region and the other region. In the search process, the important region is iteratively reduced and the global optimum is soon captured. Tests using a series of benchmark mathematical functions and a practical expensive problem demonstrate the excellent performance of the proposed method.

12.
Optimization under uncertainty requires proper handling of those input parameters that contain scatter. Scatter in input parameters propagates through the process and causes scatter in the output. Stochastic methods (e.g. Monte Carlo) are very popular for assessing uncertainty propagation using black-box function metamodels. However, they are expensive. Therefore, in this article a direct method of calculating uncertainty propagation has been employed based on the analytical integration of a metamodel of a process. Analytical handling of noise variables not only improves the accuracy of the results but also provides the gradients of the output with respect to input variables. This is advantageous in the case of gradient-based optimization. Additionally, it is shown that the analytical approach can be applied during sequential improvement of the metamodel to obtain a more accurate representative model of the black-box function and to enhance the search for the robust optimum.
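For a polynomial metamodel with a Gaussian noise variable, the analytical integration described above is closed-form: if y = a + b*x + c*x^2 and x ~ N(mu, sigma^2), then E[y] = a + b*mu + c*(mu^2 + sigma^2), and the gradient of E[y] with respect to mu follows directly. A minimal sketch with arbitrarily chosen coefficients, cross-checked against Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(4)

# Quadratic metamodel y = a + b*x + c*x**2 of some process (coefficients assumed)
a, b, c = 2.0, -1.0, 0.5
mu, sigma = 1.5, 0.3               # noise variable x ~ N(mu, sigma**2)

# Analytical integration: E[y] = a + b*mu + c*(mu**2 + sigma**2)
mean_analytic = a + b * mu + c * (mu ** 2 + sigma ** 2)
# Exact gradient of E[y] w.r.t. mu -- free of sampling noise, which is the
# advantage cited for gradient-based robust optimization
dmean_dmu = b + 2.0 * c * mu

# Monte Carlo cross-check of the analytical mean
x = rng.normal(mu, sigma, size=100_000)
mean_mc = np.mean(a + b * x + c * x ** 2)
```

The Monte Carlo estimate only agrees to a few decimal places after 100,000 samples, whereas the analytical result is exact and instantaneous, which is the trade-off the abstract emphasizes.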

13.
In the assembly line systems of the electronics industry, production controls based on steady-state conditions have proved ineffectual in coping with dynamic events, such as machine breakdowns, part supply shortages, and high priority job order processing, which can occur individually or simultaneously.

There are few tools that give results quickly as to how many assemblies will be delayed by a dynamic event and how long it will take to recover from the impact. Computer simulation is available, but its lengthy execution time has hindered its application in real time.

Previous research discovered that the patterns of these dynamic events can be represented by metamodels in the solution form of first order systems. In this paper, these results are unified and it is further shown that compound dynamic event metamodels can be developed from the individual metamodels using linear additivity. These metamodels are fit to the output from the simulation; and the resulting equations can be used in real time to measure the number of assemblies that are delayed due to the dynamic events.

To demonstrate the potential application of compound dynamic event metamodels for decision making in real-time production control, a Decision Support System (DSS) is described that contains the dynamic metamodels in its model base. With this program, the impact of the dynamic events on production can be obtained virtually instantaneously. A case study is presented to support this conclusion.

14.
This article presents methods to enhance the efficiency of Evolutionary Algorithms (EAs), particularly those assisted by surrogate evaluation models or metamodels. The gain in efficiency becomes important in applications related to industrial optimization problems with a great number of design variables. The development is based on the principal components analysis of the elite members of the evolving EA population, the outcome of which is used to guide the application of evolution operators and/or train dependable metamodels/artificial neural networks by reducing the number of sensory units. Regarding the latter, the metamodels are trained with less computing cost and yield more relevant objective function predictions. The proposed methods are applied to constrained, single- and two-objective optimization of thermal and hydraulic turbomachines.

15.

Higher modeling efficiency is an important goal in building a Kriging (KG) metamodel, and the sampling approach directly affects modeling efficiency. Considering the effect of the employed correlation model on the prediction accuracy of a KG model, a multiple-KG-model-based parallel adaptive sampling strategy (MKPAS) is proposed using the combination forecasting method, in which the new points added during the sampling process are determined using multiple KG models with different correlation models. The effectiveness of the proposed approach is verified on two low-dimensional benchmark functions as well as a high-dimensional one, and an engineering application further demonstrates the effectiveness of the proposed MKPAS approach. The results show that the proposed approach can significantly improve the modeling efficiency of a KG model compared with ordinary sampling approaches.

16.
F. Xiong, Y. Xiong, S. Yang. Engineering Optimization, 2013, 45(8): 793-810
Space-filling and projective properties are desired features in the design of computer experiments to create global metamodels to replace expensive computer simulations in engineering design. The goal in this article is to develop an efficient and effective sequential Quasi-LHD (Latin Hypercube design) sampling method to maintain and balance the two aforementioned properties. The sequential sampling is formulated as an optimization problem, with the objective being the Maximin Distance, a space-filling criterion, and the constraints based on a set of pre-specified minimum one-dimensional distances to achieve the approximate one-dimensional projective property. Through comparative studies on sampling property and metamodel accuracy, the new approach is shown to outperform other sequential sampling methods for global metamodelling and is comparable to the one-stage sampling method while providing more flexibility in a sequential metamodelling procedure.
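A crude version of the two properties above can be sketched by generating random Latin hypercubes (which guarantee the one-dimensional projective property by construction) and keeping the one with the best Maximin Distance. This random multi-start search is only a stand-in for the article's constrained sequential optimization:

```python
import numpy as np

rng = np.random.default_rng(5)

def latin_hypercube(n, d):
    """Random Latin hypercube in [0, 1)**d: every 1-D projection is stratified."""
    samples = np.empty((n, d))
    for j in range(d):
        perm = rng.permutation(n)
        samples[:, j] = (perm + rng.random(n)) / n
    return samples

def maximin(pts):
    """Smallest pairwise distance; larger means more space-filling."""
    diff = pts[:, None, :] - pts[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    np.fill_diagonal(dist, np.inf)
    return dist.min()

# Crude maximin LHD: keep the best of many random Latin hypercubes
designs = [latin_hypercube(20, 2) for _ in range(50)]
best = max(designs, key=maximin)
```

Each design already satisfies the projective property (one point per one-dimensional bin), so the search only has to trade off the space-filling criterion, mirroring the objective/constraint split in the article's formulation.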

17.
Jinglai Wu, Zhen Luo, Nong Zhang. Engineering Optimization, 2013, 45(9): 1264-1288
The accuracy of metamodelling is determined by both the sampling and the approximation. This article proposes a new sampling method based on the zeros of Chebyshev polynomials to capture the sampling information effectively. First, the zeros of one-dimensional Chebyshev polynomials are applied to construct Chebyshev tensor product (CTP) sampling, and the CTP is then used to construct high-order multi-dimensional metamodels using the 'hypercube' polynomials. Second, the CTP sampling is further enhanced to develop Chebyshev collocation method (CCM) sampling, to construct the 'simplex' polynomials. The samples of CCM are randomly and directly chosen from the CTP samples. Two widely studied sampling methods, namely the Smolyak sparse grid and Hammersley, are used to demonstrate the effectiveness of the proposed sampling method. Several numerical examples are utilized to validate the approximation accuracy of the proposed metamodel under different dimensions.
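The Chebyshev zeros and the CTP grid are straightforward to generate. The sketch below also fits a one-dimensional polynomial on Chebyshev nodes to Runge's function, a standard illustration of why these nodes are good sample sites; the function and polynomial degrees are illustrative choices, not the article's examples:

```python
import numpy as np

def chebyshev_zeros(n):
    """Zeros of the degree-n Chebyshev polynomial of the first kind on [-1, 1]."""
    k = np.arange(n)
    return np.cos((2 * k + 1) * np.pi / (2 * n))

def ctp_samples(n, d):
    """Chebyshev tensor-product grid: n**d points in [-1, 1]**d."""
    axes = [chebyshev_zeros(n)] * d
    grid = np.meshgrid(*axes, indexing="ij")
    return np.stack([g.ravel() for g in grid], axis=-1)

# Fit a 1-D polynomial metamodel on Chebyshev nodes; these nodes cluster
# near the interval ends, which suppresses Runge oscillation
x = chebyshev_zeros(9)
y = 1.0 / (1.0 + 25.0 * x ** 2)    # Runge's function as the "expensive" model
coef = np.polyfit(x, y, 8)
```

The tensor-product construction is exactly why CTP sampling grows exponentially with dimension (n**d points), motivating the collocation-based CCM subset described in the abstract.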

18.
The unknown input parameters of a simulation code are usually adjusted by the nonlinear least squares estimation (NLSE) method, which minimizes the sum of differences between computer responses and real observations. However, when a simulation program is very complex and takes several hours for one execution, the NLSE method may not be computationally feasible. In this case, one may build a statistical metamodel which approximates the complex simulation code. This metamodel is then used as if it were the true simulation code in the NLSE method, which makes the problem computationally feasible. This 'approximated' NLSE method is described in this article. A Gaussian process model is used as a metamodel of the complex simulation code. The proposed method is validated through a toy-model study where the true parameters are known a priori. An application to a nuclear fusion device is presented.
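The 'approximated' NLSE idea can be sketched with a one-parameter toy simulator and a plain RBF Gaussian-process interpolant: a few expensive runs train the metamodel, which then replaces the simulator inside the least-squares fit. The simulator, kernel length-scale and grid search below are all assumptions standing in for the complex code and the paper's estimation details:

```python
import numpy as np

def simulator(theta):
    """Stand-in for a slow simulation code with one unknown parameter."""
    return np.sin(theta) + 0.1 * theta ** 2

def rbf(a, b, ell=0.5):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

# A handful of expensive runs to train the Gaussian-process metamodel
theta_train = np.linspace(0.0, 2.0, 12)
y_train = simulator(theta_train)
K = rbf(theta_train, theta_train) + 1e-6 * np.eye(12)
alpha = np.linalg.solve(K, y_train)

def gp_predict(theta):
    return rbf(np.atleast_1d(np.asarray(theta, dtype=float)), theta_train) @ alpha

# "Observed" response generated at the true (unknown) parameter value
theta_true = 1.7
obs = simulator(theta_true)

# Approximated NLSE: minimize the squared misfit of the cheap metamodel
grid = np.linspace(0.0, 2.0, 401)
sse = (gp_predict(grid) - obs) ** 2
theta_hat = grid[np.argmin(sse)]
```

Only 12 simulator calls are spent on training; the 401 evaluations in the least-squares search all hit the metamodel, which is the computational saving the abstract describes.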

19.
Computer experiments are often used as inexpensive alternatives to real-world experiments. Statistical metamodels of the computer model's input-output behavior can be constructed to serve as approximations of the response surface of the real-world system. The suitability of a metamodel depends in part on its intended use. While decision makers may want to understand the entire response surface, they may be particularly keen on finding interesting regions of the design space, such as where the gradient is steep. We present an adaptive, value-enhanced batch sequential algorithm that samples more heavily in such areas while still providing an understanding of the entire surface. The design points within each batch can be run in parallel to leverage modern multi-core computing assets. We illustrate our approach for deterministic computer models, but it has potential for stochastic simulation models as well.
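A deterministic one-dimensional caricature of gradient-targeted sequential sampling: finite-difference slopes of the current data act as the "interesting region" score, and each batch (here a single point) goes to the midpoint of the steepest interval. The toy model and scoring rule are assumptions for illustration, not the article's algorithm:

```python
import numpy as np

def expensive_model(x):
    """Toy deterministic computer model with a steep gradient near x = 0.5."""
    return np.tanh(10.0 * (x - 0.5))

x = np.linspace(0.0, 1.0, 6)      # small initial design
y = expensive_model(x)

for _ in range(3):                # three sequential "batches" of one point each
    # Finite-difference slopes of the current data: a cheap interest score
    slopes = np.abs(np.diff(y) / np.diff(x))
    i = np.argmax(slopes)         # steepest interval wins the next sample
    new_x = 0.5 * (x[i] + x[i + 1])
    x = np.sort(np.append(x, new_x))
    y = expensive_model(x)
```

After three rounds the new points have all landed in the steep region around x = 0.5 while the flat flanks keep only their initial coverage, which is the sampling behavior the abstract aims for.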

20.
This paper presents an efficient metamodel-building technique for solving collaborative optimization (CO) based on high-fidelity models. The proposed method is based on a metamodeling concept that is designed to simultaneously utilize computationally efficient (low-fidelity) and expensive (high-fidelity) models in an optimization process. A distinctive feature of the method is the utilization of the interaction between low- and high-fidelity models in the construction of high-quality metamodels at both the discipline level and the system level of the CO. The low-fidelity model is tuned in such a way that it approaches the same level of accuracy as the high-fidelity model while remaining computationally inexpensive. In this process, the tuned low-fidelity models are used in the discipline-level optimization process. At the system level, to handle the computational cost of the equality constraints in CO, a model management strategy along with a metamodeling technique is used. To determine the fidelity of the metamodels, the predictive estimation of model fidelity method is applied. The developed method is demonstrated on a 2D airfoil design problem involving tightly coupled high-fidelity structural and aerodynamic models. The results obtained show that the proposed method significantly reduces computational cost and improves the convergence rate for solving the multidisciplinary optimization problem based on high-fidelity models.
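The low-fidelity tuning step can be sketched with an additive bridge correction fitted on a few high-fidelity runs; the two toy models and the linear correction below are assumptions (the paper's tuning scheme may differ), but they show how a cheap model can be pushed toward high-fidelity accuracy:

```python
import numpy as np

def high_fidelity(x):
    """Expensive model (only a few runs affordable)."""
    return np.sin(8.0 * x) + x

def low_fidelity(x):
    """Cheap model: captures the oscillation but misses the linear trend."""
    return np.sin(8.0 * x)

# Tune the low-fidelity model with an additive correction fitted on a
# handful of high-fidelity runs
x_hf = np.linspace(0.0, 1.0, 4)
resid = high_fidelity(x_hf) - low_fidelity(x_hf)
a, b = np.polyfit(x_hf, resid, 1)   # linear bridge: resid ~ a*x + b

def tuned_low(x):
    return low_fidelity(x) + a * x + b
```

Here four high-fidelity runs suffice to recover the missing trend, after which `tuned_low` can stand in for the expensive model throughout a discipline-level optimization, as in the abstract.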
