Similar Articles
Showing 20 similar articles.
1.
It is important to design robust and reliable systems by accounting for uncertainty and variability in the design process. However, performing optimization in this setting can be computationally expensive, requiring many evaluations of the numerical model to compute statistics of the system performance at every optimization iteration. This paper proposes a multifidelity approach to optimization under uncertainty that makes use of inexpensive, low-fidelity models to provide approximate information about the expensive, high-fidelity model. The multifidelity estimator is developed based on the control variate method to reduce the computational cost of achieving a specified mean square error in the statistic estimate. The method optimally allocates the computational load between the two models based on their relative evaluation cost and the strength of the correlation between them. This paper also develops an information reuse estimator that exploits the autocorrelation structure of the high-fidelity model in the design space to reduce the cost of repeatedly estimating statistics during the course of optimization. Finally, a combined estimator incorporates the features of both the multifidelity estimator and the information reuse estimator. The methods demonstrate 90% computational savings in an acoustic horn robust optimization example and practical design turnaround time in a robust wing optimization problem. Copyright © 2014 John Wiley & Sons, Ltd.
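The control variate idea behind the multifidelity estimator can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: `f_hi` and `f_lo` are hypothetical stand-ins for the expensive and cheap models, and the budget split is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical high- and low-fidelity models of the same quantity of interest.
f_hi = lambda z: np.sin(z) + 0.05 * z**2   # "expensive" model
f_lo = lambda z: np.sin(z)                 # cheap, strongly correlated surrogate

n_hi, n_lo = 100, 10_000                   # budget split between the two models
z_hi = rng.normal(size=n_hi)               # shared inputs, run through both models
z_lo = rng.normal(size=n_lo)               # extra inputs, cheap model only

y_hi, y_lo_shared = f_hi(z_hi), f_lo(z_hi)

# Control-variate coefficient: alpha = cov(f_hi, f_lo) / var(f_lo).
alpha = np.cov(y_hi, y_lo_shared)[0, 1] / np.var(y_lo_shared, ddof=1)

# Multifidelity estimate of E[f_hi]: the small high-fidelity sample mean,
# corrected by a well-resolved minus a coarse low-fidelity mean.
mf_estimate = y_hi.mean() + alpha * (f_lo(z_lo).mean() - y_lo_shared.mean())
```

The stronger the correlation between the models, the more of the high-fidelity variance the correction term removes; the paper's contribution is choosing `n_hi`/`n_lo` optimally from the cost ratio and that correlation.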

2.
In model-based process optimization one uses a mathematical model to optimize a certain criterion, for example the product yield of a chemical process. Models often contain parameters that have to be estimated from data. Typically, a point estimate (e.g. the least squares estimate) is used to fix the model for the optimization stage. However, parameter estimates are uncertain due to incomplete and noisy data. In this article, it is shown how parameter uncertainty can be taken into account in process optimization. To quantify the uncertainty, Markov Chain Monte Carlo (MCMC) sampling, an emerging standard approach in Bayesian estimation, is used. In the Bayesian approach, the solution to the parameter estimation problem is given as a distribution, and the optimization criteria are functions of that distribution. The formulation and implementation of the optimization is studied, and numerical examples are used to show that parameter uncertainty can have a large effect on optimization results.
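The workflow above — sample the posterior with MCMC, then treat the optimization criterion as an expectation over that posterior — can be sketched as follows. The decay model, noise level, and toy yield criterion are illustrative assumptions, not from the article.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data from a first-order decay model y = exp(-k t) with noise.
k_true, sigma = 0.5, 0.05
t = np.linspace(0.2, 4.0, 20)
y = np.exp(-k_true * t) + rng.normal(0.0, sigma, t.size)

def log_post(k):
    # Flat prior on k > 0; Gaussian likelihood.
    if k <= 0:
        return -np.inf
    r = y - np.exp(-k * t)
    return -0.5 * np.sum(r**2) / sigma**2

# Random-walk Metropolis sampling of the posterior of k.
samples, k = [], 1.0
lp = log_post(k)
for _ in range(5000):
    kp = k + rng.normal(0.0, 0.1)
    lpp = log_post(kp)
    if np.log(rng.uniform()) < lpp - lp:   # accept with prob min(1, post ratio)
        k, lp = kp, lpp
    samples.append(k)
samples = np.array(samples[1000:])          # discard burn-in

# Criterion evaluated as a posterior expectation, not at a point estimate.
def yield_at(k, t_stop):                    # toy criterion: product left at t_stop
    return np.exp(-k * t_stop)

expected_yield = yield_at(samples, 1.0).mean()
```

In the optimization stage one would maximize `expected_yield` (or another functional of the posterior) over the design variables, reusing the same posterior sample.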

3.
Numerical simulators are widely used to model physical phenomena and global sensitivity analysis (GSA) aims at studying the global impact of the input uncertainties on the simulator output. To perform GSA, statistical tools based on inputs/output dependence measures are commonly used. We focus here on the Hilbert–Schmidt independence criterion (HSIC). Sometimes, the probability distributions modeling the uncertainty of inputs may themselves be uncertain and it is important to quantify their impact on GSA results. We refer to this here as second-level global sensitivity analysis (GSA2). However, GSA2, when performed with a Monte Carlo double-loop, requires a large number of model evaluations, which is intractable with CPU-time-expensive simulators. To cope with this limitation, we propose a new statistical methodology based on a Monte Carlo single-loop with a limited calculation budget. First, we build a unique sample of inputs and simulator outputs, from a well-chosen probability distribution of inputs. From this sample, we perform GSA for various assumed probability distributions of inputs by using weighted HSIC measure estimators. Statistical properties of these weighted estimators are demonstrated. Subsequently, we define 2nd-level HSIC-based measures between the distributions of inputs and GSA results, which constitute GSA2 indices. The efficiency of our GSA2 methodology is illustrated on an analytical example, comparing several technical options. Finally, an application to a test case simulating a severe accidental scenario on a nuclear reactor is provided.
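A basic (first-level) HSIC sensitivity measure can be computed directly from an input/output sample. The sketch below uses the standard biased V-statistic estimator with Gaussian kernels and the median-distance bandwidth heuristic; the test function is an arbitrary stand-in for a simulator.

```python
import numpy as np

rng = np.random.default_rng(2)

def hsic(x, y):
    """Biased V-statistic estimator of HSIC with Gaussian kernels."""
    n = x.size
    def gram(v):
        d2 = (v[:, None] - v[None, :]) ** 2
        bw = np.median(d2[d2 > 0])          # median heuristic bandwidth
        return np.exp(-d2 / bw)
    K, L = gram(x), gram(y)
    H = np.eye(n) - np.ones((n, n)) / n     # centering matrix
    return np.trace(K @ H @ L @ H) / n**2

n = 200
x1, x2 = rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)
g = x1**2 + 0.1 * x2                        # output depends strongly on x1 only

# HSIC-based GSA: the influential input yields the larger dependence measure.
s1, s2 = hsic(x1, g), hsic(x2, g)
```

The paper's single-loop GSA2 idea amounts to reweighting the terms of such an estimator by likelihood ratios between assumed and sampling input distributions, so the same sample serves every candidate distribution.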

4.
Inserts are commonly used to transfer loads to sandwich composite structures. Local stress concentrations due to inserts are known to cause structural failure, and experimental pull-out tests show that the failure load can vary by 20% between batches of sandwich panels. Clearly, uncertainty in the mechanical properties of the constituent materials needs to be addressed in the design and optimization of sandwich panel inserts. In this paper, we explore the utility of reliability analysis in design, applying Monte Carlo sampling, the First Order Reliability Method (FORM), line sampling, and subset simulation to a one-dimensional model of an insert in a homogenized sandwich panel. We observe that for systems with very low failure probabilities, subset simulation is the most efficient method for calculating the probability of structural failure, but in general, Monte Carlo sampling is more effective than the advanced reliability analysis techniques.
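The baseline against which all these methods are compared is crude Monte Carlo on a limit state. A minimal sketch, with an assumed load/resistance model in place of the panel model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical limit state for an insert pull-out: failure when the applied
# load exceeds the (random) resistance, i.e. when g < 0.
n = 200_000
resistance = rng.normal(10.0, 1.0, n)   # assumed strength distribution
load = rng.normal(6.0, 1.0, n)          # assumed load distribution
g = resistance - load

pf = np.mean(g < 0.0)                   # crude Monte Carlo failure probability
cov = np.sqrt((1 - pf) / (pf * n))      # coefficient of variation of the estimate
```

The `cov` line shows why rare events are expensive for crude Monte Carlo: for a fixed relative error the sample size must grow like `1/pf`, which is what motivates FORM, line sampling, and subset simulation.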

5.
The analysis and optimization of complex multiphysics systems presents a series of challenges that limit the practical use of computational tools. Specifically, the optimization of such systems involves multiple interconnected components with competing quantities of interest and high-dimensional spaces and necessitates the use of costly high-fidelity solvers to accurately simulate the coupled multiphysics. In this paper, we put forth a data-driven framework to address these challenges leveraging recent advances in machine learning. We combine multifidelity Gaussian process regression and Bayesian optimization to construct probabilistic surrogate models for given quantities of interest and explore high-dimensional design spaces in a cost-effective manner. The synergistic use of these computational tools gives rise to a tractable and general framework for tackling realistic multidisciplinary optimization problems. To demonstrate the specific merits of our approach, we have chosen a challenging large-scale application involving the hydrostructural optimization of three-dimensional supercavitating hydrofoils. To this end, we have developed an automated workflow for performing multiresolution simulations of turbulent multiphase flows and multifidelity structural mechanics (combining three-dimensional and one-dimensional finite element results), the results of which drive our machine learning analysis in pursuit of the optimal hydrofoil shape.

6.
The numerical solution of a nonlinear chance constrained optimization problem poses a major challenge. The idea of back-mapping as introduced by M. Wendt, P. Li and G. Wozny in 2002 is a viable approach for transforming chance constraints on output variables (of unknown distribution) into chance constraints on uncertain input variables (of known distribution) based on a monotonicity relation. Once transformation of chance constraints has been accomplished, the resulting optimization problem can be solved by using a gradient-based algorithm. However, the computation of values and gradients of chance constraints and the objective function involves the evaluation of multi-dimensional integrals, which is computationally very expensive. This study proposes an easy-to-use method for analysing monotonic relations between constrained outputs and uncertain inputs. In addition, sparse-grid integration techniques are used to reduce the computational time decisively. Two examples from process optimization under uncertainty demonstrate the performance of the proposed approach.

7.
Reliability-based design of a system often requires the minimization of the probability of system failure over the admissible space for the design variables. For complex systems this probability can rarely be evaluated analytically and so it is often calculated using stochastic simulation techniques, which involve an unavoidable estimation error and significant computational cost. These features make efficient reliability-based optimal design a challenging task. A new method called Stochastic Subset Optimization (SSO) is proposed here for iteratively identifying sub-regions for the optimal design variables within the original design space. An augmented reliability problem is formulated where the design variables are artificially considered as uncertain and Markov Chain Monte Carlo techniques are implemented in order to simulate samples of them that lead to system failure. In each iteration, a set with high likelihood of containing the optimal design parameters is identified using a single reliability analysis. Statistical properties for the identification and stopping criteria for the iterative approach are discussed. For problems that are characterized by small sensitivity around the optimal design choice, a combination of SSO with other optimization algorithms is proposed for enhanced overall efficiency.

8.
The aerodynamic performance of a compressor is highly sensitive to uncertain working conditions. This paper presents an efficient robust aerodynamic optimization method on the basis of nondeterministic computational fluid dynamic (CFD) simulation and a multi-objective genetic algorithm (MOGA). A nonintrusive polynomial chaos method is used in conjunction with an existing well-verified CFD module to quantify the uncertainty propagation in the flow field. This method is validated by comparing with a Monte Carlo method through full 3D CFD simulations on an axial compressor (National Aeronautics and Space Administration rotor 37). On the basis of the validation, the nondeterministic CFD is coupled with a surrogate-based MOGA to search for the Pareto front. The method is applied to a practical engineering problem: the robust aerodynamic optimization of rotor 37 under random outlet static pressure. Two curve angles and two sweep angles at tip and hub are used as design variables. Convergence analysis shows that the surrogate-based MOGA can obtain the Pareto front properly. Significant improvements of both mean and variance of the efficiency are achieved by the robust optimization. Comparison of the robust optimization results with those of the initial design and of a deterministic optimization demonstrates that the proposed method can be applied to turbomachinery successfully. Copyright © 2012 John Wiley & Sons, Ltd.  
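Nonintrusive polynomial chaos treats the solver as a black box: response values at quadrature nodes are projected onto an orthogonal polynomial basis, and the mean and variance follow from the coefficients. A minimal one-variable sketch with a cheap analytic stand-in for the CFD module (the response function and expansion order are arbitrary choices):

```python
import math
import numpy as np

# Stand-in for the expensive model, with one standard-normal uncertain input.
f = lambda xi: np.exp(0.3 * xi)

# Gauss-Hermite quadrature (physicists' convention), mapped to N(0, 1):
# E[g(Z)] = (1/sqrt(pi)) * sum_i w_i g(sqrt(2) x_i).
nodes, weights = np.polynomial.hermite.hermgauss(20)
xi = np.sqrt(2.0) * nodes
w = weights / np.sqrt(np.pi)

def herm(k, x):
    """Probabilists' Hermite polynomial He_k via the three-term recurrence."""
    h0, h1 = np.ones_like(x), x
    if k == 0:
        return h0
    for j in range(1, k):
        h0, h1 = h1, x * h1 - j * h0
    return h1

# Spectral projection: c_k = E[f(Z) He_k(Z)] / E[He_k(Z)^2], E[He_k^2] = k!.
P = 6
coeff = np.array([np.sum(w * f(xi) * herm(k, xi)) / math.factorial(k)
                  for k in range(P + 1)])

pce_mean = coeff[0]                         # mean is the zeroth coefficient
pce_var = sum(coeff[k]**2 * math.factorial(k) for k in range(1, P + 1))
```

For this lognormal response the exact mean and variance are known in closed form, which is how a sketch like this can be checked against Monte Carlo, as the paper does for the full CFD case.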

9.
The evaluation of probabilistic constraints plays an important role in reliability-based design optimization. Traditional simulation methods such as Monte Carlo simulation can provide highly accurate results, but they are often computationally intensive to implement. To improve the computational efficiency of the Monte Carlo method, this article proposes a particle splitting approach, a rare-event simulation technique that evaluates probabilistic constraints. The particle splitting-based reliability assessment is integrated into the iterative steps of design optimization. The proposed method provides an enhancement of subset simulation by increasing sample diversity and producing a stable solution. This method is further extended to address the problem with multiple probabilistic constraints. The performance of the particle splitting approach is compared with the most probable point based method and other approximation methods through examples.

10.
Two types of sampling plans are examined as alternatives to simple random sampling in Monte Carlo studies. These plans are shown to be improvements over simple random sampling with respect to variance for a class of estimators which includes the sample mean and the empirical distribution function.
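The variance-reduction claim is easy to demonstrate for one such plan, proportional stratified sampling with one draw per equal-width stratum (the test integrand is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(4)

# Compare simple random sampling with stratified sampling for estimating
# E[h(U)], U ~ Uniform(0, 1), using the sample mean as the estimator.
h = lambda u: u**2
n, reps = 100, 2000

# Simple random sampling: n i.i.d. uniforms per replication.
srs_means = np.array([h(rng.uniform(0, 1, n)).mean() for _ in range(reps)])

# Stratified sampling: one draw per stratum, u_i ~ Uniform(i/n, (i+1)/n).
strata = (np.arange(n) + rng.uniform(0, 1, (reps, n))) / n
strat_means = h(strata).mean(axis=1)

var_srs = srs_means.var()
var_strat = strat_means.var()
```

For smooth integrands the stratified sample-mean variance falls like `1/n**3` instead of the `1/n` of simple random sampling, so the gap here is dramatic.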

11.
This article focuses on a robust optimization of an aircraft preliminary design under operational constraints. According to engineers' know-how, the aircraft preliminary design problem can be modelled as an uncertain optimization problem whose objective (the cost or the fuel consumption) is almost affine, and whose constraints are convex. It is shown that this uncertain optimization problem can be approximated in a conservative manner by an uncertain linear optimization program, which enables the use of the techniques of robust linear programming of Ben-Tal, El Ghaoui, and Nemirovski [Robust Optimization, Princeton University Press, 2009]. This methodology is then applied to two real cases of aircraft design and numerical results are presented.

12.
Advances in computational performance have led to the development of large-scale simulation tools for design. Systems generated using such simulation tools can fail in service if the uncertainty of the simulation tool's performance predictions is not accounted for. In this research an investigation of how uncertainty can be quantified in multidisciplinary systems analysis subject to epistemic uncertainty associated with the disciplinary design tools and input parameters is undertaken. Evidence theory is used to quantify uncertainty in terms of the uncertain measures of belief and plausibility. To illustrate the methodology, multidisciplinary analysis problems are introduced as an extension to the epistemic uncertainty challenge problems identified by Sandia National Laboratories.

After uncertainty has been characterized mathematically, the designer seeks the optimum design under uncertainty. The measures of uncertainty provided by evidence theory are discontinuous functions. Such non-smooth functions cannot be used in traditional gradient-based optimizers because the sensitivities of the uncertain measures are not properly defined. In this research surrogate models are used to represent the uncertain measures as continuous functions. A sequential approximate optimization approach is used to drive the optimization process. The methodology is illustrated in application to multidisciplinary example problems.
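The belief/plausibility pair at the heart of evidence theory is computed from focal elements and their basic probability assignments. A minimal sketch with an illustrative (made-up) body of evidence for one interval-valued parameter:

```python
# Dempster-Shafer structure: (interval, basic probability assignment) pairs.
# These focal elements are illustrative assumptions, not from the paper.
focal = [((0.0, 2.0), 0.5),
         ((1.0, 3.0), 0.3),
         ((2.5, 4.0), 0.2)]

def belief(a, b):
    """Total mass of focal elements entirely contained in [a, b]."""
    return sum(m for (lo, hi), m in focal if a <= lo and hi <= b)

def plausibility(a, b):
    """Total mass of focal elements that intersect [a, b]."""
    return sum(m for (lo, hi), m in focal if hi >= a and lo <= b)

bel = belief(0.0, 2.0)        # only the first element is fully contained
pl = plausibility(0.0, 2.0)   # the first and second elements intersect
```

Belief and plausibility bracket the unknown probability of the event, and both jump discontinuously as the event boundary crosses a focal element edge, which is exactly the non-smoothness that forces the surrogate-based optimization described above.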

13.
In this paper, the most conservative Tsai–Wu failure envelopes are obtained for laminated composites considering material as well as ply angle uncertainty. The uncertainty analysis is performed using Monte Carlo simulation (MCS). The obtained failure envelopes are then used as the constraint functions in a minimum weight design optimization problem solved using particle swarm optimization (PSO). Results show an increase in laminate weight relative to the deterministic design, ranging from 4% to 50% depending on the stacking sequence and loading condition. Substantial effects of uncertainty on the failure envelope and optimal design are quantified.

14.
A slat track, a structural component of an aircraft wing that transfers aerodynamic loads, can exhibit excessive displacement levels under operational excitation if not properly designed. The design parameter values are not always precisely known and can contain some level of uncertainty due to, for example, dimensional variation. During the different optimization approaches, the slat track geometry is optimized in order to limit the maximum vertical displacement, taking into account the variability of the design parameters. Different optimal, robust and generalized optimization approaches are presented, compared and applied to the slat track finite element model, using mean and variance response functions to model the uncertainty on the finite element displacement values. In addition to validating different objective function statements, the accuracy and practicality of the different response function models, based on regression techniques, Monte Carlo simulation, optimization, transmissibilities, and vibration reduction over a frequency range, are also compared. Copyright © 2008 John Wiley & Sons, Ltd.

15.
The primary goal of robust parameter design (RPD) is to determine the optimum operating conditions that achieve process performance targets while minimizing variability in the results. To achieve this goal, typical approaches to RPD problems use ordinary least squares methods to obtain response functions for the mean and variance by assuming that the experimental data follow a normal distribution and are relatively free of contaminants or outliers. Consequently, the most common estimators used in the initial tier of estimation are the sample mean and sample variance, as they are very good estimators when these assumptions hold. However, it is often the case that such assumed conditions do not exist in practice; notably, that inherent asymmetry pervades system outputs. If unaccounted for, such conditions can affect results tremendously by causing the quality of the estimates obtained using the sample mean and standard deviation to deteriorate. Focusing on asymmetric conditions, this paper examines several highly efficient estimators as alternatives to the sample mean and standard deviation. We then incorporate these estimators into RPD modeling and optimization approaches to ascertain which estimators tend to yield better solutions when skewness exists. Monte Carlo simulation and numerical studies are used to substantiate and compare the performance of the proposed methods with the traditional approach. Copyright © 2012 John Wiley & Sons, Ltd.
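The effect of skewness on the first-tier estimators is easy to see numerically. The sketch below contrasts the sample mean/standard deviation with two classic robust alternatives (median and MAD-based scale); the lognormal output model is an illustrative assumption, and these particular estimators are stand-ins for whichever efficient estimators the paper examines.

```python
import numpy as np

rng = np.random.default_rng(5)

# Right-skewed process output (lognormal with median 1).
y = rng.lognormal(mean=0.0, sigma=1.0, size=5000)

mean_, std_ = y.mean(), y.std(ddof=1)
median_ = np.median(y)
# 1.4826 makes the MAD-based scale consistent for sigma at the normal model.
mad_scale = 1.4826 * np.median(np.abs(y - median_))

# Under right skew the mean is dragged above the median, and a few large
# outputs inflate the sample standard deviation relative to the MAD scale.
```

Plugging such location/scale estimates into the mean and variance response surfaces is the "initial tier" the abstract refers to; the downstream RPD optimum inherits whatever distortion the estimators carry.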

16.
An approximate importance sampling method for structural reliability calculation and its application
This paper proposes an approximate importance sampling method for computing structural reliability. The method combines the importance sampling technique of structural reliability calculation with the approximate reanalysis techniques of structural optimization, and introduces an error-band device, significantly reducing the computational effort of the Monte Carlo method. Application of the method to serviceability-limit-state reliability analysis demonstrates its effectiveness for the reliability calculation of complex structures.
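The importance sampling step can be sketched for a one-variable problem in standard-normal space, with the sampling density shifted to the design point; the reliability index and sample size are arbitrary choices, and the approximate-reanalysis and error-band parts of the paper's method are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(6)

# Reliability problem in standard-normal space: failure when
# g(u) = beta - u < 0, so the exact failure probability is Phi(-beta).
beta, n = 3.5, 5000
u = rng.normal(beta, 1.0, n)                # proposal centered at the design point

phi = lambda x, m: np.exp(-0.5 * (x - m)**2) / np.sqrt(2 * np.pi)
w = phi(u, 0.0) / phi(u, beta)              # likelihood-ratio weights
pf_is = np.mean((beta - u < 0.0) * w)       # importance sampling estimate

# Crude Monte Carlo with the same budget sees almost no failures at this level.
pf_mc = np.mean(rng.normal(0.0, 1.0, n) > beta)
```

With 5000 samples the importance sampling estimate resolves a probability of about 2.3e-4 to a few percent, while crude Monte Carlo at the same budget observes only a handful of failures, which is the cost gap the paper's approximate method targets.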

17.
A novel procedure for estimating the relative importance of uncertain parameters of a complex FE model is presented. The method is specifically directed toward problems involving high-dimensional input parameter spaces, as encountered during uncertainty analysis of large-scale, refined FE models. In these cases one is commonly faced with thousands of uncertain parameters, and traditional techniques, e.g. finite difference or direct differentiation methods, become expensive. In contrast, the presented method quickly filters out the most influential variables. Hence, the main objective is not to compute the sensitivity but to identify those parameters whose random variations have the biggest influence on the response. This is achieved by generating a set of samples with direct Monte Carlo simulation, which are closely scattered around the point at which the relative importance measures are sought. From these samples, estimators of the relative importance are synthesized and the most important ones are refined with a method of choice. In this paper, the underlying theory as well as the resulting algorithm are presented.

18.
The aim of this paper was to present a topology optimization methodology for obtaining robust designs insensitive to small uncertainties in the geometry. The variations are modeled using a stochastic field. The model can represent spatially varying geometry imperfections in devices produced by etching techniques. Because of under-etching or over-etching, parts of the structure may become thinner or thicker than a reference design supplied to the manufacturer. The uncertainties are assumed to be small and their influence on the system response is evaluated using perturbation techniques. Under the above assumptions, the proposed algorithm provides a computationally cheap alternative to previously introduced stochastic optimization methods based on Monte Carlo sampling. The method is demonstrated on the design of a minimum compliance cantilever beam and a compliant mechanism. Copyright © 2012 John Wiley & Sons, Ltd.

19.
Coherent distortion risk measures are applied to capture the possible violation of a restriction in linear optimization problems whose parameters are uncertain. Each risk constraint induces an uncertainty set of coefficients, which is proved to be a weighted-mean trimmed region. Thus, given a sample of the coefficients, an uncertainty set is a convex polytope that can be exactly calculated. We construct an efficient geometrical algorithm to solve stochastic linear programs that have a single distortion risk constraint. The algorithm is available as an R-package. The algorithm's asymptotic behavior is also investigated when the sample is i.i.d. from a general probability distribution. Finally, we present some computational experience.
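A concrete instance of such a risk constraint uses CVaR, the best-known coherent distortion risk measure, estimated empirically from a coefficient sample. This is a feasibility-check sketch for one candidate decision, with made-up coefficient distributions, not the paper's geometrical algorithm or its R package.

```python
import numpy as np

rng = np.random.default_rng(7)

# Empirical CVaR of the constraint slack a(omega)^T x - b, from a sample
# of uncertain coefficient rows a(omega).
n, alpha = 10_000, 0.95
A = rng.normal([1.0, 2.0], [0.2, 0.3], size=(n, 2))   # sampled coefficient rows
x = np.array([1.0, 1.0])                              # candidate decision
b = 4.5

slack = A @ x - b                                     # violation when > 0
tail = np.sort(slack)[int(np.ceil(alpha * n)):]       # worst (1 - alpha) share
cvar = tail.mean()                                    # empirical CVaR_alpha

# The risk constraint CVaR_alpha(a^T x - b) <= 0 holds for this x iff cvar <= 0.
feasible = cvar <= 0.0
```

Because the empirical CVaR is an average over the worst sample points, the set of `x` satisfying the constraint is a convex polyhedron, which is the sample-based uncertainty-set view the abstract describes.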

20.
When the Xinanjiang model is used for hydrological simulation, shortcomings of the model itself, the large number of parameters, and the limited amount of information can make the calibrated optimal parameter set non-unique and unstable. Conventional parameter calibration yields only a single parameter set and cannot reflect this uncertainty. The SCEM-UA algorithm, based on Markov chain Monte Carlo (MCMC) theory, is therefore applied to parameter optimization and uncertainty assessment of the Xinanjiang model, using data from 36 typical flood events at a 1 h time step in the Shuangpai basin. The results show that the algorithm can properly infer the posterior probability distributions of the Xinanjiang model parameters; analysis of the calibration and validation results also shows that applying SCEM-UA to optimization and uncertainty assessment of the Xinanjiang model is effective and feasible.

