Similar Documents
20 similar documents retrieved (search time: 78 ms)
1.
In this paper, we propose a new likelihood-based methodology to represent epistemic uncertainty described by sparse point and/or interval data for input variables in uncertainty analysis and design optimization problems. A worst-case maximum likelihood-based approach is developed for the representation of epistemic uncertainty, which is able to estimate the distribution parameters of a random variable described by sparse point and/or interval data. This likelihood-based approach is general and is able to estimate the parameters of any known probability distribution. The likelihood-based representation of epistemic uncertainty is then used in an existing framework for robustness-based design optimization to achieve computational efficiency. The proposed uncertainty representation and design optimization methodologies are illustrated with two numerical examples: a mathematical problem and a real engineering problem.
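As a rough illustration of how such a likelihood can be built from mixed data (a minimal sketch with made-up data, not the authors' implementation), the snippet below fits a normal distribution by maximizing a likelihood in which each point observation contributes a density value and each interval [a, b] contributes the probability mass F(b) − F(a):

```python
import numpy as np
from scipy import stats, optimize

# Hypothetical sparse data: a few point observations and a few intervals.
points = np.array([4.8, 5.3, 5.1])
intervals = np.array([[4.5, 5.5], [4.9, 6.0]])  # each row is [lower, upper]

def neg_log_likelihood(theta):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)  # keep sigma positive
    # Point data contribute log-densities.
    ll = stats.norm.logpdf(points, mu, sigma).sum()
    # Interval data contribute the log of the probability mass inside [a, b].
    mass = (stats.norm.cdf(intervals[:, 1], mu, sigma)
            - stats.norm.cdf(intervals[:, 0], mu, sigma))
    ll += np.log(np.clip(mass, 1e-300, None)).sum()
    return -ll

res = optimize.minimize(neg_log_likelihood, x0=[points.mean(), 0.0],
                        method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"MLE under sparse point + interval data: mu={mu_hat:.3f}, sigma={sigma_hat:.3f}")
```

The same construction applies to any distribution family with a tractable CDF, which is what makes this kind of approach general.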

2.
Uncertainty quantification accuracy of system performance has an important influence on the results of reliability-based design optimization (RBDO). A new uncertainty identification and quantification methodology is proposed that considers strong statistical variables, sparse variables, and interval variables simultaneously. Maximum likelihood estimation and the Akaike information criterion (AIC) are used to identify the best-fitted distribution types and distribution parameters of the sparse variables. The interval variables are represented with evidence theory. Finally, a unified uncertainty quantification framework considering the three types of uncertain design variables is put forward, and the failure probability of the system performance is quantified with belief and plausibility measures. A Kriging metamodel and random sampling are used to reduce the computational complexity. Three examples are presented to verify the effectiveness of the proposed methodology.
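A minimal sketch of the distribution-identification step is shown below, assuming a made-up sparse sample and a small candidate set: each candidate is fitted by maximum likelihood with scipy and the type with the smallest AIC = 2k − 2 ln L is selected. This is only one plausible reading of the step, not the paper's code.

```python
import numpy as np
from scipy import stats

# Hypothetical sparse sample for one design variable.
data = np.array([2.1, 2.6, 2.4, 3.0, 2.2, 2.8, 2.5])

candidates = {"normal": stats.norm,
              "lognormal": stats.lognorm,
              "weibull": stats.weibull_min}
results = {}
for name, dist in candidates.items():
    params = dist.fit(data)                   # maximum likelihood fit
    log_l = dist.logpdf(data, *params).sum()  # maximized log-likelihood
    k = len(params)                           # number of fitted parameters
    results[name] = 2 * k - 2 * log_l         # AIC = 2k - 2 ln L

best = min(results, key=results.get)
print("AIC values:", {n: round(v, 2) for n, v in results.items()})
print("Best-fitted distribution type:", best)
```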

3.
Traditional formulations of reliability optimization problems assume that the model coefficients are known fixed quantities, so the reliability design problem is treated as a deterministic optimization problem. Because the optimal design of system reliability is resolved at the same stage as the overall system design, the model coefficients are highly uncertain and imprecise during the design phase, and it is usually very difficult to determine precise values for them. However, these coefficients can be roughly given as intervals of confidence.

In this paper, we formulate the reliability optimization problem as a nonlinear goal programming problem with interval coefficients and develop a genetic algorithm to solve it. The key issue is how to evaluate each solution with interval data. We give a new definition of deviation variables that takes interval relations into account. A numerical example is given to demonstrate the efficiency of the proposed approach.


4.
An overview of a comprehensive framework is given for estimating the predictive uncertainty of scientific computing applications. The framework is comprehensive in the sense that it treats both types of uncertainty (aleatory and epistemic), incorporates uncertainty due to the mathematical form of the model, and provides a procedure for including estimates of numerical error in the predictive uncertainty. Aleatory (random) uncertainties in model inputs are treated as random variables, while epistemic (lack-of-knowledge) uncertainties are treated as intervals with no assumed probability distributions. Approaches for propagating both types of uncertainty through the model to the system response quantities of interest are briefly discussed. Numerical approximation errors (due to discretization, iteration, and computer round-off) are estimated using verification techniques, and the conversion of these errors into epistemic uncertainties is discussed. Model form uncertainty is quantified using (a) model validation procedures, i.e., statistical comparisons of model predictions to available experimental data, and (b) extrapolation of this uncertainty structure to points in the application domain where experimental data do not exist. Finally, methods for conveying the total predictive uncertainty to decision makers are presented. The different steps in the predictive uncertainty framework are illustrated using a simple example in computational fluid dynamics applied to a hypersonic wind tunnel.
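The segregated treatment of the two uncertainty types can be illustrated with a minimal nested-loop sketch (toy response function and made-up input models, not the framework itself): the interval-valued epistemic input is swept in an outer loop, the aleatory input is sampled in an inner loop, and the result is an interval of possible values for the response statistic rather than a single number.

```python
import numpy as np

rng = np.random.default_rng(0)

def response(x_aleatory, e_epistemic):
    # Toy system response; stands in for the simulation model.
    return 3.0 * x_aleatory + e_epistemic ** 2

# Aleatory input: random variable; epistemic input: interval with no distribution.
epistemic_interval = (0.5, 2.0)
n_outer, n_inner = 21, 20000

means = []
for e in np.linspace(*epistemic_interval, n_outer):  # outer sweep over the interval
    x = rng.normal(10.0, 1.5, n_inner)               # inner sampling of the aleatory input
    means.append(response(x, e).mean())

# The epistemic interval maps to an interval of possible mean responses.
print(f"Bounds on the mean response: [{min(means):.2f}, {max(means):.2f}]")
```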

5.
6.
Uncertainties exist widely in products and systems. In general, uncertainties are classified as epistemic or aleatory. This paper proposes a unified uncertainty analysis (UUA) method based on the mean value first order saddlepoint approximation (MVFOSPA), denoted MVFOSPA-UUA, to estimate the system's probabilities of failure considering both epistemic and aleatory uncertainties simultaneously. In this method, input parameters with epistemic uncertainty are modeled using interval variables, while input parameters with aleatory uncertainty are modeled using probability distributions (random variables). In order to calculate the lower and upper bounds of the system's probabilities of failure, both the best-case and the worst-case scenarios of the system performance function need to be considered, and the proposed MVFOSPA-UUA method can handle these two cases easily. The proposed method is demonstrated to be more efficient, more robust, and in some situations more accurate than existing methods such as uncertainty analysis based on the first order reliability method. The method is demonstrated using several examples.

7.
The uncertainty information of design variables is contained in the available representation data, and representation data from different sources differ. Therefore, this paper proposes a nonparametric uncertainty representation method for design variables with different insufficient data from two sources. The Gaussian interpolation model for sparse sampling points and/or sparse sampling intervals from a single source is constructed by maximizing the logarithmic likelihood function of the insufficient data. The weight ratios of the probability density values at the sampling points are optimized by minimizing the total deviation of the fusion model, and the fusion Gaussian model is constructed from the weighted sum of the optimal probability density values at the sampling points of Source 1 and Source 2. The methodology is extended to five different fusion conditions, including the fusion of uncertain distribution parameters and the fusion of insufficient data with interval data. Five application examples are presented to verify the effectiveness of the proposed methodology.
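One plausible reading of the fusion step, sketched below with made-up data, fits a Gaussian to each source by maximum likelihood and then chooses the weight of the two densities by minimizing a total squared deviation of the fused density from both source models at the pooled sampling points; the deviation criterion and variable names are assumptions, not the paper's exact formulation.

```python
import numpy as np
from scipy import stats, optimize

source1 = np.array([9.6, 10.2, 10.5, 9.9])    # hypothetical sparse data, Source 1
source2 = np.array([10.8, 11.1, 10.4, 11.5])  # hypothetical sparse data, Source 2

# Per-source maximum likelihood Gaussian models.
g1 = stats.norm(*stats.norm.fit(source1))
g2 = stats.norm(*stats.norm.fit(source2))
x_all = np.concatenate([source1, source2])

def total_deviation(w):
    # Deviation of the weighted fusion density from both source densities
    # at the pooled sampling points (one possible deviation measure).
    fused = w * g1.pdf(x_all) + (1 - w) * g2.pdf(x_all)
    return np.sum((fused - g1.pdf(x_all)) ** 2 + (fused - g2.pdf(x_all)) ** 2)

w_opt = optimize.minimize_scalar(total_deviation, bounds=(0.0, 1.0), method="bounded").x
print(f"Optimized fusion weight for Source 1: {w_opt:.3f}")
```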

8.

To improve the efficiency of solving design optimization problems under uncertainty, a gradient-based optimization framework is proposed that combines dimension-adaptive polynomial chaos expansion (PCE) and sensitivity analysis. The dimension-adaptive PCE is used to quantify the quantities of interest (e.g., reliability and robustness metrics) and their sensitivities. The dimension-adaptive property is inherited from the dimension-adaptive sparse grid used to evaluate the PCE coefficients. Robustness metrics, expressed as statistical moments, and their gradients with respect to the design variables are easily derived from the PCE, whereas evaluating the reliability and its gradient requires integration. To quantify the reliability, the framework uses the Heaviside step function to identify the failure domain and computes the integral by Monte Carlo simulation with the performance function replaced by the PCE. The PCE is further combined with Taylor expansion and finite differences to compute the reliability sensitivity. Since the design vector may affect the sample set determined by the dimension-adaptive sparse grid, the update of the sample set is controlled by the norm variation of the design vector. The optimization framework combines reliability and robustness quantification, sensitivity analysis, and the optimization module. The accuracy and efficiency of the reliability quantification, as well as of the reliability sensitivity, are verified through a mathematical example, a system of springs, and a cantilever beam. The effectiveness of the framework in solving optimization problems is validated by a multiple-limit-state example, a truss optimization example, an airfoil optimization example, and an ONERA M6 wing optimization problem. The results demonstrate that the framework obtains accurate solutions at a manageable computational cost.
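A stripped-down, one-dimensional sketch of the reliability step is given below (toy limit state, plain least-squares Hermite PCE, no dimension adaptivity or sparse grid): the surrogate replaces the expensive model inside a Monte Carlo loop, and the failure probability is the sample mean of the failure indicator (the Heaviside step of −g).

```python
import numpy as np
from numpy.polynomial import hermite_e as He

rng = np.random.default_rng(1)

def limit_state(xi):
    # Toy limit state g(xi); failure is g < 0. Stands in for the expensive model.
    return 6.0 - 1.5 * xi - 0.4 * xi ** 2

# 1) Build a degree-4 Hermite PCE by least squares on a small training set.
xi_train = rng.standard_normal(50)
V = He.hermevander(xi_train, 4)                  # probabilists' Hermite basis
coeffs, *_ = np.linalg.lstsq(V, limit_state(xi_train), rcond=None)

# 2) Monte Carlo on the cheap surrogate: Pf = E[Heaviside(-g_PCE)].
xi_mc = rng.standard_normal(1_000_000)
g_pce = He.hermevander(xi_mc, 4) @ coeffs
pf = np.mean(g_pce < 0.0)                        # failure indicator / Heaviside step
print(f"Estimated failure probability from the PCE surrogate: {pf:.4e}")
```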


9.
In this paper, a simple but efficient concept of the epistemic reliability index (ERI) is introduced for sampling uncertainty in input random variables under conditions where the input variables are independent Gaussian and the samples are unbiased. The increased uncertainty due to the added epistemic uncertainty requires a higher level of target reliability, which is called the conservative reliability index (CRI). In this paper, it is assumed that the CRI can be additively decomposed into an aleatory part (the target reliability index) and an epistemic part (the ERI). It is shown theoretically and numerically that the ERI remains the same for different designs, which is critically important for computational efficiency in reliability-based design optimization. Novel features of the proposed ERI include: (a) a double-loop uncertainty quantification is unnecessary for handling both aleatory and epistemic uncertainty; (b) the effects of the two sources of uncertainty can be separated so that designers can better understand the optimization outcome; and (c) the ERI needs to be calculated only once and remains the same throughout the design process. The proposed method is demonstrated with two analytical examples and one numerical example.

10.
In this paper, we consider storage loading problems under uncertainty where the storage area is organized in fixed stacks with a limited height. Such problems appear in several practical applications, e.g., when loading container terminals, container ships, or warehouses. Incoming items arriving at a partly filled storage area have to be assigned to stacks under the restriction that not every item may be stacked on top of every other item, taking into account that some items with uncertain data will arrive later. Following the robust optimization paradigm, we propose different MIP formulations for the strictly robust and adjustable robust counterparts of the uncertain problem. Furthermore, we show that in the case of interval uncertainties the computational effort to find adjustable robust solutions can be reduced. Computational results are presented for randomly generated instances with up to 480 items. The results show that instances of this size can be solved in reasonable time and that including robustness improves upon solutions in which uncertainty is not taken into account.
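For concreteness, here is a minimal deterministic core of such a stack-assignment MIP written with PuLP; the data, the simplified pairwise compatibility constraint (incompatible items may not share a stack, ignoring vertical order), and the objective of minimizing the number of opened stacks are assumptions for illustration, and the strictly/adjustable robust counterparts in the paper add scenario and recourse structure on top of a model of this kind.

```python
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary, value

items = range(6)
stacks = range(3)
height_limit = 3
# Hypothetical incompatibilities: item a may not share a stack with item b.
incompatible = [(0, 1), (2, 4), (3, 5)]

prob = LpProblem("storage_loading", LpMinimize)
x = LpVariable.dicts("x", (items, stacks), cat=LpBinary)  # item i placed in stack s
y = LpVariable.dicts("y", stacks, cat=LpBinary)           # stack s is used

prob += lpSum(y[s] for s in stacks)                       # minimize opened stacks
for i in items:
    prob += lpSum(x[i][s] for s in stacks) == 1           # every item is placed
for s in stacks:
    prob += lpSum(x[i][s] for i in items) <= height_limit * y[s]
    for (i, j) in incompatible:                           # simplified stacking restriction
        prob += x[i][s] + x[j][s] <= 1

prob.solve()
for s in stacks:
    print(f"stack {s}:", [i for i in items if value(x[i][s]) > 0.5])
```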

11.
A new sparse grid based method for uncertainty propagation
Current methods for uncertainty propagation suffer from limitations in providing accurate and efficient solutions to high-dimensional problems with interactions among random variables. The sparse grid technique, originally invented for numerical integration and interpolation, is extended to uncertainty propagation in this work to overcome this difficulty. The concept of Sparse Grid Numerical Integration (SGNI) is extended for estimating the first two moments of performance in robust design, while Sparse Grid Interpolation (SGI) is employed to determine the failure probability by interpolating the limit-state function at the Most Probable Point (MPP) in reliability analysis. The proposed methods are demonstrated on high-dimensional mathematical examples with notable variate interactions and on a multidisciplinary rocket design problem. Results show that the sparse grid methods work better than popular counterparts. Furthermore, the automatic sampling, the special interpolation process, and the dimension-adaptivity feature make SGI more flexible and efficient than uniform-sample-based metamodeling techniques.
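The moment-estimation idea behind SGNI can be sketched with ordinary Gauss-Hermite quadrature; the toy example below uses a full tensor grid in two dimensions, whereas the sparse (Smolyak-type) grid in the paper is precisely what keeps the number of nodes manageable as the dimension grows. The performance function is made up.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

def performance(x1, x2):
    # Toy performance function of two independent standard normal inputs.
    return x1 ** 2 + 0.5 * x1 * x2 + 3.0

# 1-D probabilists' Gauss-Hermite nodes/weights; raw weights sum to sqrt(2*pi).
nodes, weights = hermegauss(7)
weights = weights / np.sqrt(2.0 * np.pi)   # normalize to a probability measure

# Full tensor grid in 2-D (a sparse grid would thin this out in high dimensions).
X1, X2 = np.meshgrid(nodes, nodes)
W = np.outer(weights, weights)

mean = np.sum(W * performance(X1, X2))
second = np.sum(W * performance(X1, X2) ** 2)
std = np.sqrt(second - mean ** 2)
print(f"Quadrature estimate: mean={mean:.4f}, std={std:.4f}")
```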

12.
The reliability analysis approach based on combined probability and evidence theory is studied in this paper to address reliability analysis problems involving both aleatory uncertainties and epistemic uncertainties with flexible intervals (interval bounds that are either fixed or vary as functions of other independent variables). In the standard mathematical formulation of reliability analysis under mixed uncertainties with combined probability and evidence theory, the key is to calculate the failure probabilities of the upper and lower limits of the system response function as the epistemic uncertainties vary in each focal element. Based on measure theory, it is proved that the aforementioned upper and lower limits of the system response function are measurable under certain circumstances (the system response function is continuous and the flexible interval bounds satisfy certain conditions), and accordingly they can be treated as random variables. Thus the reliability analysis of the system response under mixed uncertainties can be treated directly as a probability calculation problem and solved by existing well-developed and efficient probabilistic methods. The popular probabilistic reliability analysis method FORM (First Order Reliability Method) is taken as an example to illustrate how to extend such methods to the mixed-uncertainty situation. The efficacy of the proposed method is demonstrated with two numerical examples and one practical satellite conceptual design problem.
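As a reference point for the probabilistic building block being extended, the sketch below implements plain FORM with the HL-RF iteration in standard normal space for a toy linear limit state (numerical gradients, no epistemic variables); the mixed-uncertainty extension in the paper wraps a computation of this kind inside the evidence-theory structure.

```python
import numpy as np
from scipy import stats

def g(u):
    # Toy limit state in standard normal space; failure is g(u) < 0.
    return 5.0 - u[0] - 2.0 * u[1]

def grad(fun, u, h=1e-6):
    # Central finite-difference gradient.
    return np.array([(fun(u + h * e) - fun(u - h * e)) / (2 * h)
                     for e in np.eye(len(u))])

# HL-RF iteration for the most probable point (MPP).
u = np.zeros(2)
for _ in range(50):
    gv, gr = g(u), grad(g, u)
    u_new = (gr @ u - gv) / (gr @ gr) * gr
    if np.linalg.norm(u_new - u) < 1e-8:
        u = u_new
        break
    u = u_new

beta = np.linalg.norm(u)
print(f"beta = {beta:.4f}, FORM failure probability = {stats.norm.cdf(-beta):.4e}")
```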

13.
14.
This paper develops a new mixed uncertainty robust optimization (MURO) method with both random and interval uncertainties. Existing strategies in the literature treat the system performance as a family of probability distributions as the interval factors vary within their domains. Moreover, the robust design objective and constraints are modeled as combinations of the interval mean and interval deviations of the performance, which cannot offer a quantitative robustness measure of a design. The new MURO method is based on the sensitivity region concept, and a hybrid robustness index is developed to represent the possibility that the uncertain vector lies within the worst-case sensitivity region (WCSR). The proposed index offers a more quantitative and intuitive way to evaluate the robustness of a design. With the hybrid indices, the traditional robust optimization problem can be converted into an ordinary optimization problem with robustness-index constraints. Two numerical examples and two engineering examples with different combinations of interval and random factors demonstrate the applicability and efficiency of the proposed algorithm. The comparison results show that the new method reduces the conservatism of previous methods significantly with less computational effort.

15.
In many information processing tasks, large numbers of unlabeled samples are easy to obtain, but labeling them is very time-consuming and labor-intensive. As an important learning method in machine learning, active learning reduces the cost of manual labeling by selecting the most informative samples for annotation. However, most existing active learning algorithms are classifier-based supervised learning methods, which are not applicable to sample selection when no label information is available at all. To address this problem, drawing on the algorithmic ideas of optimal experimental design and combining them with adaptive sparse neighborhood reconstruction theory, an active learning algorithm based on adaptive sparse neighborhood reconstruction is proposed. The algorithm can adaptively choose the neighborhood size according to the different distributions of the regions of the data set and simultaneously perform the neighborhood search and the computation of the reconstruction coefficients, so that it can select the samples that best represent the distribution structure of the sample set without any label information. Experiments on synthetic and real data sets show that, under the same labeling cost, the proposed active learning algorithm achieves higher classification accuracy and robustness.

16.
Reliability-based design optimization (RBDO) has been widely used to design engineering products that minimize a cost function while meeting reliability constraints. Although uncertainties, such as aleatory uncertainty and epistemic uncertainty, have been well considered in RBDO, they are mainly considered for the model input parameters. Model uncertainty, i.e., the uncertainty of the model bias indicating the inherent inadequacy of the model for representing the real physical system, is typically overlooked in RBDO. This paper addresses the approximation of model uncertainty over a product design space and further integrates the model uncertainty into RBDO. In particular, a copula-based bias modeling approach is proposed, and results are demonstrated with two vehicle design problems.

17.
Algorithms for generating interior points of polygons based on the minimum bounding rectangle (MBR) are prone to failure in degenerate cases. To address this problem, an improved algorithm for automatically generating interior points of polygon data is proposed by introducing uncertainty intervals for the vector data. Degenerate cases are corrected in a unified way by processing the uncertainty intervals and intersection intervals, which avoids the excessive exception handling and branching that the MBR algorithm requires when the cutting line intersects a vertex. Comparative experiments verify the robustness and efficiency of the algorithm.

18.
In engineering design, to achieve high reliability and safety in complex and coupled systems (e.g., multidisciplinary systems), Reliability-Based Multidisciplinary Design Optimization (RBMDO) has received increasing attention. If there are sufficient data on the uncertainties to construct the probability distribution of each input variable, RBMDO can deal with the problem efficiently. However, both Aleatory Uncertainty (AU) and Epistemic Uncertainty (EU) are present in most Multidisciplinary Systems (MS). In this situation, the results of RBMDO will be unreliable or risky because there are insufficient data (owing to limits on time, money, etc.) to precisely construct the probability distributions of the EU. This paper proposes formulations of Mixed Variable (random and fuzzy variables) Multidisciplinary Design Optimization (MVMDO) and a method of MVMDO within the framework of Sequential Optimization and Reliability Assessment (MVMDO-SORA). The MVMDO formulation overcomes the difficulties caused by insufficient uncertainty information, and the proposed method enables designers to solve MDO problems in the presence of both AU and EU. In addition, the proposed method efficiently reduces the computational demand. Examples are used to demonstrate the proposed formulations and the efficiency of MVMDO-SORA.

19.
In this paper, a novel algorithm is proposed to achieve robust high-resolution detection in sparse multipath channels. Currently used sparse reconstruction techniques are not immediately applicable to multipath channel modeling. The performance of standard compressed sensing formulations based on discretization of the multipath channel parameter space degrades significantly when the actual channel parameters deviate from the assumed discrete set of values. To alleviate this off-grid problem, we use particle swarm optimization (PSO) to perturb the grid points that reside in each multipath component cluster. Orthogonal matching pursuit (OMP) is used to reconstruct the sparse multipath components in a greedy fashion. Extensive simulation results quantify the performance gain and robustness obtained by the proposed algorithm against the off-grid problem faced in sparse multipath channels.
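A minimal sketch of the OMP building block on a fixed grid is shown below, with a random dictionary standing in for the discretized multipath atoms and made-up measurements; the PSO-based perturbation of the grid points that the paper adds to handle off-grid parameters is not included.

```python
import numpy as np

rng = np.random.default_rng(2)

# Dictionary of discretized channel atoms and a 3-sparse ground truth.
n_obs, n_atoms, sparsity = 40, 120, 3
A = rng.standard_normal((n_obs, n_atoms))
A /= np.linalg.norm(A, axis=0)
x_true = np.zeros(n_atoms)
x_true[rng.choice(n_atoms, sparsity, replace=False)] = rng.standard_normal(sparsity)
y = A @ x_true + 0.01 * rng.standard_normal(n_obs)

# Orthogonal matching pursuit: greedily pick the atom most correlated with the residual,
# then re-fit all selected atoms by least squares.
support, residual = [], y.copy()
for _ in range(sparsity):
    support.append(int(np.argmax(np.abs(A.T @ residual))))
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    residual = y - A[:, support] @ coef

x_hat = np.zeros(n_atoms)
x_hat[support] = coef
print("recovered support:", sorted(support))
print("true support:     ", sorted(np.flatnonzero(x_true)))
```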

20.
In order to analyze the effect of the epistemic uncertainty in the distribution parameters of random variables on the safety of a structural system, a novel sensitivity measure of the failure probability is constructed by integrating the derivative of the failure probability over the parameter space. Compared with the variance-based sensitivity index, the new derivative-based sensitivity measure can be evaluated at a lower computational cost, and it is observed that its ranking is the same as that of the variance-based one. For the variance-based sensitivity index, whose computational cost is large, the standard Sobol' method is employed, while the quasi-Monte Carlo method and the double-loop point estimate method are used to compute the derivative-based sensitivity measure for comparison. Four examples demonstrate the reasonableness of the proposed sensitivity measure and the efficiency of the proposed method.
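One standard way to obtain the derivative of the failure probability with respect to a distribution parameter is the score-function (likelihood-ratio) identity dPf/dmu = E[1{g<0} * d ln f(X; mu, sigma)/dmu]; the sketch below uses it with a toy limit state and a normal input as an illustration of the kind of derivative being integrated, not necessarily the estimator used in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
mu, sigma, n = 2.0, 1.0, 2_000_000

def g(x):
    # Toy limit state; failure is g(x) < 0.
    return 5.0 - x ** 2

x = rng.normal(mu, sigma, n)
fail = g(x) < 0.0
pf = fail.mean()

# Score-function estimator: dPf/dmu = E[ 1{g<0} * d ln f(x; mu, sigma)/dmu ],
# where for a normal density d ln f / dmu = (x - mu) / sigma**2.
score_mu = (x - mu) / sigma ** 2
dpf_dmu = np.mean(fail * score_mu)

print(f"Pf = {pf:.4e}, dPf/dmu (score-function estimate) = {dpf_dmu:.4e}")
```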

