Similar Documents (20 results)
1.
Reliability sensitivity analysis is used to find the rate of change in the probability of failure (or reliability) due to changes in distribution parameters such as means and standard deviations. Most existing reliability sensitivity analysis methods assume that all probabilities and distribution parameters are precisely known, that is, that every statistical parameter involved is perfectly determined. In engineering practice, however, there are two types of uncertainty, epistemic and aleatory, and the parameters affected by them may not be perfectly determined. In this paper, both epistemic and aleatory uncertainties are considered in reliability sensitivity analysis and modeled using P-boxes. The proposed method is based on Monte Carlo simulation (MCS), weighted regression, interval analysis and the first-order reliability method (FORM). We linearize the original non-linear limit-state function by MCS rather than by a first-order Taylor series expansion at the most probable point (MPP), because the MPP search is an iterative optimization process. Finally, we introduce an optimization model for sensitivity analysis under both aleatory and epistemic uncertainties. Four numerical examples are presented to demonstrate the proposed method.
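A minimal sketch of the sensitivity estimation step, assuming precisely known independent normal inputs and a hypothetical limit-state function: the failure probability and its derivatives with respect to the means and standard deviations are estimated by plain MCS with score-function (likelihood-ratio) estimators, rather than by the paper's weighted-regression/FORM scheme. Under P-box inputs, these sensitivities would then be bounded by sweeping the distribution parameters over their intervals.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    """Hypothetical limit-state function (failure when g < 0)."""
    return 18.0 - x[:, 0] - 2.0 * x[:, 1]

mu = np.array([5.0, 4.0])      # assumed means
sigma = np.array([1.0, 1.5])   # assumed standard deviations
n = 200_000

x = rng.normal(mu, sigma, size=(n, 2))
fail = (g(x) < 0.0).astype(float)
pf = fail.mean()

# Score-function (likelihood-ratio) sensitivities for independent normals:
# dPf/dmu_i    = E[ 1{g<0} * (X_i - mu_i) / sigma_i^2 ]
# dPf/dsigma_i = E[ 1{g<0} * ((X_i - mu_i)^2 - sigma_i^2) / sigma_i^3 ]
dpf_dmu = (fail[:, None] * (x - mu) / sigma**2).mean(axis=0)
dpf_dsigma = (fail[:, None] * ((x - mu)**2 - sigma**2) / sigma**3).mean(axis=0)

print(f"Pf ~ {pf:.4e}")
print("dPf/dmu    ~", dpf_dmu)
print("dPf/dsigma ~", dpf_dsigma)
```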

2.
An important issue in risk analysis is the distinction between epistemic and aleatory uncertainties. In this paper, the use of distinct representation formats for aleatory and epistemic uncertainties is advocated, the latter being modelled by sets of possible values. Modern uncertainty theories based on convex sets of probabilities are known to be instrumental for hybrid representations where aleatory and epistemic components of uncertainty remain distinct. Simple uncertainty representation techniques based on fuzzy intervals and p-boxes are used in practice. This paper outlines a risk analysis methodology from elicitation of knowledge about parameters to decision. It proposes an elicitation methodology where the chosen representation format depends on the nature and the amount of available information. Uncertainty propagation methods then blend Monte Carlo simulation and interval analysis techniques. Nevertheless, results provided by these techniques, often in terms of probability intervals, may be too complex to interpret for a decision-maker and we, therefore, propose to compute a unique indicator of the likelihood of risk, called confidence index. It explicitly accounts for the decision-maker's attitude in the face of ambiguity. This step takes place at the end of the risk analysis process, when no further collection of evidence is possible that might reduce the ambiguity due to epistemic uncertainty. This last feature stands in contrast with the Bayesian methodology, where epistemic uncertainties on input parameters are modelled by single subjective probabilities at the beginning of the risk analysis process.
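A rough sketch of the hybrid propagation and decision step only (not the elicitation methodology): an aleatory input is sampled by Monte Carlo, an epistemic parameter is propagated as an interval, and the resulting probability interval is condensed with a Hurwicz-style weighting standing in for the confidence index. The model, the interval and the pessimism weight are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def response(x, e):
    """Hypothetical model: aleatory load x, epistemic factor e."""
    return e * x

threshold = 12.0
x = rng.lognormal(mean=2.0, sigma=0.25, size=100_000)   # aleatory input
e_lo, e_hi = 0.9, 1.3                                    # epistemic interval

# Interval propagation per aleatory sample (endpoints suffice here because
# the response is monotone in e; otherwise optimise over the interval).
r_lo = np.minimum(response(x, e_lo), response(x, e_hi))
r_hi = np.maximum(response(x, e_lo), response(x, e_hi))

p_low = np.mean(r_lo > threshold)    # lower bound on P(failure)
p_high = np.mean(r_hi > threshold)   # upper bound on P(failure)

# Hurwicz-style confidence index: alpha = 1 is fully pessimistic.
alpha = 0.7
confidence_index = alpha * p_high + (1.0 - alpha) * p_low
print(f"P(failure) in [{p_low:.3f}, {p_high:.3f}], index = {confidence_index:.3f}")
```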

3.
In this paper, a simple but efficient concept of epistemic reliability index (ERI) is introduced for sampling uncertainty in input random variables under conditions where the input variables are independent Gaussian and the samples are unbiased. The increased uncertainty due to the added epistemic uncertainty requires a higher level of target reliability, which is called the conservative reliability index (CRI). In this paper, it is assumed that CRI can additively be decomposed into the aleatory part (the target reliability index) and the epistemic part (the ERI). It is shown theoretically and numerically that the ERI remains the same for different designs, which is critically important for computational efficiency in reliability-based design optimization. Novel features of the proposed ERI include: (a) it is unnecessary to have a double-loop uncertainty quantification for handling both aleatory and epistemic uncertainty; (b) the effect of the two different sources of uncertainty can be separated so that designers can better understand the optimization outcome; and (c) the ERI needs to be calculated only once and remains the same throughout the design process. The proposed method is demonstrated with two analytical examples and one numerical example.

4.
张保强  陈梅玲  孙东阳  锁斌 《控制与决策》2020,35(10):2459-2465
To address uncertainty quantification and propagation for time-variant systems, a probability-box (p-box) evolution method is proposed. Based on the time-variant behaviour of the system, the evolution over time of the cumulative distribution function (CDF) of the system response is obtained. Epistemic and aleatory uncertain parameters are separated into an outer and an inner layer: the epistemic parameters in the outer layer are quantified by the Monte Carlo method, while the aleatory parameters in the inner layer are quantified by a non-intrusive polynomial chaos expansion based on stochastic collocation. A time-variant p-box is constructed by computing the upper and lower bounds of the response CDF at different time instants. Finally, the effectiveness of the proposed method is verified with a performance-degradation example of a delay circuit. The results show that the time-variant p-box not only characterises the mixed uncertainty of the system at a specific time, but also reflects the time-variant behaviour of the output response and the trend of the output uncertainty over time.
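A minimal double-loop sketch of the time-variant p-box construction, with plain inner Monte Carlo standing in for the stochastic-collocation polynomial chaos expansion; the degradation model, the epistemic interval and all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

def response(t, theta_e, xi):
    """Hypothetical degradation model: epistemic rate theta_e, aleatory noise xi."""
    return 10.0 * np.exp(-theta_e * t) + xi

times = np.linspace(0.0, 5.0, 6)
grid = np.linspace(4.0, 16.0, 200)          # response values at which the CDF is evaluated
theta_lo, theta_hi = 0.05, 0.15             # epistemic interval (outer loop)
n_outer, n_inner = 50, 5_000

pbox = {}
for t in times:
    cdfs = []
    for theta in rng.uniform(theta_lo, theta_hi, n_outer):      # outer: epistemic
        xi = rng.normal(0.0, 0.5, n_inner)                      # inner: aleatory
        y = response(t, theta, xi)
        cdfs.append((y[:, None] <= grid).mean(axis=0))          # empirical CDF
    cdfs = np.array(cdfs)
    pbox[t] = (cdfs.min(axis=0), cdfs.max(axis=0))              # lower/upper CDF bounds

# The gap between the bounds at each time reflects how the epistemic
# contribution to the output uncertainty evolves.
for t, (lo, hi) in pbox.items():
    print(f"t = {t:.1f}: max CDF gap = {np.max(hi - lo):.3f}")
```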

5.
For p-box sensitivity analysis under mixed aleatory and epistemic uncertainty, a global sensitivity analysis method is proposed that uses the overlapping area of the probability box before and after uncertainty reduction as the uncertainty measure. Mixed uncertainty is widespread in aerospace simulation systems, and the p-box approach to representing mixed aleatory and epistemic uncertainty is widely used in the research community. First, the uncertainty-reduction theory of traditional p-box sensitivity analysis is reviewed; on this basis, shifts of the p-box in both location and shape are further taken into account. Then, the influence of each uncertain input is characterised by computing the overlapping area of the p-box before and after reduction, and the implementation steps are described. Finally, the proposed method is compared with and validated against the traditional uncertainty-reduction method on numerical examples, and is applied to sensitivity ranking in overall engine performance simulation. The results show that the proposed area-overlap method has a wider range of applicability and gives more accurate results than the traditional uncertainty-reduction method.
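A sketch of the pinching idea behind this kind of p-box sensitivity ranking, under an assumed toy model: each epistemic input is reduced to a point value in turn, and the change in the p-box band, summarised here by its area and by the area it still shares with the baseline band, indicates that input's influence. The exact overlap measure of the paper may differ.

```python
import numpy as np

rng = np.random.default_rng(3)
grid = np.linspace(-2.0, 14.0, 400)
dy = grid[1] - grid[0]

def pbox(e1_interval, e2_interval, n_outer=60, n_inner=4000):
    """Empirical p-box of y = e1*x1 + e2*x2 with aleatory x1, x2 and
    interval-valued (epistemic) factors e1, e2; the model is hypothetical."""
    cdfs = []
    for _ in range(n_outer):
        e1 = rng.uniform(*e1_interval)
        e2 = rng.uniform(*e2_interval)
        y = e1 * rng.normal(2.0, 0.5, n_inner) + e2 * rng.normal(1.0, 0.3, n_inner)
        cdfs.append((y[:, None] <= grid).mean(axis=0))
    cdfs = np.array(cdfs)
    return cdfs.min(axis=0), cdfs.max(axis=0)

def band_area(lo, hi):
    """Area enclosed between the lower and upper CDF bounds."""
    return (hi - lo).sum() * dy

def overlap_area(box_a, box_b):
    """Area shared by two p-box bands (used here as the overlap measure)."""
    lo = np.maximum(box_a[0], box_b[0])
    hi = np.minimum(box_a[1], box_b[1])
    return np.clip(hi - lo, 0.0, None).sum() * dy

baseline = pbox((0.8, 1.4), (1.5, 2.5))
# "Pinch" one epistemic input at a time to its midpoint and compare p-boxes.
for name, pinched in [("e1", pbox((1.1, 1.1), (1.5, 2.5))),
                      ("e2", pbox((0.8, 1.4), (2.0, 2.0)))]:
    print(f"pinch {name}: band area {band_area(*baseline):.3f} -> {band_area(*pinched):.3f}, "
          f"overlap with baseline = {overlap_area(baseline, pinched):.3f}")
```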

6.
Reliability-based design optimization (RBDO) has been widely used to design engineering products that minimize a cost function while meeting reliability constraints. Although uncertainties, such as aleatory uncertainty and epistemic uncertainty, have been well considered in RBDO, they are mainly considered for model input parameters. Model uncertainty, i.e., the uncertainty of model bias indicating the inherent model inadequacy for representing the real physical system, is typically overlooked in RBDO. This paper addresses model uncertainty approximation in a product design space and further integrates the model uncertainty into RBDO. In particular, a copula-based bias modeling approach is proposed and results are demonstrated by two vehicle design problems.

7.
The reliability analysis approach based on combined probability and evidence theory is studied in this paper to address the reliability analysis problem involving both aleatory uncertainties and epistemic uncertainties with flexible intervals (the interval bounds are either fixed or variable as functions of other independent variables). In the standard mathematical formulation of reliability analysis under mixed uncertainties with combined probability and evidence theory, the key is to calculate the failure probability of the upper and lower limits of the system response function as the epistemic uncertainties vary in each focal element. Based on measure theory, it is proved in this paper that the aforementioned upper and lower limits of the system response function are measurable under certain circumstances (the system response function is continuous and the flexible interval bounds satisfy certain conditions), and accordingly they can be treated as random variables. Thus the reliability analysis of the system response under mixed uncertainties can be treated directly as a probability calculation problem and solved by existing well-developed and efficient probabilistic methods. In this paper the popular probabilistic reliability analysis method FORM (First Order Reliability Method) is taken as an example to illustrate how to extend it to solve the reliability analysis problem in the mixed uncertainty situation. The efficacy of the proposed method is demonstrated with two numerical examples and one practical satellite conceptual design problem.
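A simplified sketch of the mixed probability/evidence-theory calculation, using plain MCS in place of the FORM extension and assuming the response is monotone in the epistemic variable so that interval endpoints give its extreme values; the limit state, focal elements and BPA masses are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

def g(x1, x2, u):
    """Hypothetical limit state: failure when g < 0; u is the epistemic variable."""
    return x1 + 2.0 * x2 - 3.0 * u

# Aleatory variables (probability theory)
n = 200_000
x1 = rng.normal(6.0, 1.0, n)
x2 = rng.normal(4.0, 0.8, n)

# Epistemic variable (evidence theory): focal elements with BPA masses
focal_elements = [((3.0, 4.0), 0.5), ((3.5, 5.0), 0.3), ((4.5, 6.0), 0.2)]

pf_lower, pf_upper = 0.0, 0.0
for (u_lo, u_hi), mass in focal_elements:
    # g is monotonically decreasing in u here, so the interval endpoints give the
    # extreme responses; in general an optimisation over [u_lo, u_hi] is needed.
    g_min = g(x1, x2, u_hi)
    g_max = g(x1, x2, u_lo)
    pf_upper += mass * np.mean(g_min < 0.0)   # plausibility of failure
    pf_lower += mass * np.mean(g_max < 0.0)   # belief of failure

print(f"failure probability bounds: [{pf_lower:.4e}, {pf_upper:.4e}]")
```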

8.
Uncertainty comes in many forms in the real world and is an unavoidable component of human life. Generally, two types of uncertainty arise, namely aleatory and epistemic uncertainty. Probability is a well-established mathematical tool for handling aleatory uncertainty, and fuzzy set theory is a tool for handling epistemic uncertainty. However, in certain situations the parameters of probability distributions may themselves be tainted with epistemic uncertainty, and so these parameters may be treated as fuzzy numbers (possibly of different shapes). A probability box (P-box) can be constructed when the parameters are not precisely known. In this paper, an attempt has been made to construct families of P-boxes when the parameters of probability distributions are bell-shaped or normal fuzzy numbers; from these families of P-boxes, membership functions are generated at different fractiles for different alpha levels.
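A small sketch of the alpha-cut construction, assuming a triangular fuzzy mean and a crisp standard deviation in place of the paper's bell-shaped/normal fuzzy parameters: each alpha level yields an interval for the mean and hence one member of a nested family of P-boxes.

```python
import numpy as np
from scipy.stats import norm

def alpha_cut_triangular(a, m, b, alpha):
    """Alpha-cut interval [lo, hi] of a triangular fuzzy number (a, m, b)."""
    return a + alpha * (m - a), b - alpha * (b - m)

grid = np.linspace(0.0, 20.0, 300)
sigma = 2.0                       # crisp standard deviation (assumed)
a, m, b = 8.0, 10.0, 12.0         # triangular fuzzy mean (hypothetical)

# Each alpha level yields one P-box; together they form a nested family.
for alpha in (0.0, 0.5, 1.0):
    mu_lo, mu_hi = alpha_cut_triangular(a, m, b, alpha)
    # For a normal CDF, the largest mean gives the lower CDF bound and vice versa.
    cdf_upper = norm.cdf(grid, loc=mu_lo, scale=sigma)
    cdf_lower = norm.cdf(grid, loc=mu_hi, scale=sigma)
    print(f"alpha = {alpha:.1f}: mean in [{mu_lo:.1f}, {mu_hi:.1f}], "
          f"max P-box width = {np.max(cdf_upper - cdf_lower):.3f}")
```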

9.
As a powerful design tool, Reliability Based Multidisciplinary Design Optimization (RBMDO) has received increasing attention for satisfying the requirement of high reliability and safety in complex and coupled systems. In many practical engineering design problems, design variables may consist of both discrete and continuous variables. Moreover, both aleatory and epistemic uncertainties may exist. This paper proposes the formulation of RFCDV (Random/Fuzzy Continuous/Discrete Variables) Multidisciplinary Design Optimization (RFCDV-MDO), uncertainty analysis for RFCDV-MDO, and a method of RFCDV-MDO within the framework of Sequential Optimization and Reliability Assessment (RFCDV-MDO-SORA) to solve RFCDV-MDO problems. A mathematical problem and an engineering design problem are used to demonstrate the efficiency of the proposed method.

10.
Robustness-based design optimization under data uncertainty
This paper proposes formulations and algorithms for design optimization under both aleatory (i.e., natural or physical variability) and epistemic uncertainty (i.e., imprecise probabilistic information), from the perspective of system robustness. The proposed formulations deal with epistemic uncertainty arising from both sparse and interval data without any assumption about the probability distributions of the random variables. A decoupled approach is proposed in this paper to un-nest the robustness-based design from the analysis of non-design epistemic variables to achieve computational efficiency. The proposed methods are illustrated for the upper stage design problem of a two-stage-to-orbit (TSTO) vehicle, where the information on the random design inputs is only available as sparse point data and/or interval data. As collecting more data reduces uncertainty but increases cost, the effect of sample size on the optimality and robustness of the solution is also studied. A method is developed to determine the optimal sample size for sparse point data that leads to design solutions least sensitive to variations in the input random variables.

11.
Uncertainty in raw material compositions is a critical issue in blending problems: it can cause the composition of the actual blend to fall outside specification limits. In this paper, aleatory and epistemic uncertainties are handled simultaneously in a blending optimization problem for brass casting. The aleatory and epistemic uncertainties are modeled using probability and possibility theories, respectively. However, probabilistic and possibilistic uncertainties are different in nature, so solving a mathematical model that includes both requires transforming one type of uncertainty into the other. In this study, probabilistic uncertainties are transformed into possibilistic uncertainties following the transformation approaches of Rong and Lahdelma (2008) and of Dubois, Prade, and Sandri (1993) and Dubois, Foulloy, Mauris, and Prade (2004). This transformation converts the original model into a possibilistic model. The possibilistic models obtained from each transformation are then solved using the α-cut approach. The solutions of the two possibilistic models show that the model using Dubois's transformation produces blends with lower cost than the model using Rong and Lahdelma's transformation.
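The probability-to-possibility step can be illustrated with the Dubois-Prade transformation for a discrete distribution, in which each outcome's possibility is the total probability of all outcomes that are no more probable than it; the class probabilities below are hypothetical. The alpha-cuts of the resulting possibility distribution are the nested intervals that the α-cut solution approach then works with.

```python
import numpy as np

def prob_to_poss(p):
    """Dubois-Prade transformation of a discrete probability distribution into
    a possibility distribution: pi_i = sum of all p_j with p_j <= p_i."""
    p = np.asarray(p, dtype=float)
    return np.array([p[p <= pi].sum() for pi in p])

# Hypothetical composition classes for a brass-casting raw material.
p = [0.10, 0.45, 0.30, 0.15]
pi = prob_to_poss(p)
print("probability :", p)
print("possibility :", np.round(pi, 2))   # the most probable class gets possibility 1
```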

12.
The need to differentiate between epistemic and aleatory uncertainties is now well accepted by the risk analysis community. One way to do so is to model aleatory uncertainty by classical probability distributions and epistemic uncertainty by means of possibility distributions, and then propagate them by their respective calculi. The result of this propagation is a random fuzzy variable. When dealing with complex models, the computational cost of such a propagation quickly becomes too high. In this paper, we propose a numerical approach, the Random/Fuzzy (RaFu) method, whose aim is to determine an optimal numerical strategy so that computational costs are reduced to their minimum, using the theoretical frameworks mentioned above. We also give some means to take account of the resulting numerical error. The benefits of the RaFu method are shown by comparing it to previously proposed methods.

13.
In practical engineering design, most data sets for system uncertainties are insufficiently sampled from unknown statistical distributions, a situation known as epistemic uncertainty. Existing methods in uncertainty-based design optimization have difficulty in handling both aleatory and epistemic uncertainties. To tackle design problems involving both epistemic and aleatory uncertainties, reliability-based design optimization (RBDO) is integrated with Bayes' theorem; this is referred to as Bayesian RBDO. However, Bayesian RBDO becomes extremely expensive when employing the first- or second-order reliability method (FORM/SORM) for reliability predictions. This paper therefore proposes the development of a Bayesian RBDO methodology and its integration with a numerical solver, the eigenvector dimension reduction (EDR) method, for Bayesian reliability analysis. The EDR method takes a sensitivity-free approach to reliability analysis, so it is very efficient and accurate compared with other reliability methods such as FORM/SORM. The efficiency and accuracy of the Bayesian RBDO process are substantially improved after this integration.

14.
An overview of a comprehensive framework is given for estimating the predictive uncertainty of scientific computing applications. The framework is comprehensive in the sense that it treats both types of uncertainty (aleatory and epistemic), incorporates uncertainty due to the mathematical form of the model, and provides a procedure for including estimates of numerical error in the predictive uncertainty. Aleatory (random) uncertainties in model inputs are treated as random variables, while epistemic (lack of knowledge) uncertainties are treated as intervals with no assumed probability distributions. Approaches for propagating both types of uncertainties through the model to the system response quantities of interest are briefly discussed. Numerical approximation errors (due to discretization, iteration, and computer round-off) are estimated using verification techniques, and the conversion of these errors into epistemic uncertainties is discussed. Model form uncertainty is quantified using (a) model validation procedures, i.e., statistical comparisons of model predictions to available experimental data, and (b) extrapolation of this uncertainty structure to points in the application domain where experimental data do not exist. Finally, methods for conveying the total predictive uncertainty to decision makers are presented. The different steps in the predictive uncertainty framework are illustrated using a simple example in computational fluid dynamics applied to a hypersonic wind tunnel.

15.
In this paper, a model validation framework is proposed and applied to a large vibro-acoustic finite element (FE) model of a passenger car. The framework introduces a p-box approach with an efficient quantification scheme for the uncertainty sources and a new area metric that is relevant to responses in the frequency domain. To prioritize the input uncertainties of the enormous FE model, experts' knowledge is used to select, from several thousand input parameters, candidate parameters that have a large potential influence on the responses of interest (ROIs). Next, a variance-based sensitivity analysis with an orthogonal array is introduced in an effort to quantify the influence of the selected input parameters on the ROIs. The use of the eigenvector dimension reduction method and orthogonal combinations of interval-valued input parameters provides the p-box of each ROI even when the FE model is very large. A color map and the u-pooling of the p-boxes over the frequency band, as well as the p-box at different frequencies, are introduced to assess the model error and the quantitative contributions of the aleatory and epistemic input uncertainties to the overall variability of the ROIs in the frequency domain. After assessing the model error, the FE model is updated. It was found that the sensitivity results and the experts' knowledge about the associated components effectively determine the modifications of the component models and the input parameter values during the updating process.
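The area metric at a single frequency can be sketched with a crisp model CDF: it is the area between the model CDF and the empirical CDF of the measurements, expressed in the units of the response. With a p-box instead of a single CDF it generalises to the area by which the data ECDF escapes the bounds. The response distribution and measurements below are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def area_metric(model_cdf, data, grid):
    """Area between a model CDF and the empirical CDF of observed data."""
    ecdf = (np.sort(data)[None, :] <= grid[:, None]).mean(axis=1)
    dy = grid[1] - grid[0]
    return np.sum(np.abs(model_cdf(grid) - ecdf)) * dy

# Hypothetical: the model predicts a response level ~ N(100, 4); five test measurements.
grid = np.linspace(85.0, 115.0, 600)
measurements = np.array([103.1, 99.4, 104.8, 101.9, 105.6])
d = area_metric(lambda y: norm.cdf(y, loc=100.0, scale=4.0), measurements, grid)
print(f"area metric = {d:.2f} (same units as the response)")
```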

16.
Linear systems whose coefficients have large uncertainties arise routinely in finite element calculations for structures with uncertain geometry, material properties, or loads. However, a true worst case analysis of the influence of such uncertainties was previously possible only for very small systems and uncertainties, or in special cases where the coefficients do not exhibit dependence. This paper presents a method for computing rigorous bounds on the solution of such systems, with a computable overestimation factor that is frequently quite small. The merits of the new approach are demonstrated by computing realistic bounds for some large, uncertain truss structures, some leading to linear systems with over 5000 variables and over 10000 interval parameters, with excellent bounds for up to about 10% input uncertainty. Also discussed are some counterexamples for the performance of traditional approximate methods for worst case uncertainty analysis.

17.
In reliability analysis, uncertainties in the input variables as well as in the metamodel are often encountered in practice. The input uncertainty includes the statistical uncertainty of the distribution parameters due to lack of knowledge or insufficient data. Metamodel uncertainty arises when the response function is approximated by a surrogate function using a finite number of responses in order to reduce costly computations. In this study, a reliability analysis procedure is proposed based on a Bayesian framework that incorporates these uncertainties in an integrated manner into the form of a posterior PDF. The PDF, often expressed by an arbitrary function, is evaluated via the Markov chain Monte Carlo (MCMC) method, an efficient simulation method for drawing random samples that follow the distribution. In order to avoid the nested computation of the full Bayesian approach, a posterior predictive approach is employed, which requires only a single loop of reliability analysis. A Gaussian process model is employed as the metamodel. Mathematical and engineering examples are used to demonstrate the proposed method. In the results, compared with the full Bayesian approach, the predictive approach provides much less information, i.e., only a point estimate of the probability. Nevertheless, the predictive approach adequately accounts for the uncertainties with much less computation, which is more advantageous in design practice. The smaller the available data set, the higher the statistical uncertainty, leading to a higher failure probability (or lower reliability).
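A conjugate toy version of the predictive-versus-full-Bayesian comparison, with a closed-form normal posterior in place of MCMC and no metamodel: the posterior predictive approach folds parameter uncertainty into a single failure-probability estimate, whereas the full approach yields a distribution of failure probabilities. All numbers are hypothetical.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)

# Hypothetical: capacity R ~ N(mu, 5) with known sigma = 5 but unknown mu,
# estimated from a few tests; failure occurs if R falls below a fixed demand s0.
sigma, s0 = 5.0, 80.0
tests = np.array([92.0, 97.5, 88.9, 95.2, 90.3, 93.8])
n = len(tests)

# Posterior of mu with a flat prior: N(xbar, sigma^2 / n)
xbar, tau = tests.mean(), sigma / np.sqrt(n)

# (a) Posterior predictive approach: a single failure-probability estimate,
#     with the parameter uncertainty folded into the predictive variance.
pf_pred = norm.cdf(s0, loc=xbar, scale=np.sqrt(sigma**2 + tau**2))

# (b) Full Bayesian approach: a distribution of failure probabilities,
#     one per posterior draw of mu (drawn directly here; MCMC in general).
mu_draws = rng.normal(xbar, tau, 20_000)
pf_draws = norm.cdf(s0, loc=mu_draws, scale=sigma)

print(f"predictive Pf    = {pf_pred:.4e}")
print(f"full Bayesian Pf = mean {pf_draws.mean():.4e}, "
      f"95% interval [{np.quantile(pf_draws, 0.025):.1e}, {np.quantile(pf_draws, 0.975):.1e}]")
```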

18.
In this paper, we propose a new likelihood-based methodology to represent epistemic uncertainty described by sparse point and/or interval data for input variables in uncertainty analysis and design optimization problems. A worst-case maximum-likelihood-based approach is developed for the representation of epistemic uncertainty, which is able to estimate the distribution parameters of a random variable described by sparse point and/or interval data. This likelihood-based approach is general and is able to estimate the parameters of any known probability distribution. The likelihood-based representation of epistemic uncertainty is then used in the existing framework for robustness-based design optimization to achieve computational efficiency. The proposed uncertainty representation and design optimization methodologies are illustrated with two numerical examples including a mathematical problem and a real engineering problem.
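A basic sketch of a combined point-plus-interval likelihood (the paper's worst-case variant additionally optimises over possible realisations within the intervals): point data contribute density terms and interval data contribute probability-mass terms, and the distribution parameters are found by maximising the total likelihood. The data and the normal assumption are hypothetical.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

# Hypothetical sparse evidence about an input variable: a few point
# observations plus expert-given intervals.
points = np.array([10.2, 11.5, 9.8])
intervals = np.array([[9.0, 12.0], [10.0, 13.5]])

def neg_log_likelihood(theta):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)                 # keep sigma positive
    ll = norm.logpdf(points, mu, sigma).sum()                       # point data
    ll += np.log(norm.cdf(intervals[:, 1], mu, sigma)               # interval data
                 - norm.cdf(intervals[:, 0], mu, sigma) + 1e-300).sum()
    return -ll

res = minimize(neg_log_likelihood, x0=[points.mean(), 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"MLE under point + interval data: mu = {mu_hat:.2f}, sigma = {sigma_hat:.2f}")
```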

19.
Commonly, operational aspects of an industrial process are not included when evaluating the process's environmental performance. These aspects are important, as operational failures can intensify adverse environmental impacts or diminish the chance of making any improvement. This paper proposes to include these operational aspects by applying a method called Industrial Environmental Performance Evaluation. To obtain a reliable environmental performance measure for assisting policy-making in an organization, two types of uncertainty are considered in the proposed method. The first is epistemic uncertainty due to imperfect knowledge about the environmental impacts of the process; it is accounted for by using the potential probability of material release during operating and non-operating periods of the process. The second is aleatory uncertainty due to the potentially stochastic behaviour of the process; it is modelled through a Markov-based model and accounted for by the state probability distribution vectors. The proposed method is employed to analyze an existing formaldehyde production process as a case study. The analysis shows the relation between the environmental and operational performance of the process. Process owners can use this analysis to improve the environmental and operational aspects of their process and to achieve accuracy in their environmental decisions.

20.
Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behaviour of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary CDFs (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (e.g. interval analysis, possibility theory, evidence theory or probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterisations of epistemic uncertainty.
