Similar literature (20 results)
1.
Epistemic uncertainty analysis is an essential feature of any model application subject to ‘state of knowledge’ uncertainties. Such analysis is usually carried out on the basis of a Monte Carlo simulation sampling the epistemic variables and performing the corresponding model runs. In situations, however, where aleatory uncertainties are also present in the model, an adequate treatment of both types of uncertainties would require a two-stage nested Monte Carlo simulation, i.e. sampling the epistemic variables (‘outer loop’) and nested sampling of the aleatory variables (‘inner loop’). It is clear that for complex and long running codes the computational effort to perform all the resulting model runs may be prohibitive. Therefore, an approximate epistemic uncertainty analysis is suggested which is based solely on two simple Monte Carlo samples: (a) joint sampling of both epistemic and aleatory variables simultaneously, and (b) sampling of the aleatory variables alone with the epistemic variables held fixed at their reference values. The applications of this approach to dynamic reliability analyses presented in this paper look quite promising and suggest that performing such an approximate epistemic uncertainty analysis is preferable to the alternative of not performing any.
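For orientation, here is a minimal sketch of the two-stage nested Monte Carlo structure the abstract refers to (epistemic outer loop, aleatory inner loop). The model, distributions, and sample sizes are invented placeholders for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(lam, x):
    # Placeholder model: returns 1 if the sampled demand x exceeds a
    # capacity that depends on the epistemic parameter lam, else 0.
    return float(x > 10.0 / lam)

n_outer, n_inner = 100, 1000           # epistemic / aleatory sample sizes (illustrative)
failure_prob_per_epistemic = []

for _ in range(n_outer):               # outer loop: epistemic ('state of knowledge') variables
    lam = rng.uniform(0.8, 1.2)        # e.g. an imprecisely known rate parameter
    inner = [model(lam, rng.exponential(10.0)) for _ in range(n_inner)]  # inner loop: aleatory variability
    failure_prob_per_epistemic.append(np.mean(inner))

# Epistemic uncertainty about the (aleatory) failure probability:
print(np.percentile(failure_prob_per_epistemic, [5, 50, 95]))
```

Even this toy setup needs 100 × 1000 = 100,000 model evaluations, which is exactly why the paper proposes replacing the nested scheme with two simple Monte Carlo samples for long-running codes.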

2.
Uncertainty quantification (UQ) is the process of determining the effect of input uncertainties on response metrics of interest. These input uncertainties may be characterized as either aleatory uncertainties, which are irreducible variabilities inherent in nature, or epistemic uncertainties, which are reducible uncertainties resulting from a lack of knowledge. When aleatory and epistemic uncertainties are mixed, it is desirable to maintain a segregation between the two sources such that it is easy to separate and identify their contributions to the total uncertainty. Current production analyses for mixed UQ employ nested sampling, where each sample taken from the epistemic distributions in the outer loop results in an inner-loop sampling over the aleatory probability distributions. This paper demonstrates new algorithmic capabilities for mixed UQ in which the analysis procedures are more closely tailored to the requirements of aleatory and epistemic propagation. Through the combination of stochastic expansions for computing statistics and interval optimization for computing bounds, interval-valued probability, second-order probability, and Dempster-Shafer evidence theory approaches to mixed UQ are shown to be more accurate and efficient than previously achievable.

3.
Availability models of complex processes and phenomena inevitably contain simplifying assumptions and idealizations. These simplifications and idealizations generate uncertainties, which can be classified as aleatory (arising from randomness) and/or epistemic (due to lack of knowledge). Acknowledging and treating uncertainty is vital for the practical usability of reliability analysis results, and distinguishing between the two kinds of uncertainty helps in taking reliability/risk-informed decisions with confidence and in managing uncertainty effectively. In level-1 probabilistic safety assessment (PSA) of nuclear power plants (NPP), the current practice is to carry out epistemic uncertainty analysis with a simple Monte Carlo simulation that samples the epistemic variables in the model. The aleatory uncertainty, however, is neglected: only point estimates of the aleatory variables, viz. time to failure and time to repair, are considered. Treating both types of uncertainty requires a two-phase Monte Carlo simulation in which the outer loop samples the epistemic variables and the inner loop samples the aleatory variables. A methodology based on two-phase Monte Carlo simulation is presented for distinguishing the two kinds of uncertainty in the context of availability/reliability evaluation in level-1 PSA studies of NPPs.

4.
Assessing the failure probability of a thermal–hydraulic (T–H) passive system amounts to evaluating the uncertainties in its performance. Two different sources of uncertainty are usually considered: randomness due to inherent variability in the system behavior (aleatory uncertainty) and imprecision due to lack of knowledge and information on the system (epistemic uncertainty). In this paper, we are concerned with the epistemic uncertainties affecting the model of a T–H passive system and the numerical values of its parameters. Due to these uncertainties, the system may find itself in working conditions that do not allow it to accomplish its functions as required. The probability of these functional failures can be estimated by Monte Carlo (MC) sampling of the epistemic uncertainties affecting the model and its parameters, followed by computation of the system function response by a mechanistic T–H code. Efficient sampling methods are needed for achieving accurate estimates with reasonable computational effort. In this respect, the recently developed Line Sampling (LS) method is considered here for improving the MC sampling efficiency. The method, originally developed to solve high-dimensional structural reliability problems, employs lines instead of random points in order to probe the failure domain of interest. An “important direction” is determined, which points towards the failure domain of interest; the high-dimensional reliability problem is then reduced to a number of conditional one-dimensional problems which are solved along the “important direction”. This significantly reduces the variance of the failure probability estimator with respect to standard random sampling. The efficiency of the method is demonstrated by comparison with the commonly adopted Latin Hypercube Sampling (LHS) and the first-order reliability method (FORM) in a functional failure analysis of a passive decay heat removal system of a gas-cooled fast reactor (GFR) taken from the literature.
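A minimal illustration of the Line Sampling idea on a toy two-dimensional limit state in standard normal space is sketched below. The performance function, the assumed important direction, and the sample size are invented for illustration; the paper applies the method to a mechanistic T–H code, not to an analytic function.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

rng = np.random.default_rng(1)

def g(u):
    # Toy performance function in standard normal space (failure when g < 0).
    return 3.0 - u[0] - 0.05 * u[1] ** 2

alpha = np.array([1.0, 0.0])      # assumed unit "important direction" towards the failure domain
n_lines = 200
cond_pf = []

for _ in range(n_lines):
    z = rng.standard_normal(2)
    z_perp = z - np.dot(z, alpha) * alpha            # project out the component along alpha
    # Conditional one-dimensional problem along the line z_perp + c * alpha:
    c = brentq(lambda c: g(z_perp + c * alpha), -10.0, 10.0)
    cond_pf.append(norm.cdf(-c))                     # conditional failure probability of this line

pf_hat = np.mean(cond_pf)
se = np.std(cond_pf, ddof=1) / np.sqrt(n_lines)      # standard error of the estimator
print(pf_hat, se)
```

Because each line contributes a nearly constant conditional probability when the important direction is well chosen, the variance of this estimator is far smaller than that of standard random sampling with the same number of performance-function evaluations.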

5.
Earthquake loss estimation procedures exhibit aleatory and epistemic uncertainty embedded in their various components, i.e. seismic hazard, structural fragility, and inventory data. Since these uncertainties significantly affect decision-making, they have to be considered in loss estimation to inform decision- and policymakers and to ensure a balanced view of the various threats to which society may be subjected. This paper reviews the uncertainties that affect earthquake loss estimation and proposes a simple framework for probabilistic uncertainty assessment suitable for use after obtaining impact results from existing software, such as HAZUS-MH. To avoid the extensive calculations required for Monte Carlo simulation-based approaches, this study develops an approximate method for uncertainty propagation based on modifying the quantile arithmetic methodology, which yields acceptable uncertainty estimates with limited computational effort. A verification example shows that the results of the approximation approach are in good agreement with the equivalent Monte Carlo simulation outcome. Finally, the paper demonstrates the proposed procedure for probabilistic loss assessment through a comparison with HAZUS-MH results. It is confirmed that the proposed procedure consistently gives reasonable estimates.

6.
Optimization leads to specialized structures which are not robust to disturbance events like unanticipated abnormal loading or human errors. Typical reliability-based and robust optimization mainly address objective aleatory uncertainties. To date, the impact of subjective epistemic uncertainties on optimal design has not been comprehensively investigated. In this paper, we use an independent parameter to investigate the effects of epistemic uncertainties in optimal design: the latent failure probability. Reliability-based and risk-based truss topology optimization are addressed. It is shown that optimal risk-based designs can be divided into three groups: (A) when epistemic uncertainty is small (in comparison to aleatory uncertainty), the optimal design is indifferent to it and yields isostatic structures; (B) when aleatory and epistemic uncertainties are both relevant, optimal design is controlled by epistemic uncertainty and yields hyperstatic but nonredundant structures, for which expected costs of direct collapse are controlled; (C) when epistemic uncertainty becomes too large, the optimal design becomes redundant, as a way to control increasing expected costs of collapse. The three regions are divided by the hyperstatic and redundancy thresholds. The redundancy threshold is the point where the structure needs to become redundant so that its reliability becomes larger than the latent reliability of the simplest isostatic system. Simple truss topology optimization is considered herein, but the conclusions have immediate relevance to the optimal design of realistic structures subject to aleatory and epistemic uncertainties.

7.
Aleatory and epistemic uncertain variables often exist simultaneously in a multidisciplinary system and can be modeled by probability theory and evidence theory, respectively. The propagation of uncertainty through coupled subsystems and the strong nonlinearity of the multidisciplinary system make reliability analysis difficult and computationally expensive. In this paper, a novel reliability analysis procedure is proposed for multidisciplinary systems with epistemic and aleatory uncertain variables. First, the probability density function of the aleatory variables is assumed to follow a piecewise uniform distribution based on the Bayes method, and an approximate most probable point is found by an equivalent normalization method. Then, an importance sampling method is used to calculate the failure probability and its variance and variation coefficient. The effectiveness of the procedure is demonstrated by two numerical examples. Copyright © 2013 John Wiley & Sons, Ltd.
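The paper's piecewise-uniform Bayes construction and equivalent normalization step are not reproduced here; the sketch below only illustrates the generic importance-sampling step around an (assumed known) most probable point, for a toy linear limit state in standard normal space with invented numbers.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

rng = np.random.default_rng(2)
beta = 3.0                                          # reliability index of the toy problem

def g(u):
    # Toy limit state in standard normal space; failure when g < 0.
    return beta - (u[:, 0] + u[:, 1]) / np.sqrt(2.0)

u_star = beta / np.sqrt(2.0) * np.ones(2)           # (approximate) most probable point
n = 5000
u = rng.standard_normal((n, 2)) + u_star            # sampling density centered at the MPP

f = multivariate_normal(mean=np.zeros(2)).pdf(u)    # original density
h = multivariate_normal(mean=u_star).pdf(u)         # importance sampling density
w = (g(u) < 0) * f / h                              # weighted failure indicator

pf_hat = w.mean()
cov = w.std(ddof=1) / (np.sqrt(n) * pf_hat)         # coefficient of variation of the estimate
print(pf_hat, norm.cdf(-beta), cov)                 # estimate vs. exact value Phi(-beta)
```

Shifting the sampling density to the most probable point concentrates samples in the failure region, which is what keeps the variation coefficient small for rare failure events.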

8.
The paper describes an approach to representing, aggregating and propagating aleatory and epistemic uncertainty through computational models. The framework for the approach employs the theory of imprecise coherent probabilities. The approach is exemplified by a simple algebraic system, the inputs of which are uncertain. Six different uncertainty situations are considered, including mixtures of epistemic and aleatory uncertainty.

9.
The problem of accounting for epistemic uncertainty in risk management decisions is conceptually straightforward, but is riddled with practical difficulties. Simple approximations are often used whereby future variations in epistemic uncertainty are ignored or worst-case scenarios are postulated. These strategies tend to produce sub-optimal decisions. We develop a general framework based on Bayesian decision theory and exemplify it for the case of seismic design of buildings. When temporal fluctuations of the epistemic uncertainties and regulatory safety constraints are included, the optimal level of seismic protection exceeds the normative level at the time of construction. Optimal Bayesian decisions do not depend on the aleatory or epistemic nature of the uncertainties, but only on the total (epistemic plus aleatory) uncertainty and how that total uncertainty varies randomly during the lifetime of the project.

10.
The goal of robust optimization methods is to obtain a solution that is both optimal and relatively insensitive to uncertainty factors. Most existing robust optimization approaches use outer–inner nested optimization structures that require a large amount of computational effort, because the robustness of each candidate solution delivered from the outer level has to be evaluated in the inner level. In this article, a kriging metamodel-assisted robust optimization method based on a reverse model (K-RMRO) is first proposed, in which the nested optimization structure is reduced to a single-loop optimization structure to ease the computational burden. Because it ignores the interpolation uncertainty of the kriging metamodel, K-RMRO may yield non-robust optima. Hence, an improved kriging-assisted robust optimization method based on a reverse model (IK-RMRO) is presented to take the interpolation uncertainty of the kriging metamodel into consideration. In IK-RMRO, an objective switching criterion is introduced to determine whether the inner-level robust optimization or the kriging metamodel replacement should be used to evaluate the robustness of design alternatives. The proposed criterion is based on whether or not the robust status of an individual can change because of the interpolation uncertainty of the kriging metamodel. Numerical and engineering cases are used to demonstrate the applicability and efficiency of the proposed approach.

11.
In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. This presentation discusses and illustrates the conceptual and computational basis of QMU in analyses that use computational models to predict the behavior of complex systems. The following topics are considered: (i) the role of aleatory and epistemic uncertainty in QMU, (ii) the representation of uncertainty with probability, (iii) the probabilistic representation of uncertainty in QMU analyses involving only epistemic uncertainty, and (iv) the probabilistic representation of uncertainty in QMU analyses involving aleatory and epistemic uncertainty.

12.
By means of several examples from a recent comprehensive space nuclear risk analysis of the Cassini mission, a scenario and consequence representational framework is presented for risk analysis of space nuclear power systems in the context of epistemic and aleatory uncertainties. The framework invites the use of probabilistic models for the calculation of both event probabilities and scenario consequences. Each scenario is associated with a frequency that may include both aleatory and epistemic uncertainties. The outcome of each scenario is described in terms of an end state vector and is also characterized by a source term. In this paper, the source term factors of interest are the number of failed clads in the space nuclear power system, the amount of fuel released and the amount of fuel that is potentially respirable; these are also subject to uncertainties. The 1990 work of Apostolakis is found to be a useful formalism from which to derive the relevant probabilistic models. However, an extension to the formalism was necessary to accommodate the situation in which aleatory uncertainty is represented by changes in the form of the probability function itself, not just its parameters. Event trees that show reasonable alternative accident scenarios are presented. A grouping of probabilities and consequences is proposed as a useful structure for thinking about uncertainties, and an example of each category is provided. Concluding observations are made about the judgments involved in this analysis of uncertainties and the effect of distinguishing between aleatory and epistemic uncertainties.

13.
This paper develops a novel computational framework to compute the Sobol indices that quantify the relative contributions of various uncertainty sources towards the system response prediction uncertainty. In the presence of both aleatory and epistemic uncertainty, two challenges are addressed in this paper for the model-based computation of the Sobol indices: due to data uncertainty, input distributions are not precisely known; and due to model uncertainty, the model output is uncertain even for a fixed realization of the input. An auxiliary variable method based on the probability integral transform is introduced to distinguish and represent each uncertainty source explicitly, whether aleatory or epistemic. The auxiliary variables facilitate building a deterministic relationship between the uncertainty sources and the output, which is needed in the Sobol indices computation. The proposed framework is developed for two types of model inputs: random variable input and time series input. A Bayesian autoregressive moving average (ARMA) approach is chosen to model the time series input due to its capability to represent both natural variability and epistemic uncertainty due to limited data. A novel controlled-seed computational technique based on pseudo-random number generation is proposed to efficiently represent the natural variability in the time series input. This controlled-seed method significantly accelerates the Sobol indices computation under time series input, and makes it computationally affordable.
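For readers unfamiliar with Sobol indices, a minimal pick-freeze computation on a cheap toy function with purely aleatory uniform inputs is sketched below; the paper's auxiliary-variable construction, Bayesian ARMA input modeling, and controlled-seed technique are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

def f(x):
    # Cheap toy model; the paper targets expensive models whose inputs mix
    # aleatory and epistemic sources handled through auxiliary variables.
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + x[:, 0] * x[:, 2]

d, n = 3, 20000
A = rng.random((n, d))                        # two independent input sample matrices
B = rng.random((n, d))
fA, fB = f(A), f(B)
var_y = np.var(np.concatenate([fA, fB]), ddof=1)

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                       # 'pick-freeze': column i taken from B
    fABi = f(ABi)
    S_i = np.mean(fB * (fABi - fA)) / var_y            # first-order index (Saltelli 2010 estimator)
    ST_i = 0.5 * np.mean((fA - fABi) ** 2) / var_y     # total-effect index (Jansen estimator)
    print(f"input {i}: S = {S_i:.3f}, ST = {ST_i:.3f}")
```

The first-order index measures the variance explained by an input alone, while the total-effect index adds its interactions; comparing the two reveals interaction effects such as the x0·x2 term in the toy function.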

14.
The traditional reliability analysis method based on the probabilistic approach requires probability distributions of all the uncertain parameters. In practical applications, however, the distributions of some parameters may not be precisely known due to the lack of sufficient sample data. Probabilistic theory cannot directly measure the reliability of structures with epistemic uncertainty, i.e., subjective randomness and fuzziness. Hence, a hybrid reliability analysis (HRA) problem arises when aleatory and epistemic uncertainties coexist in a structure. In this paper, by combining probability theory and uncertainty theory into chance theory, a probability-uncertainty hybrid model is established, and a new quantification method based on uncertain random variables is presented for structural reliability in order to simultaneously satisfy the duality of random variables and the subadditivity of uncertain variables; a reliability index is then defined based on the chance expected value and variance. In addition, formulas for the chance theory-based reliability and reliability index are derived to uniformly assess the reliability of structures under hybrid aleatory and epistemic uncertainties. Numerical experiments illustrate the validity of the proposed method, whose results provide a more accurate assessment of a structural system under mixed uncertainties than the assessments obtained separately from probability theory and uncertainty theory.

15.
The following techniques for uncertainty and sensitivity analysis are briefly summarized: Monte Carlo analysis, differential analysis, response surface methodology, Fourier amplitude sensitivity test, Sobol' variance decomposition, and fast probability integration. Desirable features of Monte Carlo analysis in conjunction with Latin hypercube sampling are described in discussions of the following topics: (i) properties of random, stratified and Latin hypercube sampling, (ii) comparisons of random and Latin hypercube sampling, (iii) operations involving Latin hypercube sampling (i.e. correlation control, reweighting of samples to incorporate changed distributions, replicated sampling to test reproducibility of results), (iv) uncertainty analysis (i.e. cumulative distribution functions, complementary cumulative distribution functions, box plots), (v) sensitivity analysis (i.e. scatterplots, regression analysis, correlation analysis, rank transformations, searches for nonrandom patterns), and (vi) analyses involving stochastic (i.e. aleatory) and subjective (i.e. epistemic) uncertainty.
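A minimal Latin hypercube sampler with inverse-CDF mapping is sketched below, assuming numpy/scipy and illustrative input distributions; it shows only the basic stratified construction the review refers to, without the correlation control, reweighting, or replication features it also discusses.

```python
import numpy as np
from scipy.stats import norm, lognorm

def latin_hypercube(n, d, rng):
    """Basic Latin hypercube sample of size n in d dimensions on [0, 1)^d:
    each variable's range is split into n equal-probability strata, one point is
    drawn per stratum, and strata are paired across variables by random permutations."""
    cols = [(rng.permutation(n) + rng.random(n)) / n for _ in range(d)]
    return np.column_stack(cols)

rng = np.random.default_rng(4)
u = latin_hypercube(1000, 2, rng)

# Map the uniform design to the input distributions through inverse CDFs
# (a normal and a lognormal input, chosen only for illustration):
x = np.column_stack([norm(10.0, 2.0).ppf(u[:, 0]),
                     lognorm(s=0.5, scale=1.0).ppf(u[:, 1])])
print(x.mean(axis=0), x.std(axis=0))
```

Because every stratum of every input is sampled exactly once, the design covers the input space more evenly than simple random sampling of the same size, which is the property that makes LHS attractive for long-running models.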

16.
Reliability analysis with both aleatory and epistemic uncertainties is investigated in this paper. The aleatory uncertainties are described with random variables, and the epistemic uncertainties are tackled with evidence theory. Several methods have been proposed to estimate the bounds of the failure probability, but the existing methods suffer from the dimensionality challenge posed by the epistemic variables. To overcome this challenge, a random-set based Monte Carlo simulation (RS-MCS) method derived from the theory of random sets is offered. Nevertheless, RS-MCS is still computationally expensive, so an active learning Kriging (ALK) model that only needs to predict the sign of the performance function correctly is introduced and closely integrated with RS-MCS. The proposed method, termed ALK-RS-MCS, accurately predicts the bounds of the failure probability using as few function calls as possible. Moreover, in ALK-RS-MCS, an optimization method based on Karush–Kuhn–Tucker conditions is proposed to make the estimation of the failure probability interval with the Kriging model more efficient. The efficiency and accuracy of the proposed approach are demonstrated with four examples. Copyright © 2016 John Wiley & Sons, Ltd.

17.
Error and uncertainty in modeling and simulation
This article develops a general framework for identifying error and uncertainty in computational simulations that deal with the numerical solution of a set of partial differential equations (PDEs). A comprehensive, new view of the general phases of modeling and simulation is proposed, consisting of the following phases: conceptual modeling of the physical system, mathematical modeling of the conceptual model, discretization and algorithm selection for the mathematical model, computer programming of the discrete model, numerical solution of the computer program model, and representation of the numerical solution. Our view incorporates the modeling and simulation phases that are recognized in the systems engineering and operations research communities, but it adds phases that are specific to the numerical solution of PDEs. In each of these phases, general sources of uncertainty, both aleatory and epistemic, and error are identified. Our general framework is applicable to any numerical discretization procedure for solving ODEs or PDEs. To demonstrate this framework, we describe a system-level example: the flight of an unguided, rocket-boosted, aircraft-launched missile. This example is discussed in detail at each of the six phases of modeling and simulation. Two alternative models of the flight dynamics are considered, along with aleatory uncertainty of the initial mass of the missile and epistemic uncertainty in the thrust of the rocket motor. We also investigate the interaction of modeling uncertainties and numerical integration error in the solution of the ordinary differential equations for the flight dynamics.

18.
In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, “Quantification of Margins and Uncertainties: Conceptual and Computational Basis,” describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.
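To make the evidence-theory alternative concrete, the following sketch computes belief and plausibility bounds for a single epistemic variable described by interval focal elements with basic probability assignments; the intervals, masses and query are invented and are not the notional examples used in the presentation.

```python
# Evidence-theory bookkeeping for one epistemic variable: focal elements
# (intervals) with basic probability assignments summing to 1 (invented numbers).
focal_elements = [((0.0, 2.0), 0.5),
                  ((1.0, 3.0), 0.3),
                  ((2.5, 4.0), 0.2)]

def belief(a, b):
    # Total mass of focal elements entirely contained in [a, b].
    return sum(m for (lo, hi), m in focal_elements if a <= lo and hi <= b)

def plausibility(a, b):
    # Total mass of focal elements that intersect [a, b].
    return sum(m for (lo, hi), m in focal_elements if hi >= a and lo <= b)

# Bounds on the statement "the variable lies in [0, 2.4]":
print(belief(0.0, 2.4), plausibility(0.0, 2.4))   # -> 0.5  0.8
```

The [belief, plausibility] interval brackets any probability consistent with the stated evidence, which is how such structures represent epistemic uncertainty without committing to a single distribution.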

19.
The ‘Epistemic Uncertainty Workshop’ sponsored by Sandia National Laboratories was held in Albuquerque, New Mexico, on 6–7 August 2002. The workshop was organized around a set of Challenge Problems involving both epistemic and aleatory uncertainty that the workshop participants were invited to solve and discuss. This concluding article in a special issue of Reliability Engineering and System Safety based on the workshop discusses the intent of the Challenge Problems, summarizes some discussions from the workshop, and provides a technical comparison among the papers in this special issue. The Challenge Problems were computationally simple models that were intended as vehicles for the illustration and comparison of conceptual and numerical techniques for use in analyses that involve: (i) epistemic uncertainty, (ii) aggregation of multiple characterizations of epistemic uncertainty, (iii) combination of epistemic and aleatory uncertainty, and (iv) models with repeated parameters. There was considerable diversity of opinion at the workshop about both methods and fundamental issues, and yet substantial consensus about what the answers to the problems were, and even about how each of the four issues should be addressed. Among the technical approaches advanced were probability theory, Dempster–Shafer evidence theory, random sets, sets of probability measures, imprecise coherent probabilities, coherent lower previsions, probability boxes, possibility theory, fuzzy sets, joint distribution tableaux, polynomial chaos expansions, and info-gap models. Although some participants maintained that a purely probabilistic approach is fully capable of accounting for all forms of uncertainty, most agreed that the treatment of epistemic uncertainty introduces important considerations and that the issues underlying the Challenge Problems are legitimate and significant. Topics identified as meriting additional research include elicitation of uncertainty representations, aggregation of multiple uncertainty representations, dependence and independence, model uncertainty, solution of black-box problems, efficient sampling strategies for computation, and communication of analysis results.

20.
A Bayesian seismic risk assessment method is proposed that jointly accounts for the uncertainties in the seismic hazard model, the input ground motion records, the structural parameters and the demand model, and is discussed in detail using earthquake data for the Dali region of Yunnan from 1970 to 2017. Building on conventional probabilistic seismic hazard analysis, a Bayesian seismic hazard analysis method is developed in which Bayesian updating determines the posterior distributions of the unknown parameters of the earthquake probability model. A probabilistic seismic demand model is also established within the Bayesian framework, and the epistemic uncertainty of the demand model is accounted for in the fragility analysis. A 42-story steel frame-reinforced concrete core tube building is taken as a case study for risk assessment under earthquake action. The results show that the Bayesian hazard analysis yields a more reasonable hazard model; that neglecting the parameter uncertainty of the demand model leads to erroneous estimates of structural seismic fragility; and that different loading cases have a significant influence on the seismic risk of tall buildings. The proposed probabilistic risk assessment method provides an effective way to account for both aleatory and epistemic uncertainties and supports the development of seismic resilience evaluation and design theory for high-performance structures.
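As one concrete instance of the Bayesian updating step described above, the sketch below updates a Poisson earthquake-occurrence rate with a conjugate Gamma prior; the prior parameters and event counts are invented placeholders, not the Dali (Yunnan) 1970–2017 catalogue or the hazard model actually used in the paper.

```python
from scipy.stats import gamma

# Prior belief about the annual rate of damaging earthquakes (illustrative numbers):
# Gamma(shape=2, rate=10), prior mean 0.2 events/year.
a0, b0 = 2.0, 10.0

# Hypothetical observation window: 12 events in 48 years.
k, T = 12, 48.0
a_post, b_post = a0 + k, b0 + T        # conjugate Gamma-Poisson update

# Posterior mean rate and 90% credible interval (epistemic uncertainty in the rate):
post = gamma(a=a_post, scale=1.0 / b_post)
print(post.mean(), post.ppf([0.05, 0.95]))

# Predictive probability of at least one event in the next t years, integrating the
# aleatory Poisson occurrence over the epistemic posterior on the rate:
t = 50.0
p_exceed = 1.0 - (b_post / (b_post + t)) ** a_post
print(p_exceed)
```

The closed-form predictive probability follows from the Gamma moment generating function; with a non-conjugate hazard model, the same integration would be done by sampling the posterior, as in the paper's framework.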
