Similar Articles
20 similar articles found.
1.
Availability models of complex processes and phenomena involve simplifying assumptions and idealizations. These simplifications and idealizations generate uncertainties, which can be classified as aleatory (arising from randomness) and/or epistemic (arising from lack of knowledge). Acknowledging and treating uncertainty is vital for the practical usability of reliability analysis results. Distinguishing the two kinds of uncertainty is useful both for making reliability/risk-informed decisions with confidence and for effective management of uncertainty. In level-1 probabilistic safety assessment (PSA) of nuclear power plants (NPP), the current practice is to carry out epistemic uncertainty analysis with a simple Monte Carlo simulation that samples the epistemic variables in the model. The aleatory uncertainty, however, is neglected: point estimates of the aleatory variables, viz. time to failure and time to repair, are used. Treating both types of uncertainty requires a two-phase Monte Carlo simulation in which an outer loop samples the epistemic variables and an inner loop samples the aleatory variables. A methodology based on two-phase Monte Carlo simulation is presented for distinguishing both kinds of uncertainty in the context of availability/reliability evaluation in level-1 PSA studies of NPP.
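A minimal sketch of the two-phase (nested) Monte Carlo scheme just described: the outer loop draws epistemic parameter values, the inner loop draws aleatory times to failure and repair, and each outer sample yields one availability estimate. The lognormal rate distributions, sample sizes, and single-component availability model are illustrative assumptions only, not the paper's actual PSA model.

```python
import numpy as np

rng = np.random.default_rng(0)
N_EPISTEMIC = 200   # outer-loop (epistemic) samples
N_ALEATORY = 1000   # inner-loop (aleatory) samples per outer sample

availability = np.empty(N_EPISTEMIC)
for i in range(N_EPISTEMIC):
    # Outer loop: sample the epistemic parameters (imprecisely known
    # failure and repair rates); distributions are assumed for illustration.
    lam = rng.lognormal(mean=np.log(1e-4), sigma=0.5)  # failure rate [1/h]
    mu = rng.lognormal(mean=np.log(1e-1), sigma=0.5)   # repair rate [1/h]

    # Inner loop: sample the aleatory variables (time to failure and
    # time to repair) for the fixed epistemic values (lam, mu).
    ttf = rng.exponential(1.0 / lam, N_ALEATORY)
    ttr = rng.exponential(1.0 / mu, N_ALEATORY)
    availability[i] = np.mean(ttf / (ttf + ttr))

print(f"mean availability: {availability.mean():.5f}")
print(f"90% epistemic band: [{np.percentile(availability, 5):.5f}, "
      f"{np.percentile(availability, 95):.5f}]")
```

Each entry of `availability` is an aleatory expectation conditional on one epistemic sample, so the spread across entries reflects epistemic uncertainty alone.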

2.
This paper focuses on sensitivity analysis of results from computer models in which both epistemic and aleatory uncertainties are present. Sensitivity is defined in the sense of “uncertainty importance” in order to identify and rank the principal sources of epistemic uncertainty. A natural and consistent way to arrive at sensitivity results in such cases would be a two-dimensional or double-loop nested Monte Carlo sampling strategy in which the epistemic parameters are sampled in the outer loop and the aleatory variables are sampled in the nested inner loop. However, the computational effort of this procedure may be prohibitive for complex and time-consuming codes. This paper therefore suggests an approximate method for sensitivity analysis based on particular one-dimensional or single-loop sampling procedures, which require substantially less computational effort. From the results of such sampling one can obtain approximate estimates of several standard uncertainty importance measures for the aleatory probability distributions and related probabilistic quantities of the model outcomes of interest. The reliability of the approximate sensitivity results depends on the effect of all epistemic uncertainties on the total joint epistemic and aleatory uncertainty of the outcome. The magnitude of this effect can be expressed quantitatively and estimated from the same single-loop samples. The higher it is, the more accurate the approximate sensitivity results will be. A case study, which shows that the results from the proposed approximate method are comparable to those obtained with the full two-dimensional approach, is provided.
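As a concrete illustration of "uncertainty importance" in the double-loop setting, the sketch below ranks epistemic parameters by the rank correlation between each parameter and the aleatory mean of the outcome. The toy model, the uniform epistemic ranges, and the Spearman-correlation measure are assumptions for illustration, not the paper's specific single-loop estimators.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n_outer, n_inner = 500, 200

# Epistemic parameters theta1, theta2 (illustrative uniform ranges).
theta = rng.uniform([0.5, 1.0], [1.5, 3.0], size=(n_outer, 2))

cond_mean = np.empty(n_outer)
for i, (t1, t2) in enumerate(theta):
    # Inner loop: aleatory variable X sampled for fixed theta.
    x = rng.normal(loc=t1, scale=0.2, size=n_inner)
    y = t2 * x**2            # toy model outcome
    cond_mean[i] = y.mean()  # aleatory expectation given theta

# "Uncertainty importance": how strongly each epistemic parameter
# drives the epistemic variation of the aleatory mean outcome.
for j in range(theta.shape[1]):
    rho, _ = spearmanr(theta[:, j], cond_mean)
    print(f"theta{j+1}: Spearman rank correlation = {rho:+.3f}")
```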

3.
Uncertainty quantification (UQ) is the process of determining the effect of input uncertainties on response metrics of interest. These input uncertainties may be characterized either as aleatory uncertainties, which are irreducible variabilities inherent in nature, or as epistemic uncertainties, which are reducible uncertainties resulting from a lack of knowledge. When aleatory and epistemic uncertainties are mixed, it is desirable to maintain a segregation between the two sources so that their contributions to the total uncertainty can be separated and identified. Current production analyses for mixed UQ employ nested sampling, where each sample taken from the epistemic distributions in the outer loop results in an inner-loop sampling over the aleatory probability distributions. This paper demonstrates new algorithmic capabilities for mixed UQ in which the analysis procedures are more closely tailored to the requirements of aleatory and epistemic propagation. Through the combination of stochastic expansions for computing statistics and interval optimization for computing bounds, interval-valued probability, second-order probability, and Dempster-Shafer evidence theory approaches to mixed UQ are shown to be more accurate and efficient than previously achievable.
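A tiny example of the interval-valued probability idea: hold the aleatory description fixed and optimize over an epistemic interval to bound a failure probability. The normal load model, the interval for its epistemic mean, and the capacity threshold below are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

# Aleatory: load L ~ Normal(mu_L, 1.0); epistemic: mu_L is known only
# to lie in an interval (illustrative numbers).
MU_LO, MU_HI = 9.0, 11.0
CAPACITY = 13.0

def pof(mu_l):
    # P(L > CAPACITY) for a fixed epistemic value of mu_L.
    return norm.sf(CAPACITY, loc=mu_l, scale=1.0)

# Interval optimization over the epistemic box yields an interval-valued
# probability [p_lo, p_hi] instead of a single number.
p_lo = minimize_scalar(pof, bounds=(MU_LO, MU_HI), method="bounded").fun
p_hi = -minimize_scalar(lambda m: -pof(m), bounds=(MU_LO, MU_HI),
                        method="bounded").fun
print(f"failure probability interval: [{p_lo:.2e}, {p_hi:.2e}]")
```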

4.
The paper describes an approach to representing, aggregating and propagating aleatory and epistemic uncertainty through computational models. The framework for the approach employs the theory of imprecise coherent probabilities. The approach is exemplified by a simple algebraic system, the inputs of which are uncertain. Six different uncertainty situations are considered, including mixtures of epistemic and aleatory uncertainty.

5.
The problem of accounting for epistemic uncertainty in risk management decisions is conceptually straightforward, but is riddled with practical difficulties. Simple approximations are often used whereby future variations in epistemic uncertainty are ignored or worst-case scenarios are postulated. These strategies tend to produce sub-optimal decisions. We develop a general framework based on Bayesian decision theory and exemplify it for the case of seismic design of buildings. When temporal fluctuations of the epistemic uncertainties and regulatory safety constraints are included, the optimal level of seismic protection exceeds the normative level at the time of construction. Optimal Bayesian decisions do not depend on the aleatory or epistemic nature of the uncertainties, but only on the total (epistemic plus aleatory) uncertainty and how that total uncertainty varies randomly during the lifetime of the project.

6.
Earthquake loss estimation procedures exhibit aleatory and epistemic uncertainty embedded in their various components, i.e. seismic hazard, structural fragility, and inventory data. Since these uncertainties significantly affect decision-making, they have to be considered in loss estimation to inform decision- and policymakers and to ensure a balanced view of the various threats to which society may be subjected. This paper reviews the uncertainties that affect earthquake loss estimation and proposes a simple framework for probabilistic uncertainty assessment suitable for use after obtaining impact results from existing software, such as HAZUS-MH. To avoid the extensive calculations required by Monte Carlo simulation-based approaches, this study develops an approximate method for uncertainty propagation based on a modified quantile arithmetic methodology, which yields acceptable uncertainty estimates with limited computational effort. A verification example shows that the results of the approximation approach are in good agreement with the equivalent Monte Carlo simulation outcome. Finally, the paper demonstrates the proposed procedure for probabilistic loss assessment through a comparison with HAZUS-MH results, confirming that it consistently gives reasonable estimates.

7.
Epistemic uncertainty analysis is an essential feature of any model application subject to ‘state of knowledge’ uncertainties. Such analysis is usually carried out with a Monte Carlo simulation that samples the epistemic variables and performs the corresponding model runs. In situations where aleatory uncertainties are also present in the model, however, an adequate treatment of both types of uncertainty requires a two-stage nested Monte Carlo simulation, i.e. sampling of the epistemic variables (‘outer loop’) and nested sampling of the aleatory variables (‘inner loop’). For complex and long-running codes, the computational effort to perform all the resulting model runs may clearly be prohibitive. Therefore, an approximate epistemic uncertainty analysis is suggested that is based solely on two simple Monte Carlo samples: (a) joint sampling of epistemic and aleatory variables simultaneously, and (b) sampling of the aleatory variables alone with the epistemic variables held fixed at their reference values. The applications of this approach to dynamic reliability analyses presented in this paper look quite promising and suggest that performing such an approximate epistemic uncertainty analysis is preferable to performing none at all.
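A compact illustration of the two single-loop samples (a) and (b) described above, using a toy model: comparing the output variance of the joint sample with that of the aleatory-only sample gives a rough indication of the epistemic contribution. The variance-ratio indicator, the model, and the distributions are illustrative proxies, not the paper's exact estimators.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 5000

def model(theta, x):
    # Toy model: outcome depends on epistemic theta and aleatory x.
    return theta * x + 0.1 * x**2

THETA_REF = 1.0  # reference (best-estimate) epistemic value, assumed

# Sample (a): epistemic and aleatory variables sampled jointly.
theta_a = rng.uniform(0.8, 1.2, N)   # epistemic, illustrative range
x_a = rng.exponential(2.0, N)        # aleatory
y_joint = model(theta_a, x_a)

# Sample (b): aleatory variables alone, epistemic fixed at reference.
x_b = rng.exponential(2.0, N)
y_aleat = model(THETA_REF, x_b)

var_total = y_joint.var(ddof=1)
var_aleat = y_aleat.var(ddof=1)
# Crude indicator of the epistemic share of total output variance.
eps_share = max(0.0, 1.0 - var_aleat / var_total)
print(f"total var {var_total:.3f}, aleatory-only var {var_aleat:.3f}")
print(f"approximate epistemic contribution: {eps_share:.1%}")
```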

8.
余波  陈冰  唐睿楷 《工程力学》2018,35(5):170-179
Traditional shear capacity models for reinforced concrete (RC) beams are deterministic and cannot effectively account for the objective (physical) uncertainty in geometric dimensions, material properties, and boundary conditions, or for the subjective (model) uncertainty introduced during model derivation; as a result, their predictions scatter widely and their accuracy and applicability are limited. To address this, this paper first combines the modified compression field theory with a critical diagonal crack inclination model that accounts for the shear span-to-depth ratio to establish a deterministic shear capacity model for RC beams. It then accounts for both subjective and objective uncertainties and combines Bayesian theory with the Markov chain Monte Carlo (MCMC) method to build a probabilistic model of RC beam shear capacity. Finally, comparisons with experimental data and with traditional deterministic models verify the validity and applicability of the proposed model. The analysis shows that the probabilistic model not only reasonably describes the probability distribution of RC beam shear capacity, but can also calibrate the accuracy and confidence level of traditional deterministic models and determine probabilistic characteristic values of shear capacity at prescribed confidence levels, with good accuracy and applicability.
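A minimal sketch of the Bayesian/MCMC calibration step that this kind of probabilistic capacity model relies on, assuming a multiplicative bias factor with lognormal scatter between measured and deterministically predicted shear capacities; the data values, flat priors, and Metropolis tuning below are hypothetical, not the paper's dataset or sampler.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical data: deterministic predictions vs. measured capacities [kN].
v_det = np.array([210., 180., 305., 250., 140., 390.])
v_exp = np.array([233., 172., 350., 276., 150., 402.])

# Probabilistic model: ln(V_exp) = ln(theta) + ln(V_det) + eps,
# eps ~ Normal(0, sigma); calibrate (theta, sigma) by Metropolis MCMC.
resid = np.log(v_exp) - np.log(v_det)

def log_post(theta, sigma):
    if theta <= 0 or sigma <= 0:
        return -np.inf
    # Flat priors (illustrative); Gaussian log-likelihood of residuals.
    z = resid - np.log(theta)
    return -len(z) * np.log(sigma) - 0.5 * np.sum(z**2) / sigma**2

chain, state = [], (1.0, 0.2)
lp = log_post(*state)
for _ in range(20000):
    prop = (state[0] + 0.02 * rng.normal(), state[1] + 0.02 * rng.normal())
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
        state, lp = prop, lp_prop
    chain.append(state)

theta_s, sigma_s = np.array(chain[5000:]).T  # discard burn-in
print(f"bias factor theta: posterior mean {theta_s.mean():.3f}")
print(f"model uncertainty sigma: posterior mean {sigma_s.mean():.3f}")
```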

9.
Optimization leads to specialized structures which are not robust to disturbance events like unanticipated abnormal loading or human errors. Typical reliability-based and robust optimization mainly address objective aleatory uncertainties. To date, the impact of subjective epistemic uncertainties in optimal design has not been comprehensively investigated. In this paper, we use an independent parameter to investigate the effects of epistemic uncertainties in optimal design: the latent failure probability. Reliability-based and risk-based truss topology optimization are addressed. It is shown that optimal risk-based designs can be divided in three groups: (A) when epistemic uncertainty is small (in comparison to aleatory uncertainty), the optimal design is indifferent to it and yields isostatic structures; (B) when aleatory and epistemic uncertainties are relevant, optimal design is controlled by epistemic uncertainty and yields hyperstatic but nonredundant structures, for which expected costs of direct collapse are controlled; (C) when epistemic uncertainty becomes too large, the optimal design becomes redundant, as a way to control increasing expected costs of collapse. The three regions above are divided by hyperstatic and redundancy thresholds. The redundancy threshold is the point where the structure needs to become redundant so that its reliability becomes larger than the latent reliability of the simplest isostatic system. Simple truss topology optimization is considered herein, but the conclusions have immediate relevance to the optimal design of realistic structures subject to aleatory and epistemic uncertainties.

10.
The traditional probabilistic approach to reliability analysis requires the probability distributions of all the uncertain parameters. In practical applications, however, the distributions of some parameters may not be precisely known due to a lack of sufficient sample data. Probability theory cannot directly measure the reliability of structures with epistemic uncertainty, i.e., subjective randomness and fuzziness. Hence, a hybrid reliability analysis (HRA) problem arises when aleatory and epistemic uncertainties coexist in a structure. In this paper, by combining probability theory and uncertainty theory into a chance theory, a probability-uncertainty hybrid model is established, and a new quantification method for structural reliability based on uncertain random variables is presented that simultaneously satisfies the duality of random variables and the subadditivity of uncertain variables; a reliability index is then explored based on the chance expected value and variance. The formulas for the chance theory-based reliability and reliability index are derived to uniformly assess the reliability of structures under hybrid aleatory and epistemic uncertainties. Numerical experiments illustrate the validity of the proposed method, whose results provide a more accurate assessment of the structural system under mixed uncertainties than those obtained separately from probability theory or uncertainty theory.

11.
In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. This presentation discusses and illustrates the conceptual and computational basis of QMU in analyses that use computational models to predict the behavior of complex systems. The following topics are considered: (i) the role of aleatory and epistemic uncertainty in QMU, (ii) the representation of uncertainty with probability, (iii) the probabilistic representation of uncertainty in QMU analyses involving only epistemic uncertainty, and (iv) the probabilistic representation of uncertainty in QMU analyses involving aleatory and epistemic uncertainty.
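QMU results are often summarized by a margin-to-uncertainty confidence factor M/U. The sketch below computes such a ratio under mixed uncertainty; the epistemic bias range, the aleatory scatter, and the percentile-based definitions of M and U are all illustrative assumptions rather than the presentation's prescriptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative QMU-style computation for one performance gate:
# the response R must stay below a threshold T.
T = 100.0

cf = []
for _ in range(200):                        # epistemic outer loop
    bias = rng.uniform(-3.0, 3.0)           # epistemic model bias (assumed)
    r = rng.normal(80.0 + bias, 5.0, 2000)  # aleatory inner loop
    margin = T - np.percentile(r, 95)       # margin M at the 95th percentile
    unc = np.percentile(r, 95) - np.percentile(r, 50)  # uncertainty width U
    cf.append(margin / unc)

cf = np.array(cf)
print(f"confidence factor M/U: median {np.median(cf):.2f}, "
      f"5th percentile {np.percentile(cf, 5):.2f}")
```

A confidence factor well above 1 across the epistemic loop is the usual informal reading of "adequate margin"; the epistemic spread of M/U shows how state-of-knowledge uncertainty erodes that confidence.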

12.
提出了基于贝叶斯理论的地震风险评估方法,综合考虑了地震危险性模型、输入地震动记录、结构参数和需求模型的不确定性,并以云南大理地区1970年-2017年间的地震数据为研究基础进行了详细讨论。在传统基于概率地震危险性分析方法的基础上,提出了基于贝叶斯理论的地震危险性分析方法,通过贝叶斯更新准则,确定了地震概率模型中未知参数的后验概率分布;通过贝叶斯理论建立了基于概率的地震需求模型,并在易损性中考虑了需求模型认知不确定性的影响;以42层钢框架-RC核心筒建筑为例,开展了地震作用下的风险评估。研究表明:基于贝叶斯理论的地震危险性分析方法,能够获得更为合理的危险性模型;忽略需求模型中参数不确定性的影响,将错误估计结构的地震易损性;不同加载工况将对高层建筑的地震风险产生显著影响。提出的概率风险评估方法,提供了可以考虑固有不确定性和认知不确定性的有效途径,有助于推动高性能结构地震韧性评价和设计理论的发展。  相似文献   

13.
Error and uncertainty in modeling and simulation
This article develops a general framework for identifying error and uncertainty in computational simulations that deal with the numerical solution of a set of partial differential equations (PDEs). A comprehensive, new view of the general phases of modeling and simulation is proposed, consisting of the following phases: conceptual modeling of the physical system, mathematical modeling of the conceptual model, discretization and algorithm selection for the mathematical model, computer programming of the discrete model, numerical solution of the computer program model, and representation of the numerical solution. Our view incorporates the modeling and simulation phases that are recognized in the systems engineering and operations research communities, but it adds phases that are specific to the numerical solution of PDEs. In each of these phases, general sources of uncertainty, both aleatory and epistemic, and error are identified. Our general framework is applicable to any numerical discretization procedure for solving ODEs or PDEs. To demonstrate this framework, we describe a system-level example: the flight of an unguided, rocket-boosted, aircraft-launched missile. This example is discussed in detail at each of the six phases of modeling and simulation. Two alternative models of the flight dynamics are considered, along with aleatory uncertainty of the initial mass of the missile and epistemic uncertainty in the thrust of the rocket motor. We also investigate the interaction of modeling uncertainties and numerical integration error in the solution of the ordinary differential equations for the flight dynamics.
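A toy version of the missile example's ingredient list: an ODE flight model with aleatory initial mass, an epistemic thrust interval, and a check on numerical integration error obtained by tightening the solver tolerance. The one-dimensional dynamics and all numbers are invented for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(5)
G, BURN, MDOT = 9.81, 10.0, 1.0  # gravity [m/s^2], burn time [s], mass flow [kg/s]

def burnout_speed(m0, thrust, rtol=1e-6):
    # Toy 1-D flight model: dv/dt = thrust / m(t) - g during the burn.
    rhs = lambda t, v: [thrust / (m0 - MDOT * t) - G]
    sol = solve_ivp(rhs, (0.0, BURN), [0.0], rtol=rtol, atol=1e-9)
    return sol.y[0, -1]

# Aleatory: initial mass; epistemic: motor thrust, known only as an interval.
for thrust in (1800.0, 2200.0):           # epistemic interval endpoints
    m0 = rng.normal(100.0, 2.0, 200)      # aleatory initial mass [kg]
    v = [burnout_speed(m, thrust) for m in m0]
    p5, p95 = np.percentile(v, [5, 95])
    print(f"thrust {thrust:.0f} N: burnout speed 5-95% band "
          f"[{p5:.1f}, {p95:.1f}] m/s")

# Numerical integration error interacts with these uncertainties:
# estimate it for one nominal case by tightening the tolerance.
err = abs(burnout_speed(100.0, 2000.0, 1e-3) - burnout_speed(100.0, 2000.0, 1e-10))
print(f"integration error estimate: {err:.2e} m/s")
```

Comparing the integration-error estimate with the width of the uncertainty bands indicates whether numerical error is negligible relative to the modeled uncertainties, which is the interaction the article investigates.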

14.
Fault tree analysis is a method widely used in probabilistic risk assessment. Uncertainties should be properly handled in fault tree analyses to support robust decision making. While many sources of uncertainty are considered, dependence uncertainties are little explored. Such uncertainties can be labeled ‘epistemic’ because of the way dependence is modeled. In practice, besides probability theory, alternative mathematical structures for representing epistemic uncertainty can be used, including possibility theory and fuzzy set theory. In this article, a fuzzy β factor is used to represent the failure-dependence uncertainties among basic events. The relationship between the β factor and the system failure probability is analyzed to support the use of a hybrid probabilistic-possibilistic approach, and a complete hybrid probabilistic-possibilistic framework is constructed. A case study of a high-integrity pressure protection system is discussed. The results show that the proposed method gives decision makers a more accurate understanding of the system under analysis when failure dependencies are involved. Copyright © 2015 John Wiley & Sons, Ltd.
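One common way to propagate a fuzzy parameter is interval arithmetic on its α-cuts. The sketch below pushes a triangular fuzzy β factor through a textbook beta-factor common-cause model for a redundant pair; the component failure probability, the triangular membership values, and the 1oo2 system formula are illustrative assumptions, not the article's case study.

```python
P_COMP = 1e-3  # basic-event (component) failure probability, assumed

def system_pof(beta):
    # Beta-factor common-cause model for a redundant pair (1oo2):
    # independent failure of both trains plus a common-cause term.
    p_ind = (1.0 - beta) * P_COMP
    return p_ind**2 + beta * P_COMP

# Triangular fuzzy beta = (0.02, 0.05, 0.10): illustrative numbers.
B_LO, B_MODE, B_HI = 0.02, 0.05, 0.10

for alpha in (0.0, 0.5, 1.0):
    # The alpha-cut of a triangular membership function is an interval.
    b_min = B_LO + alpha * (B_MODE - B_LO)
    b_max = B_HI - alpha * (B_HI - B_MODE)
    # system_pof is monotone increasing in beta here, so the interval
    # endpoints map directly to bounds on the system failure probability.
    print(f"alpha={alpha:.1f}: P_sys in "
          f"[{system_pof(b_min):.3e}, {system_pof(b_max):.3e}]")
```

The nested family of output intervals over α defines the fuzzy (possibilistic) description of the system failure probability, which can then be combined with probabilistic basic-event data in a hybrid framework.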

15.
Reliability analysis with both aleatory and epistemic uncertainties is investigated in this paper. The aleatory uncertainties are described with random variables, and the epistemic uncertainties are handled with evidence theory. Several methods have been proposed to estimate the bounds of the failure probability, but the existing ones suffer from the dimensionality challenge of the epistemic variables. To overcome this challenge, a random-set based Monte Carlo simulation (RS-MCS) method derived from the theory of random sets is offered. Nevertheless, RS-MCS is also computationally expensive, so an active learning Kriging (ALK) model that only needs to predict the sign of the performance function correctly is introduced and closely integrated with RS-MCS. The proposed method, termed ALK-RS-MCS, accurately predicts the bounds of the failure probability with as few function calls as possible. Moreover, an optimization method based on the Karush-Kuhn-Tucker conditions is proposed to make the estimation of the failure-probability interval from the Kriging model more efficient. The efficiency and accuracy of the proposed approach are demonstrated with four examples. Copyright © 2016 John Wiley & Sons, Ltd.
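To make the random-set idea concrete, the sketch below propagates an aleatory random variable through a performance function together with an epistemic variable described by interval focal elements with basic probability assignments, accumulating belief/plausibility bounds on the failure probability. The performance function, distributions, and focal elements are invented for illustration, and monotonicity is exploited in place of the paper's Kriging-assisted optimization.

```python
import numpy as np

rng = np.random.default_rng(6)
N = 20000

# Performance function g(x, y) < 0 means failure; x is aleatory,
# y is epistemic and described by evidence theory (illustrative setup).
g = lambda x, y: x - y

x = rng.normal(6.0, 1.0, N)  # aleatory variable

# Focal elements for y: intervals with basic probability assignments.
focal = [((4.0, 5.0), 0.3), ((4.5, 6.0), 0.5), ((5.5, 7.0), 0.2)]

bel = pl = 0.0
for (y_lo, y_hi), bpa in focal:
    # g decreases in y, so failure probability over the interval is
    # bounded by its endpoints: least at y_lo, greatest at y_hi.
    p_min = np.mean(g(x, y_lo) < 0)
    p_max = np.mean(g(x, y_hi) < 0)
    bel += bpa * p_min   # belief: BPA-weighted lower bounds
    pl += bpa * p_max    # plausibility: BPA-weighted upper bounds

print(f"failure probability bounds: Bel={bel:.4f}, Pl={pl:.4f}")
```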

16.
Quantification of margins and uncertainties (QMU) was originally introduced as a framework for assessing confidence in nuclear weapons and has since been extended to more general complex systems. We show that when uncertainties are strictly bounded, QMU is equivalent to a graphical model, provided confidence is identified with a reliability of one. In the more realistic case that uncertainties have long tails, we find that QMU confidence is not always a good proxy for reliability as computed from the graphical model. We explore the possibility of defining QMU in terms of the graphical model rather than through the original procedures. The new formalism, which we call probabilistic QMU, or pQMU, is fully probabilistic and mathematically consistent, and shows how QMU may be interpreted within the framework of system reliability theory.

17.
This paper develops a novel computational framework to compute the Sobol indices that quantify the relative contributions of various uncertainty sources towards the system response prediction uncertainty. In the presence of both aleatory and epistemic uncertainty, two challenges are addressed in this paper for the model-based computation of the Sobol indices: due to data uncertainty, input distributions are not precisely known; and due to model uncertainty, the model output is uncertain even for a fixed realization of the input. An auxiliary variable method based on the probability integral transform is introduced to distinguish and represent each uncertainty source explicitly, whether aleatory or epistemic. The auxiliary variables facilitate building a deterministic relationship between the uncertainty sources and the output, which is needed in the Sobol indices computation. The proposed framework is developed for two types of model inputs: random variable input and time series input. A Bayesian autoregressive moving average (ARMA) approach is chosen to model the time series input due to its capability to represent both natural variability and epistemic uncertainty due to limited data. A novel controlled-seed computational technique based on pseudo-random number generation is proposed to efficiently represent the natural variability in the time series input. This controlled-seed method significantly accelerates the Sobol indices computation under time series input, and makes it computationally affordable.
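The sketch below illustrates the auxiliary-variable idea on a toy model: each uncertainty source, aleatory or epistemic, is represented by a uniform auxiliary variable mapped through the probability integral transform, and first-order Sobol indices are estimated with the standard pick-freeze (Sobol/Saltelli) scheme. The toy model and distributions are assumptions; this is not the paper's ARMA or controlled-seed machinery.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

def model(u):
    # u[:, 0] is an auxiliary variable for an aleatory input, u[:, 1] for
    # an epistemic source; the probability integral transform maps each
    # U(0,1) column to the actual input.
    x = -np.log(1.0 - u[:, 0])   # aleatory: Exp(1) via inverse CDF
    theta = 0.8 + 0.4 * u[:, 1]  # epistemic: U(0.8, 1.2) via inverse CDF
    return theta * x

# Pick-freeze estimator of first-order Sobol indices.
A = rng.uniform(size=(N, 2))
B = rng.uniform(size=(N, 2))
yA = model(A)
var_y = yA.var(ddof=1)
for i, name in enumerate(["aleatory x", "epistemic theta"]):
    C = B.copy()
    C[:, i] = A[:, i]            # freeze column i from sample A
    yC = model(C)
    s1 = (np.mean(yA * yC) - yA.mean() * yC.mean()) / var_y
    print(f"first-order Sobol index, {name}: {s1:.3f}")
```

Because every source enters through its own auxiliary variable, the input-output map is deterministic and the usual Sobol machinery applies uniformly to aleatory and epistemic sources.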

18.
The ‘Epistemic Uncertainty Workshop’ sponsored by Sandia National Laboratories was held in Albuquerque, New Mexico, on 6–7 August 2002. The workshop was organized around a set of Challenge Problems involving both epistemic and aleatory uncertainty that the workshop participants were invited to solve and discuss. This concluding article in a special issue of Reliability Engineering and System Safety based on the workshop discusses the intent of the Challenge Problems, summarizes some discussions from the workshop, and provides a technical comparison among the papers in this special issue. The Challenge Problems were computationally simple models that were intended as vehicles for the illustration and comparison of conceptual and numerical techniques for use in analyses that involve: (i) epistemic uncertainty, (ii) aggregation of multiple characterizations of epistemic uncertainty, (iii) combination of epistemic and aleatory uncertainty, and (iv) models with repeated parameters. There was considerable diversity of opinion at the workshop about both methods and fundamental issues, and yet substantial consensus about what the answers to the problems were, and even about how each of the four issues should be addressed. Among the technical approaches advanced were probability theory, Dempster–Shafer evidence theory, random sets, sets of probability measures, imprecise coherent probabilities, coherent lower previsions, probability boxes, possibility theory, fuzzy sets, joint distribution tableaux, polynomial chaos expansions, and info-gap models. Although some participants maintained that a purely probabilistic approach is fully capable of accounting for all forms of uncertainty, most agreed that the treatment of epistemic uncertainty introduces important considerations and that the issues underlying the Challenge Problems are legitimate and significant. Topics identified as meriting additional research include elicitation of uncertainty representations, aggregation of multiple uncertainty representations, dependence and independence, model uncertainty, solution of black-box problems, efficient sampling strategies for computation, and communication of analysis results.

19.
余波  陶伯雄  刘圣宾 《工程力学》2018,35(9):135-144
Based on the ultimate strength surface of concrete under multiaxial stress and the Willam-Warnke five-parameter failure criterion, and drawing on test data from 144 stirrup-confined concrete prisms, this paper first establishes a deterministic model for the peak stress of stirrup-confined concrete. Accounting for both subjective and objective uncertainties, it then builds a probabilistic model of the peak stress by combining Bayesian theory with the Markov chain Monte Carlo method. Finally, comparisons with the test data and with traditional deterministic models verify the validity and applicability of the probabilistic model. The analysis shows that the probabilistic model not only reasonably describes the probabilistic characteristics of the peak stress of stirrup-confined concrete, but can also calibrate the confidence level and prediction accuracy of deterministic models and determine probabilistic characteristic values of the peak stress at prescribed confidence levels.

20.
To overcome the disadvantages of traditional deterministic models, a probabilistic bond strength model for reinforcing bars in concrete is presented. Based on the partly cracked thick-walled cylinder model, a deterministic bond strength model is first developed that accounts for the influence of the important governing factors. The analytical expression of the probabilistic bond strength model is then derived by taking both aleatory and epistemic uncertainties into consideration, and the statistical characteristics of the probabilistic model parameters are determined using the Markov chain Monte Carlo method and Bayesian theory. Finally, the applicability of the proposed probabilistic model is validated by comparison with 400 sets of experimental data and four typical deterministic bond strength models. The analysis shows that the probabilistic model provides an efficient approach both to describing the probabilistic characteristics of bond strength and to calibrating traditional deterministic bond strength models.
