1.
William L. Oberkampf, Jon C. Helton, Cliff A. Joslyn, Steven F. Wojtkiewicz, Scott Ferson 《Reliability Engineering & System Safety》2004,85(1-3):11
The risk assessment community has begun to make a clear distinction between aleatory and epistemic uncertainty in theory and in practice. Aleatory uncertainty is also referred to in the literature as variability, irreducible uncertainty, inherent uncertainty, and stochastic uncertainty. Epistemic uncertainty is also termed reducible uncertainty, subjective uncertainty, and state-of-knowledge uncertainty. Methods to efficiently represent, aggregate, and propagate different types of uncertainty through computational models are clearly of vital importance. The most widely known and developed methods are available within the mathematics of probability theory, whether frequentist or subjectivist. Newer mathematical approaches, which extend or otherwise depart from probability theory, are also available, and are sometimes referred to as generalized information theory (GIT). For example, possibility theory, fuzzy set theory, and evidence theory are three components of GIT. To try to develop a better understanding of the relative advantages and disadvantages of traditional and newer methods and encourage a dialog between the risk assessment, reliability engineering, and GIT communities, a workshop was held. To focus discussion and debate at the workshop, a set of prototype problems, generally referred to as challenge problems, was constructed. The challenge problems concentrate on the representation, aggregation, and propagation of epistemic uncertainty and mixtures of epistemic and aleatory uncertainty through two simple model systems. This paper describes the challenge problems and gives numerical values for the different input parameters so that results from different investigators can be directly compared.
2.
Quantification of epistemic and aleatory uncertainties in level-1 probabilistic safety assessment studies
K. Durga Rao, H.S. Kushwaha, A.K. Verma, A. Srividya 《Reliability Engineering & System Safety》2007,92(7):947-956
Availability models of complex processes and phenomena inevitably involve simplifying assumptions and idealizations. These simplifications and idealizations generate uncertainties, which can be classified as aleatory (arising from randomness) or epistemic (arising from lack of knowledge). Acknowledging and treating uncertainty is vital for the practical usability of reliability analysis results. Distinguishing the two types of uncertainty supports confident reliability/risk-informed decisions and effective uncertainty management. In level-1 probabilistic safety assessment (PSA) of nuclear power plants (NPP), the current practice is to carry out epistemic uncertainty analysis with a simple Monte Carlo simulation that samples the epistemic variables in the model. Aleatory uncertainty, however, is neglected: only point estimates of the aleatory variables, viz. time to failure and time to repair, are used. Treating both types of uncertainty requires a two-phase Monte Carlo simulation in which an outer loop samples the epistemic variables and an inner loop samples the aleatory variables. A methodology based on two-phase Monte Carlo simulation is presented for distinguishing the two kinds of uncertainty in the context of availability/reliability evaluation in level-1 PSA studies of NPP.
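A minimal sketch of the two-phase (nested) Monte Carlo scheme described above, in Python. All numbers, distributions, and the single-component availability model are invented for illustration and are not taken from the paper: the outer loop samples epistemic failure- and repair-rate parameters, and the inner loop samples aleatory times to failure and repair to estimate unavailability.

```python
import numpy as np

rng = np.random.default_rng(0)

def unavailability(lam, mu, mission_time=8760.0, n_aleatory=500):
    """Inner (aleatory) loop: simulate alternating failure/repair cycles
    for one fixed pair of epistemic parameters (lam, mu)."""
    mean_frac_down = 0.0
    for _ in range(n_aleatory):
        t, downtime = 0.0, 0.0
        while t < mission_time:
            t += rng.exponential(1.0 / lam)        # aleatory time to failure
            repair = rng.exponential(1.0 / mu)     # aleatory time to repair
            downtime += min(repair, max(mission_time - t, 0.0))
            t += repair
        mean_frac_down += downtime / mission_time
    return mean_frac_down / n_aleatory

# Outer (epistemic) loop: lognormal state-of-knowledge distributions on the
# failure and repair rates (values are illustrative only).
results = []
for _ in range(100):
    lam = rng.lognormal(np.log(1e-3), 0.5)   # failures per hour
    mu = rng.lognormal(np.log(1e-1), 0.3)    # repairs per hour
    results.append(unavailability(lam, mu))

print("epistemic 5th-95th percentile of unavailability:",
      np.percentile(results, [5, 95]))
```

The epistemic spread of the resulting unavailability estimates (e.g. the 5th to 95th percentile band) is exactly what a point-estimate treatment of the aleatory variables would fail to resolve.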
3.
Eduard Hofer, Martina Kloos, Bernard Krzykacz-Hausmann, Jörg Peschke, Martin Woltereck 《Reliability Engineering & System Safety》2002,77(3)
Epistemic uncertainty analysis is an essential feature of any model application subject to ‘state of knowledge’ uncertainties. Such analysis is usually carried out on the basis of a Monte Carlo simulation, sampling the epistemic variables and performing the corresponding model runs. In situations where aleatory uncertainties are also present in the model, however, an adequate treatment of both types of uncertainty would require a two-stage nested Monte Carlo simulation, i.e. sampling the epistemic variables (‘outer loop’) and nested sampling of the aleatory variables (‘inner loop’). For complex and long-running codes, the computational effort to perform all the resulting model runs may clearly be prohibitive. An approximate epistemic uncertainty analysis is therefore suggested that is based solely on two simple Monte Carlo samples: (a) joint sampling of both epistemic and aleatory variables simultaneously, and (b) sampling of the aleatory variables alone with the epistemic variables held fixed at their reference values. The applications of this approach to dynamic reliability analyses presented in this paper look quite promising and suggest that performing such an approximate epistemic uncertainty analysis is preferable to the alternative of not performing any.
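The two-sample approximation can be illustrated with a toy model (hypothetical; not the authors' implementation). Sample (a) varies the epistemic and aleatory inputs jointly, sample (b) varies only the aleatory inputs with the epistemic parameter held at a reference value, and comparing the two output spreads gives a rough indication of how much of the total uncertainty is epistemic.

```python
import numpy as np

rng = np.random.default_rng(1)

def model(k, x):
    """Toy model: k is an epistemic parameter, x an aleatory variable."""
    return k * x + 0.1 * x**2

n = 10_000
k_ref = 2.0                               # reference value of the epistemic variable

# Sample (a): epistemic and aleatory variables sampled jointly.
k_a = rng.uniform(1.5, 2.5, n)            # epistemic ('state of knowledge')
x_a = rng.normal(10.0, 1.0, n)            # aleatory ('stochastic')
y_joint = model(k_a, x_a)

# Sample (b): aleatory variables only, epistemic variable fixed at its reference value.
x_b = rng.normal(10.0, 1.0, n)
y_aleatory = model(k_ref, x_b)

var_total = y_joint.var()
var_aleatory = y_aleatory.var()
var_epistemic = max(var_total - var_aleatory, 0.0)   # crude decomposition

print(f"total variance    : {var_total:.2f}")
print(f"aleatory variance : {var_aleatory:.2f}")
print(f"approx. epistemic : {var_epistemic:.2f}")
```

Subtracting variances is only a crude decomposition, but it conveys the intent: two plain Monte Carlo samples in place of a full nested simulation.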
4.
Martin Pilch, Timothy G. Trucano, Jon C. Helton 《Reliability Engineering & System Safety》2011,96(9):965-975
Key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions are described. While QMU is a broad process and methodology for generating critical technical information to be used in U.S. nuclear weapon stockpile management, this paper emphasizes one component: the information produced by computational modeling and simulation. In particular, the following topics are discussed: (i) the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, (ii) the need to separate aleatory and epistemic uncertainty in QMU, and (iii) the properties of risk-informed decision making (RIDM) that are best suited for effective application of QMU. The paper is written at a high level, but provides an extensive bibliography of useful papers for interested readers to deepen their understanding of the presented ideas.
5.
The problem of accounting for epistemic uncertainty in risk management decisions is conceptually straightforward, but is riddled with practical difficulties. Simple approximations are often used whereby future variations in epistemic uncertainty are ignored or worst-case scenarios are postulated. These strategies tend to produce sub-optimal decisions. We develop a general framework based on Bayesian decision theory and exemplify it for the case of seismic design of buildings. When temporal fluctuations of the epistemic uncertainties and regulatory safety constraints are included, the optimal level of seismic protection exceeds the normative level at the time of construction. Optimal Bayesian decisions do not depend on the aleatory or epistemic nature of the uncertainties, but only on the total (epistemic plus aleatory) uncertainty and how that total uncertainty varies randomly during the lifetime of the project.
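As a hedged illustration of the kind of expected-cost comparison such a Bayesian decision framework entails (the loss model, distributions, and numbers below are invented, not taken from the paper), candidate design levels can be ranked by construction cost plus expected failure loss integrated over the total uncertainty in the lifetime demand:

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented example: choose a design capacity c (spectral acceleration units)
# to minimize construction cost plus expected failure loss over the lifetime.
failure_loss = 100.0              # loss if demand exceeds capacity (arbitrary units)

def construction_cost(c):
    """Cost grows with the chosen protection level (invented relation)."""
    return 5.0 * c

n = 200_000
# Total uncertainty in lifetime peak demand: epistemic uncertainty on the median
# (first factor) compounded with aleatory record-to-record variability (second).
median = rng.lognormal(np.log(0.3), 0.2, n)     # epistemic
demand = median * rng.lognormal(0.0, 0.4, n)    # aleatory

candidates = np.linspace(0.3, 1.5, 25)
expected_cost = [construction_cost(c) + failure_loss * np.mean(demand > c)
                 for c in candidates]
best = candidates[int(np.argmin(expected_cost))]
print(f"design capacity minimizing expected total cost: {best:.2f}")
```

Only the compounded (total) distribution of demand enters the decision, echoing the observation above that the optimum depends on total uncertainty rather than on its split into aleatory and epistemic parts.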
6.
Scott Ferson, Cliff A. Joslyn, Jon C. Helton, William L. Oberkampf, Kari Sentz 《Reliability Engineering & System Safety》2004,85(1-3):355
The ‘Epistemic Uncertainty Workshop’ sponsored by Sandia National Laboratories was held in Albuquerque, New Mexico, on 6–7 August 2002. The workshop was organized around a set of Challenge Problems involving both epistemic and aleatory uncertainty that the workshop participants were invited to solve and discuss. This concluding article in a special issue of Reliability Engineering and System Safety based on the workshop discusses the intent of the Challenge Problems, summarizes some discussions from the workshop, and provides a technical comparison among the papers in this special issue. The Challenge Problems were computationally simple models that were intended as vehicles for the illustration and comparison of conceptual and numerical techniques for use in analyses that involve: (i) epistemic uncertainty, (ii) aggregation of multiple characterizations of epistemic uncertainty, (iii) combination of epistemic and aleatory uncertainty, and (iv) models with repeated parameters. There was considerable diversity of opinion at the workshop about both methods and fundamental issues, and yet substantial consensus about what the answers to the problems were, and even about how each of the four issues should be addressed. Among the technical approaches advanced were probability theory, Dempster–Shafer evidence theory, random sets, sets of probability measures, imprecise coherent probabilities, coherent lower previsions, probability boxes, possibility theory, fuzzy sets, joint distribution tableaux, polynomial chaos expansions, and info-gap models. Although some participants maintained that a purely probabilistic approach is fully capable of accounting for all forms of uncertainty, most agreed that the treatment of epistemic uncertainty introduces important considerations and that the issues underlying the Challenge Problems are legitimate and significant. Topics identified as meriting additional research include elicitation of uncertainty representations, aggregation of multiple uncertainty representations, dependence and independence, model uncertainty, solution of black-box problems, efficient sampling strategies for computation, and communication of analysis results.
7.
The 1996 performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP) maintains a separation between stochastic (i.e. aleatory) and subjective (i.e. epistemic) uncertainty, with stochastic uncertainty arising from the possible disruptions that could occur at the WIPP over the 10,000 yr regulatory period specified by the US Environmental Protection Agency (40 CFR 191, 40 CFR 194) and subjective uncertainty arising from an inability to uniquely characterize many of the inputs required in the 1996 WIPP PA. The characterization of subjective uncertainty is discussed, including assignment of distributions, uncertain variables selected for inclusion in analysis, correlation control, sample size, statistical confidence on mean complementary cumulative distribution functions, generation of Latin hypercube samples, sensitivity analysis techniques, and scenarios involving stochastic and subjective uncertainty.
8.
Latin hypercube sampling and the propagation of uncertainty in analyses of complex systems
The following techniques for uncertainty and sensitivity analysis are briefly summarized: Monte Carlo analysis, differential analysis, response surface methodology, Fourier amplitude sensitivity test, Sobol' variance decomposition, and fast probability integration. Desirable features of Monte Carlo analysis in conjunction with Latin hypercube sampling are described in discussions of the following topics: (i) properties of random, stratified and Latin hypercube sampling, (ii) comparisons of random and Latin hypercube sampling, (iii) operations involving Latin hypercube sampling (i.e. correlation control, reweighting of samples to incorporate changed distributions, replicated sampling to test reproducibility of results), (iv) uncertainty analysis (i.e. cumulative distribution functions, complementary cumulative distribution functions, box plots), (v) sensitivity analysis (i.e. scatterplots, regression analysis, correlation analysis, rank transformations, searches for nonrandom patterns), and (vi) analyses involving stochastic (i.e. aleatory) and subjective (i.e. epistemic) uncertainty.
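A minimal sketch of Latin hypercube sampling itself, using NumPy and SciPy (illustrative only; the review also covers correlation control, reweighting, and replicated sampling, none of which are shown here). The stratification places exactly one sample in each of n equal-probability intervals of every input, and the stratified uniforms are then mapped through the inverse CDFs of the assumed input distributions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def latin_hypercube(n, d, rng):
    """One point per equal-probability stratum in each of d dimensions,
    with the strata independently permuted across dimensions."""
    strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T   # shape (n, d)
    return (strata + rng.uniform(size=(n, d))) / n

n, d = 100, 2
u = latin_hypercube(n, d, rng)

# Map the stratified uniforms through inverse CDFs of the assumed input distributions.
x1 = stats.norm.ppf(u[:, 0], loc=10.0, scale=2.0)
x2 = stats.lognorm.ppf(u[:, 1], s=0.5, scale=1.0)

y = x1 * x2                      # toy model
print("LHS estimate of the mean model output:", y.mean())
```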
9.
10.
In order to overcome the disadvantages of traditional deterministic models, a probabilistic bond strength model for reinforcing bars in concrete was presented. Based on the partly cracked thick-walled cylinder model, a deterministic bond strength model was first developed, taking into account the influences of various important factors. The analytical expression of the probabilistic bond strength model was then derived by considering both aleatory and epistemic uncertainties. Subsequently, the probabilistic bond strength model was established by determining the statistical characteristics of its parameters using the Markov chain Monte Carlo method and Bayesian theory. Finally, the applicability of the proposed model was validated by comparison with 400 sets of experimental data and four typical deterministic bond strength models. The analysis shows that the probabilistic model provides an efficient approach for describing the probabilistic characteristics of bond strength and for calibrating traditional deterministic bond strength models.
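A miniature sketch of the calibration step (Bayesian updating of probabilistic model parameters by MCMC). Everything below is hypothetical: the deterministic bond-strength formula is a stand-in rather than the paper's model, the 'experimental' data are synthetic, and a single multiplicative bias factor plus a lognormal scatter are calibrated with a random-walk Metropolis sampler.

```python
import numpy as np

rng = np.random.default_rng(4)

def tau_det(fc, cover, d_b):
    """Stand-in deterministic bond-strength model (NOT the paper's equation)."""
    return 0.5 * np.sqrt(fc) * (1.0 + cover / d_b)

# Synthetic 'test' data generated with a true bias of 1.2 and lognormal scatter 0.15.
fc = rng.uniform(25.0, 60.0, 50)      # concrete strength, MPa
cover = rng.uniform(20.0, 50.0, 50)   # concrete cover, mm
d_b = 16.0                            # bar diameter, mm
tau_obs = 1.2 * tau_det(fc, cover, d_b) * rng.lognormal(0.0, 0.15, 50)

def log_post(theta):
    """Posterior of (log-bias, log-scatter) with weak normal priors."""
    log_bias, log_sigma = theta
    sigma = np.exp(log_sigma)
    resid = np.log(tau_obs) - np.log(tau_det(fc, cover, d_b)) - log_bias
    loglik = -0.5 * np.sum((resid / sigma) ** 2) - resid.size * np.log(sigma)
    logprior = -0.5 * log_bias ** 2 - 0.5 * log_sigma ** 2
    return loglik + logprior

# Random-walk Metropolis sampler.
theta = np.array([0.0, np.log(0.2)])
lp = log_post(theta)
samples = []
for _ in range(20_000):
    proposal = theta + rng.normal(0.0, 0.05, 2)
    lp_prop = log_post(proposal)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = proposal, lp_prop
    samples.append(theta.copy())

bias_draws = np.exp(np.array(samples)[5_000:, 0])   # discard burn-in
print("posterior mean bias factor:", bias_draws.mean())
```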
11.
The 1996 performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP) maintains a separation between stochastic (i.e. aleatory) and subjective (i.e. epistemic) uncertainty, with stochastic uncertainty arising from the possible disruptions that could occur at the WIPP over the 10,000-yr regulatory period specified by the US Environmental Protection Agency (40 CFR 191, 40 CFR 194) and subjective uncertainty arising from an inability to uniquely characterize many of the inputs required in the 1996 WIPP PA. The characterization of stochastic uncertainty is discussed, including drilling intrusion time, drilling location, penetration of excavated/nonexcavated areas of the repository, penetration of pressurized brine beneath the repository, borehole plugging patterns, activity level of waste, and occurrence of potash mining. Additional topics discussed include sampling procedures, generation of individual 10,000-yr futures for the WIPP, construction of complementary cumulative distribution functions (CCDFs), mechanistic calculations carried out to support CCDF construction, the Kaplan/Garrick ordered triple representation for risk, and determination of scenarios and scenario probabilities.
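Constructing a CCDF from sampled futures reduces to sorting the sampled consequences and plotting exceedance probabilities. The sketch below is purely notional; the 'releases' are random numbers rather than WIPP results:

```python
import numpy as np

rng = np.random.default_rng(5)

# Notional normalized releases, one per sampled 10,000-yr future.
releases = rng.lognormal(mean=-4.0, sigma=1.5, size=10_000)

# Empirical CCDF: P(release > R) as a function of R.
r_sorted = np.sort(releases)
ccdf = 1.0 - np.arange(1, r_sorted.size + 1) / r_sorted.size

# Probability of exceeding a notional regulatory value R = 1.
print("P(release > 1):", np.mean(releases > 1.0))
```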
12.
Alina Alexeenko, Sruti Chigullapalli, Juan Zeng, Xiaohui Guo, Andrew Kovacs, Dimitrios Peroulis 《Reliability Engineering & System Safety》2011,96(9):1171-1183
Effects of uncertainties in gas damping models, geometry, and mechanical properties on the dynamics of a micro-electro-mechanical systems (MEMS) capacitive switch are studied. A sample of typical capacitive switches has been fabricated and characterized at Purdue University. High-fidelity simulations of gas damping on planar microbeams are developed and verified under relevant conditions. This and other gas damping models are then applied to study the dynamics of a single closing event for switches with experimentally measured properties. Although all damping models considered predict similar damping quality factors and agree well in their predictions of closing time, they differ by a factor of two or more in predicting the impact velocity and acceleration at contact. Implications of parameter uncertainties for key reliability-related parameters such as the pull-in voltage, closing time, and impact velocity are discussed. A notable effect of uncertainty is that the nominal switch, i.e. the switch with the average properties, does not actuate at the mean actuation voltage. Additionally, device-to-device variability leads to significant differences in dynamics. For example, the mean impact velocity for switches actuated at the 90%-actuation voltage (about 150 V), i.e. the voltage required to actuate 90% of the sample, is about 129 cm/s and increases to 173 cm/s for the 99%-actuation voltage (about 173 V). Response surfaces of impact velocity and closing time with respect to five input variables were constructed using the Smolyak sparse grid algorithm. The sensitivity analysis showed that impact velocity is most sensitive to the damping coefficient, whereas the closing time is most affected by geometric parameters such as gap and beam thickness.
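The study builds its response surfaces with the Smolyak sparse grid algorithm; the sketch below substitutes a much simpler quadratic least-squares surrogate over two stand-in inputs (gap and damping coefficient, with invented ranges and an invented model) just to illustrate fitting a response surface and reading crude main-effect sensitivities from it.

```python
import numpy as np

rng = np.random.default_rng(6)

def impact_velocity(gap_um, damping):
    """Stand-in for the expensive MEMS dynamics model (purely illustrative)."""
    return 50.0 + 40.0 * gap_um - 300.0 * damping + 5.0 * gap_um * damping

def basis(g, d):
    """Quadratic polynomial basis for the response surface."""
    return np.column_stack([np.ones_like(g), g, d, g**2, d**2, g * d])

# Training points over assumed input ranges (values invented).
gap = rng.uniform(2.0, 4.0, 200)       # gap height, micrometres
damp = rng.uniform(0.05, 0.15, 200)    # normalized damping coefficient
v = impact_velocity(gap, damp) + rng.normal(0.0, 1.0, 200)

# Least-squares fit of the quadratic response surface.
coef, *_ = np.linalg.lstsq(basis(gap, damp), v, rcond=None)

# Crude main-effect sensitivities read off the cheap surrogate.
g = rng.uniform(2.0, 4.0, 50_000)
d = rng.uniform(0.05, 0.15, 50_000)
total_var = np.var(basis(g, d) @ coef)
gap_share = np.var(basis(g, np.full_like(g, d.mean())) @ coef) / total_var
damp_share = np.var(basis(np.full_like(d, g.mean()), d) @ coef) / total_var
print(f"main-effect variance shares: gap {gap_share:.2f}, damping {damp_share:.2f}")
```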
13.
14.
Jon C. Helton 《Reliability Engineering & System Safety》2011,96(9):976-1013
In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. This presentation discusses and illustrates the conceptual and computational basis of QMU in analyses that use computational models to predict the behavior of complex systems. The following topics are considered: (i) the role of aleatory and epistemic uncertainty in QMU, (ii) the representation of uncertainty with probability, (iii) the probabilistic representation of uncertainty in QMU analyses involving only epistemic uncertainty, and (iv) the probabilistic representation of uncertainty in QMU analyses involving aleatory and epistemic uncertainty.
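A common way of summarizing such QMU results is a margin-to-uncertainty comparison. The toy numbers below are invented and the percentile convention used is only one of several in practice; the sketch simply shows computing a best estimate, a margin against a requirement, and an epistemic uncertainty band from sampled model results.

```python
import numpy as np

rng = np.random.default_rng(7)

# Notional performance metric, one value per sampled set of epistemic parameters
# (all numbers invented for illustration).
performance = rng.normal(loc=120.0, scale=8.0, size=1_000)
requirement = 100.0                    # the system must exceed this value

best_estimate = np.median(performance)
margin = best_estimate - requirement
# One common (but not unique) convention: take the uncertainty as the distance
# from the best estimate to the 5th percentile of the epistemic distribution.
uncertainty = best_estimate - np.percentile(performance, 5)

print(f"margin M = {margin:.1f}, uncertainty U = {uncertainty:.1f}, "
      f"confidence ratio M/U = {margin / uncertainty:.2f}")
```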
15.
J. C. Helton, D. R. Anderson, G. Basabilvazo, H.-N. Jow, M. G. Marietta 《Reliability Engineering & System Safety》2000,69(1-3)
The Waste Isolation Pilot Plant (WIPP) is under development by the US Department of Energy (DOE) for the geologic disposal of transuranic waste. The construction of complementary cumulative distribution functions (CCDFs) for total radionuclide release from the WIPP to the accessible environment is described. The resultant CCDFs (i) combine releases due to cuttings and cavings, spallings, direct brine release, and long-term transport in flowing groundwater; (ii) fall substantially to the left of the boundary line specified by the US Environmental Protection Agency's (EPA's) standard 40 CFR 191 for the geologic disposal of radioactive waste; and (iii) constitute an important component of the DOE's successful Compliance Certification Application to the EPA for the WIPP. Insights and perspectives gained in the performance assessment (PA) that led to these CCDFs are described, including the importance of: (i) an iterative approach to PA; (ii) uncertainty and sensitivity analysis; (iii) a clear conceptual model for the analysis; (iv) the separation of stochastic (i.e. aleatory) and subjective (i.e. epistemic) uncertainty; (v) quality assurance procedures; (vi) early involvement of peer reviewers, regulators, and stakeholders; (vii) avoidance of conservative assumptions; and (viii) adequate documentation.
16.
Jon C. Helton, Jay D. Johnson, Cédric J. Sallaberry 《Reliability Engineering & System Safety》2011,96(9):1014-1033
In 2001, the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy (DOE) in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories) initiated development of a process designated quantification of margins and uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, “Quantification of Margins and Uncertainties: Conceptual and Computational Basis,” describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples. The basic ideas and challenges that underlie NNSA's mandate for QMU are present, and have been successfully addressed, in a number of past analyses for complex systems. To provide perspective on the implementation of a requirement for QMU in the analysis of a complex system, three past analyses are presented as examples: (i) the probabilistic risk assessment carried out for the Surry Nuclear Power Station as part of the U.S. Nuclear Regulatory Commission's (NRC's) reassessment of the risk from commercial nuclear power in the United States (i.e., the NUREG-1150 study), (ii) the performance assessment for the Waste Isolation Pilot Plant carried out by the DOE in support of a successful compliance certification application to the U.S. Environmental Protection Agency, and (iii) the performance assessment for the proposed high-level radioactive waste repository at Yucca Mountain, Nevada, carried out by the DOE in support of a license application to the NRC. Each of the preceding analyses involved a detailed treatment of uncertainty and produced results used to establish compliance with specific numerical requirements on the performance of the system under study. As a result, these studies illustrate the determination of both margins and the uncertainty in margins in real analyses.
17.
In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, “Quantification of Margins and Uncertainties: Conceptual and Computational Basis,” describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.
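For concreteness, here is a minimal evidence-theory (Dempster–Shafer) calculation, not taken from the paper: an epistemic parameter is described by interval focal elements with basic probability assignments, and belief and plausibility bound the probability that the parameter lies in a query interval.

```python
# Focal elements (intervals) and basic probability assignments (BPAs)
# for an epistemic parameter; values are invented for illustration.
focal_elements = [((0.0, 4.0), 0.3),
                  ((2.0, 6.0), 0.5),
                  ((5.0, 9.0), 0.2)]

def belief(query, elements):
    """Sum of BPAs of focal elements wholly contained in the query interval."""
    lo, hi = query
    return sum(m for (a, b), m in elements if lo <= a and b <= hi)

def plausibility(query, elements):
    """Sum of BPAs of focal elements that intersect the query interval."""
    lo, hi = query
    return sum(m for (a, b), m in elements if b >= lo and a <= hi)

query = (0.0, 5.0)   # 'is the parameter at most 5?'
print("Bel:", belief(query, focal_elements))
print("Pl :", plausibility(query, focal_elements))
```

The gap between Bel and Pl is a direct expression of epistemic imprecision; a single probability measure would collapse it to one number.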
18.
Mixed aleatory-epistemic uncertainty quantification with stochastic expansions and optimization-based interval estimation
Uncertainty quantification (UQ) is the process of determining the effect of input uncertainties on response metrics of interest. These input uncertainties may be characterized as either aleatory uncertainties, which are irreducible variabilities inherent in nature, or epistemic uncertainties, which are reducible uncertainties resulting from a lack of knowledge. When aleatory and epistemic uncertainties are mixed, it is desirable to maintain a segregation between aleatory and epistemic sources so that their contributions to the total uncertainty can be easily separated and identified. Current production analyses for mixed UQ employ nested sampling, where each sample taken from the epistemic distributions at the outer loop results in an inner-loop sampling over the aleatory probability distributions. This paper demonstrates new algorithmic capabilities for mixed UQ in which the analysis procedures are more closely tailored to the requirements of aleatory and epistemic propagation. Through the combination of stochastic expansions for computing statistics and interval optimization for computing bounds, interval-valued probability, second-order probability, and Dempster–Shafer evidence theory approaches to mixed UQ are shown to be more accurate and efficient than previously achievable.
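The interval-on-epistemic idea can be sketched as follows (a hypothetical toy; brute-force grid search stands in for the paper's interval optimization, and plain Monte Carlo stands in for its stochastic expansions): aleatory inputs carry probability distributions, epistemic inputs are known only to within intervals, and the statistic of interest is reported as an interval spanning its minimum and maximum over the epistemic box.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(8)

def failure_probability(theta1, theta2, n=20_000):
    """Aleatory loop: P(load > capacity) for fixed epistemic inputs."""
    load = rng.normal(theta1, 1.0, n)                  # aleatory load, epistemic mean theta1
    capacity = rng.lognormal(np.log(theta2), 0.1, n)   # aleatory capacity, epistemic median theta2
    return np.mean(load > capacity)

# Epistemic inputs known only as intervals (values invented).
theta1_interval = (8.0, 9.0)
theta2_interval = (10.0, 12.0)

# Brute-force search over the epistemic box, standing in for interval optimization.
grid1 = np.linspace(*theta1_interval, 5)
grid2 = np.linspace(*theta2_interval, 5)
pf = [failure_probability(t1, t2) for t1, t2 in product(grid1, grid2)]

print(f"interval-valued failure probability: [{min(pf):.4f}, {max(pf):.4f}]")
```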
19.
J. M. Dickey 《TEST》1980,31(1):471-487
Parameterized families of subjective probability distributions can be used to great advantage to model beliefs of experts, especially when such models include dependence on concomitant variables. In one such model, probabilities of simple events can be expressed in loglinear form. In another, a generalization of the multivariate t distribution has concomitant variables entering linearly through the location vector. Interactive interview methods for assessing this second model and matrix extensions thereof were given in recent joint work of the author with A.P. Dawid, J.B. Kadane and others. In any such verbal assessment method, elicited quantiles must be fitted by subjective probability models. The fitting requires the use of a further probability model for errors of elicitation. This paper gives new theory relating the form of the distribution of elicited probabilities and elicited quantiles to the form of the subjective probability distribution. The first- and second-order moment structures are developed to permit generalized least squares fits.
20.
There are difficulties with probability as a representation of uncertainty. However, we argue that there is an important distinction between principle and practice. In principle, probability is uniquely appropriate for the representation and quantification of all forms of uncertainty; it is in this sense that we claim that ‘probability is perfect’. In practice, people find it difficult to express their knowledge and beliefs in probabilistic form, so that elicitation of probability distributions is a far from perfect process. We therefore argue that there is no need for alternative theories, but that any practical elicitation of expert knowledge must fully acknowledge imprecision in the resulting distribution. We outline a recently developed Bayesian technique that allows the imprecision in elicitation to be formulated explicitly, and apply it to some of the challenge problems.