Similar Literature
20 similar records retrieved.
1.
In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. This presentation discusses and illustrates the conceptual and computational basis of QMU in analyses that use computational models to predict the behavior of complex systems. The following topics are considered: (i) the role of aleatory and epistemic uncertainty in QMU, (ii) the representation of uncertainty with probability, (iii) the probabilistic representation of uncertainty in QMU analyses involving only epistemic uncertainty, and (iv) the probabilistic representation of uncertainty in QMU analyses involving aleatory and epistemic uncertainty.  相似文献   
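A minimal sketch of the nested-sampling structure such a QMU analysis typically takes, with epistemic parameters sampled in an outer loop and aleatory variability sampled in an inner loop; the model, distributions, and requirement below are illustrative assumptions, not taken from the presentation:

```python
import numpy as np

rng = np.random.default_rng(0)

def response(capacity_mean, load):
    # Hypothetical system model: performance margin = capacity - load.
    return capacity_mean - load

threshold = 0.0          # assumed requirement: the margin must stay positive
n_epistemic, n_aleatory = 200, 2000

margins_05 = []
for _ in range(n_epistemic):                      # outer loop: epistemic uncertainty
    capacity_mean = rng.uniform(9.0, 11.0)        # poorly known design parameter
    loads = rng.normal(6.0, 1.0, n_aleatory)      # inner loop: aleatory variability
    margins = response(capacity_mean, loads)
    margins_05.append(np.percentile(margins, 5))  # 5th-percentile margin per epistemic sample

margins_05 = np.array(margins_05)
# Epistemic spread of the aleatory 5th-percentile margin
print("best-estimate margin:", margins_05.mean())
print("epistemic uncertainty band:", margins_05.min(), margins_05.max())
print("fraction of epistemic cases meeting the requirement:",
      np.mean(margins_05 > threshold))
```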

2.
In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, “Quantification of Margins and Uncertainties: Conceptual and Computational Basis,” describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.  相似文献   
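As an illustration of the evidence-theory alternative mentioned above, the short sketch below computes belief and plausibility of exceeding a threshold from a hypothetical body of evidence (interval focal elements with basic probability assignments); the numbers are invented for illustration:

```python
# Belief and plausibility of exceeding a threshold under evidence theory,
# for a hypothetical body of evidence (interval focal elements with masses).
focal_elements = [((0.0, 4.0), 0.3),   # (interval, basic probability assignment)
                  ((2.0, 6.0), 0.5),
                  ((5.0, 9.0), 0.2)]

def belief_and_plausibility(threshold):
    """Bel and Pl of the event {x >= threshold}."""
    bel = sum(m for (lo, hi), m in focal_elements if lo >= threshold)  # focal set inside the event
    pl  = sum(m for (lo, hi), m in focal_elements if hi >= threshold)  # focal set intersects the event
    return bel, pl

bel, pl = belief_and_plausibility(5.0)
print(f"Bel(x >= 5) = {bel:.2f}, Pl(x >= 5) = {pl:.2f}")  # 0.20 and 0.70
```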

3.
A probabilistic approach for representation of interval uncertainty
In this paper, we propose a probabilistic approach to represent interval data for input variables in reliability and uncertainty analysis problems, using flexible families of continuous Johnson distributions. Such a probabilistic representation of interval data facilitates a unified framework for handling aleatory and epistemic uncertainty. For fitting probability distributions, methods such as moment matching are commonly used in the literature. However, unlike point data where single estimates for the moments of data can be calculated, moments of interval data can only be computed in terms of upper and lower bounds. Finding bounds on the moments of interval data has been generally considered an NP-hard problem because it includes a search among the combinations of multiple values of the variables, including interval endpoints. In this paper, we present efficient algorithms based on continuous optimization to find the bounds on second and higher moments of interval data. With numerical examples, we show that the proposed bounding algorithms are scalable in polynomial time with respect to increasing number of intervals. Using the bounds on moments computed using the proposed approach, we fit a family of Johnson distributions to interval data. Furthermore, using an optimization approach based on percentiles, we find the bounding envelopes of the family of distributions, termed as a Johnson p-box. The idea of bounding envelopes for the family of Johnson distributions is analogous to the notion of empirical p-box in the literature. Several sets of interval data with different numbers of intervals and type of overlap are presented to demonstrate the proposed methods. As against the computationally expensive nested analysis that is typically required in the presence of interval variables, the proposed probabilistic representation enables inexpensive optimization-based strategies to estimate bounds on an output quantity of interest.  相似文献   
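A rough sketch of the moment-bounding step for a small interval data set is shown below: the mean bounds follow directly from the interval endpoints, the variance lower bound is found by continuous optimization, and the variance upper bound is found here by brute-force endpoint enumeration (the paper develops scalable algorithms for this instead). The data are hypothetical:

```python
import itertools
import numpy as np
from scipy.optimize import minimize

# Hypothetical interval data: each observation is only known to lie in [lo, hi].
intervals = np.array([[1.0, 2.5], [2.0, 3.0], [1.5, 4.0], [3.0, 3.5]])
lo, hi = intervals[:, 0], intervals[:, 1]

# Bounds on the mean follow directly from the endpoints.
mean_lb, mean_ub = lo.mean(), hi.mean()

# Lower bound on the variance: a convex minimization over the box of endpoints.
res = minimize(lambda x: np.var(x), x0=intervals.mean(axis=1),
               bounds=list(map(tuple, intervals)))
var_lb = res.fun

# Upper bound on the variance: attained at a combination of endpoints, so
# brute force is fine for a few intervals (the paper's algorithms scale better).
var_ub = max(np.var(np.where(mask, hi, lo))
             for mask in itertools.product([0, 1], repeat=len(intervals)))

print(f"mean in [{mean_lb:.3f}, {mean_ub:.3f}]")
print(f"variance in [{var_lb:.4f}, {var_ub:.4f}]")
```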

4.
In 2001, the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy (DOE) in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories) initiated development of a process designated quantification of margins and uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, “Quantification of Margins and Uncertainties: Conceptual and Computational Basis,” describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples. The basic ideas and challenges that underlie NNSA's mandate for QMU are present, and have been successfully addressed, in a number of past analyses for complex systems. To provide perspective on the implementation of a requirement for QMU in the analysis of a complex system, three past analyses are presented as examples: (i) the probabilistic risk assessment carried out for the Surry Nuclear Power Station as part of the U.S. Nuclear Regulatory Commission's (NRC's) reassessment of the risk from commercial nuclear power in the United States (i.e., the NUREG-1150 study), (ii) the performance assessment for the Waste Isolation Pilot Plant carried out by the DOE in support of a successful compliance certification application to the U.S. Environmental Protection Agency, and (iii) the performance assessment for the proposed high-level radioactive waste repository at Yucca Mountain, Nevada, carried out by the DOE in support of a license application to the NRC. Each of the preceding analyses involved a detailed treatment of uncertainty and produced results used to establish compliance with specific numerical requirements on the performance of the system under study. As a result, these studies illustrate the determination of both margins and the uncertainty in margins in real analyses.

5.
A case study for quantifying system reliability and uncertainty
The ability to estimate system reliability with an appropriate measure of associated uncertainty is important for understanding its expected performance over time. Frequently, obtaining full-system data is prohibitively expensive, impractical, or not permissible. Hence, methodology which allows for the combination of different types of data at the component or subsystem levels can allow for improved estimation at the system level. We apply methodologies for aggregating uncertainty from component-level data to estimate system reliability and quantify its overall uncertainty. This paper provides a proof-of-concept that uncertainty quantification methods using Bayesian methodology can be constructed and applied to system reliability problems for a system with both series and parallel structures.  相似文献   
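A minimal sketch of the general idea, assuming a hypothetical system with component A in series with a parallel pair (B, C), Beta posteriors for each component fitted to binomial test data, and Monte Carlo propagation to the system level; the structure, priors, and data are illustrative and not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical component test data: (successes, trials) for components A, B, C.
# Assumed structure: A in series with a parallel pair (B, C).
data = {"A": (48, 50), "B": (18, 20), "C": (27, 30)}
n_draws = 20000

# Posterior draws of each component reliability under a Beta(1, 1) prior.
post = {name: rng.beta(1 + s, 1 + (n - s), n_draws) for name, (s, n) in data.items()}

# Propagate draws through the system structure.
parallel_bc = 1 - (1 - post["B"]) * (1 - post["C"])   # parallel pair: fails only if both fail
system = post["A"] * parallel_bc                      # series with A

print("posterior mean system reliability:", system.mean())
print("90% credible interval:", np.percentile(system, [5, 95]))
```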

6.
Key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions are described. While QMU is a broad process and methodology for generating critical technical information to be used in U.S. nuclear weapon stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, the following topics are discussed: (i) the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, (ii) the need to separate aleatory and epistemic uncertainty in QMU, and (iii) the properties of risk-informed decision making (RIDM) that are best suited for effective application of QMU. The paper is written at a high level, but provides an extensive bibliography of useful papers for interested readers to deepen their understanding of the presented ideas.  相似文献   

7.
This paper develops a methodology to integrate reliability testing and computational reliability analysis for product development. The presence of information uncertainty such as statistical uncertainty and modeling error is incorporated. The integration of testing and computation leads to a more cost-efficient estimation of failure probability and life distribution than the tests-only approach currently followed by the industry. A Bayesian procedure is proposed to quantify the modeling uncertainty using random parameters, including the uncertainty in mechanical and statistical model selection and the uncertainty in distribution parameters. An adaptive method is developed to determine the number of tests needed to achieve a desired confidence level in the reliability estimates, by combining prior computational prediction and test data. Two kinds of tests — failure probability estimation and life estimation — are considered. The prior distribution and confidence interval of failure probability in both cases are estimated using computational reliability methods, and are updated using the results of tests performed during the product development phase.  相似文献   
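The sketch below illustrates the adaptive-testing idea in its simplest possible form, assuming a Beta prior on the failure probability taken from a computational prediction and a stopping rule on the width of the credible interval; the numbers and the Beta-binomial model are illustrative assumptions rather than the paper's procedure:

```python
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(2)

# Prior on the failure probability from a computational prediction (assumed Beta shape).
a, b = 2.0, 38.0          # prior mean 0.05, reflecting the model-based estimate
target_width = 0.04       # stop testing once the 90% credible interval is this narrow
true_p = 0.06             # "true" failure probability used to simulate test outcomes

n_tests = 0
while True:
    lo, hi = beta.ppf([0.05, 0.95], a, b)
    if hi - lo <= target_width:
        break
    failed = rng.random() < true_p          # outcome of one more test article
    a, b = a + failed, b + (1 - failed)     # Bayesian update of the Beta posterior
    n_tests += 1

print(f"tests required: {n_tests}")
print(f"posterior mean failure probability: {a / (a + b):.4f}")
print(f"90% credible interval: [{lo:.4f}, {hi:.4f}]")
```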

8.
Redundancy and robustness of systems of events
The article aims to give new impetus to rational and objective probabilistic evaluation of redundancy and robustness, based on uncertainties of systems and subsystems of events. An attempt is made to demonstrate the relevance of intuitive comprehension of redundancy and robustness of engineering systems of events. An event-oriented system analysis of a number of random observable operational and failure modes, with adverse probability distributions in a lifetime, may provide a deeper understanding of a system's operational abundance and endurance. The system uncertainty analysis is based on the concept of entropy as defined in information theory and applied to probability theory. The article relates reliability, uncertainty, redundancy and robustness of systems of events, and their application is illustrated in numerical examples.
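A small sketch of the entropy computation on which such an analysis rests, for a hypothetical set of mutually exclusive operational and failure modes; the specific redundancy and robustness indices defined in the article may differ from the simple normalized index shown here:

```python
import numpy as np

# Hypothetical probabilities of mutually exclusive operational and failure modes.
modes = {"full operation": 0.70, "degraded mode A": 0.15,
         "degraded mode B": 0.10, "total failure": 0.05}

p = np.array(list(modes.values()))
entropy = -np.sum(p * np.log(p))          # Shannon entropy of the system of events
h_max = np.log(len(p))                    # maximum entropy (all modes equally likely)
relative_entropy = entropy / h_max        # a simple normalized uncertainty index

print(f"entropy of the system of events: {entropy:.3f} nats")
print(f"relative to the equiprobable maximum: {relative_entropy:.2%}")
```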

9.
The traditional reliability analysis method based on probabilistic method requires probability distributions of all the uncertain parameters. However, in practical applications, the distributions of some parameters may not be precisely known due to the lack of sufficient sample data. The probabilistic theory cannot directly measure the reliability of structures with epistemic uncertainty, ie, subjective randomness and fuzziness. Hence, a hybrid reliability analysis (HRA) problem will be caused when the aleatory and epistemic uncertainties coexist in a structure. In this paper, by combining the probability theory and the uncertainty theory into a chance theory, a probability‐uncertainty hybrid model is established, and a new quantification method based on the uncertain random variables for the structural reliability is presented in order to simultaneously satisfy the duality of random variables and the subadditivity of uncertain variables; then, a reliability index is explored based on the chance expected value and variance. Besides, the formulas of the chance theory‐based reliability and reliability index are derived to uniformly assess the reliability of structures under the hybrid aleatory and epistemic uncertainties. The numerical experiments illustrate the validity of the proposed method, and the results of the proposed method can provide a more accurate assessment of the structural system under the mixed uncertainties than the ones obtained separately from the probability theory and the uncertainty theory.  相似文献   

10.
The risk assessment community has begun to make a clear distinction between aleatory and epistemic uncertainty in theory and in practice. Aleatory uncertainty is also referred to in the literature as variability, irreducible uncertainty, inherent uncertainty, and stochastic uncertainty. Epistemic uncertainty is also termed reducible uncertainty, subjective uncertainty, and state-of-knowledge uncertainty. Methods to efficiently represent, aggregate, and propagate different types of uncertainty through computational models are clearly of vital importance. The most widely known and developed methods are available within the mathematics of probability theory, whether frequentist or subjectivist. Newer mathematical approaches, which extend or otherwise depart from probability theory, are also available, and are sometimes referred to as generalized information theory (GIT). For example, possibility theory, fuzzy set theory, and evidence theory are three components of GIT. To try to develop a better understanding of the relative advantages and disadvantages of traditional and newer methods and encourage a dialog between the risk assessment, reliability engineering, and GIT communities, a workshop was held. To focus discussion and debate at the workshop, a set of prototype problems, generally referred to as challenge problems, was constructed. The challenge problems concentrate on the representation, aggregation, and propagation of epistemic uncertainty and mixtures of epistemic and aleatory uncertainty through two simple model systems. This paper describes the challenge problems and gives numerical values for the different input parameters so that results from different investigators can be directly compared.  相似文献   

11.
Performance assessment of complex systems is ideally done through full system-level testing, which is seldom available for high-consequence systems. Further, a reality of engineering practice is that some features of system behavior are known not from experimental data but only from expert assessment. On the other hand, data on individual components, which are part of the full system, are more readily available. The lack of system-level data and the complexity of the system lead to a need to build computational models of a system in a hierarchical or building-block approach (from simple components to the full system). The models are then used for performance prediction in lieu of experiments, to estimate the confidence in the performance of these systems. Central to this is the need to quantify the uncertainties present in the system and to compare the system response to an expected performance measure. This is the basic idea behind Quantification of Margins and Uncertainties (QMU). QMU is applied in decision making: there are many uncertainties caused by inherent variability (aleatoric) in materials, configurations, environments, etc., and lack of information (epistemic) in models for deterministic and random variables that influence system behavior and performance. This paper proposes a methodology to quantify margins and uncertainty in the presence of both aleatoric and epistemic uncertainty. It presents a framework based on Bayes networks to use available data at multiple levels of complexity (i.e., components, subsystems, etc.) and demonstrates a method to incorporate epistemic uncertainty given in terms of intervals on a model parameter.

12.
There are difficulties with probability as a representation of uncertainty. However, we argue that there is an important distinction between principle and practice. In principle, probability is uniquely appropriate for the representation and quantification of all forms of uncertainty; it is in this sense that we claim that ‘probability is perfect’. In practice, people find it difficult to express their knowledge and beliefs in probabilistic form, so that elicitation of probability distributions is a far from perfect process. We therefore argue that there is no need for alternative theories, but that any practical elicitation of expert knowledge must fully acknowledge imprecision in the resulting distribution. We outline a recently developed Bayesian technique that allows the imprecision in elicitation to be formulated explicitly, and apply it to some of the challenge problems.

13.
In reliability modelling it is conventional to build sophisticated models of the probabilistic behaviour of the component lifetimes in a system in order to deduce information about the probabilistic behaviour of the system lifetime. Decision modelling of the reliability programme requires a priori, therefore, an even more sophisticated set of models in order to capture the evidence the decision maker believes may be obtained from different types of data acquisition. Bayes linear analysis is a methodology that uses expectation rather than probability as the fundamental expression of uncertainty. By working only with expected values, a simpler level of modelling is needed as compared to full probability models. In this paper we shall consider the Bayes linear approach to the estimation of the mean time to failure (MTTF) of a component. The model built will take account of the variance in our estimate of the MTTF, based on a variety of sources of information.
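The core of the Bayes linear adjustment can be written down compactly; the sketch below applies the standard adjusted-expectation and adjusted-variance formulas to a hypothetical MTTF with two sources of evidence, with all prior judgements (means, variances, covariances) invented for illustration:

```python
import numpy as np

# Bayes linear adjustment of a prior expectation for the MTTF of a component,
# using two hypothetical sources of evidence D = (field MTTF estimate, test MTTF estimate).
prior_mean = 1000.0            # E(X), prior expected MTTF in hours
prior_var = 200.0 ** 2         # Var(X)

E_D = np.array([1000.0, 1000.0])              # prior expectations of the data sources
Var_D = np.array([[150.0**2, 50.0**2],        # judged variances/covariance of the data
                  [50.0**2, 250.0**2]])
Cov_XD = np.array([120.0**2, 100.0**2])       # judged covariances between X and each source

d = np.array([930.0, 1100.0])                 # observed values of the two sources

# Bayes linear adjusted expectation and variance:
#   E_D(X)   = E(X) + Cov(X, D) Var(D)^{-1} (d - E(D))
#   Var_D(X) = Var(X) - Cov(X, D) Var(D)^{-1} Cov(D, X)
gain = Cov_XD @ np.linalg.inv(Var_D)
adjusted_mean = prior_mean + gain @ (d - E_D)
adjusted_var = prior_var - gain @ Cov_XD

print(f"adjusted MTTF expectation: {adjusted_mean:.1f} h")
print(f"adjusted standard deviation: {np.sqrt(adjusted_var):.1f} h")
```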

14.
For single‐use non‐repairable systems, reliability is commonly estimated as a function of age and usage. For the effective management of individual systems or populations of systems, it is frequently important and necessary to predict the reliability in the future for age and usage values not yet observed. When predicting future system reliability, the age of the future system is easily predicted whereas future usage values will typically be unknown. In this paper we present the methodology for how to estimate both individual and population reliability summaries based on the currently known age and usage values. Projected usage values for future points in time can be obtained based on observed usage patterns or user‐specified patterns of usage rates. Individual system summaries can be used to answer the questions ‘For a given system of age A and usage U, what is its reliability with associated uncertainty?’ or ‘For a given system with known current age A and usage U, but unknown usage in the future, what is its reliability with associated uncertainty?’ The population summary of interest predicts the probability that a system randomly selected from the population of systems works. This summary takes into consideration the estimation of future usage, the estimated probability of individual systems working at their given ages and usage values, and the life cycle demographics of the population of interest. In this paper we discuss these questions for a given application. Published in 2010 by John Wiley & Sons, Ltd.  相似文献   
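A toy sketch of an individual-system summary of this kind, assuming a logistic reliability model in age and usage and a projected future usage obtained from an assumed usage-rate distribution; the model form, coefficients, and numbers are illustrative, not the paper's:

```python
import numpy as np
from scipy.special import expit

# Assumed logistic reliability model: P(works | age a in years, usage u in cycles).
b0, b_age, b_use = 6.0, -0.15, -0.002     # illustrative coefficients

def reliability(age, usage):
    return expit(b0 + b_age * age + b_use * usage)

# Individual system: known current age and usage, unknown future usage.
age_now, usage_now = 8.0, 1200.0
print("current reliability:", reliability(age_now, usage_now))

# Project usage 5 years ahead from an uncertain usage rate (cycles per year).
rng = np.random.default_rng(3)
rates = rng.normal(150.0, 30.0, 10000)             # assumed observed usage-rate pattern
future_usage = usage_now + 5.0 * np.clip(rates, 0, None)
future_rel = reliability(age_now + 5.0, future_usage)

print("predicted reliability in 5 years (mean):", future_rel.mean())
print("90% interval from usage uncertainty:", np.percentile(future_rel, [5, 95]))
```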

15.
Quantification of margins and uncertainties (QMU) was originally introduced as a framework for assessing confidence in nuclear weapons, and has since been extended to more general complex systems. We show that when uncertainties are strictly bounded, QMU is equivalent to a graphical model, provided confidence is identified with reliability one. In the more realistic case that uncertainties have long tails, we find that QMU confidence is not always a good proxy for reliability, as computed from the graphical model. We explore the possibility of defining QMU in terms of the graphical model, rather than through the original procedures. The new formalism, which we call probabilistic QMU, or pQMU, is fully probabilistic and mathematically consistent, and shows how QMU may be interpreted within the framework of system reliability theory.  相似文献   

16.
Uncertainty reasoning in a computer-aided expert system for failure analysis
The methods and characteristics of various approaches to uncertainty reasoning are analyzed and, in combination with the knowledge representation used in the failure analysis domain, an uncertainty reasoning method for a computer-aided failure analysis expert system is proposed: when fundamental failure knowledge and logical inference are used to support failure analysis, uncertainty reasoning based on classical probability and weighted certainty factors is appropriate; when case knowledge is used for analogical reasoning, frame-based uncertainty reasoning is preferable. The concrete implementation of the uncertainty reasoning process is presented.
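A brief sketch of the weighted certainty-factor reasoning referred to above, using the MYCIN-style combination rule with each certainty factor discounted by an evidence weight; the hypothesis, evidence items, weights, and the exact weighting scheme are illustrative assumptions:

```python
# Weighted certainty-factor reasoning of the kind suggested by the abstract:
# each piece of failure-analysis evidence supports a failure-cause hypothesis with a
# certainty factor (CF), discounted by a weight expressing how reliable that evidence is.
evidence = [  # (description, CF in [-1, 1], weight in [0, 1]) -- illustrative values
    ("beach marks on fracture surface", 0.8, 0.9),
    ("cyclic service loading recorded", 0.6, 0.7),
    ("no corrosion products detected", 0.4, 0.5),
]

def combine_cf(cf1, cf2):
    """MYCIN-style combination rule for two certainty factors."""
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 <= 0 and cf2 <= 0:
        return cf1 + cf2 * (1 + cf1)
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

cf_total = 0.0
for name, cf, weight in evidence:
    cf_total = combine_cf(cf_total, cf * weight)   # discount each CF by its weight
    print(f"after '{name}': CF = {cf_total:.3f}")

print(f"combined certainty that the failure mode is fatigue: {cf_total:.3f}")
```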

17.
When the parameters required to model a rock mass are known, the successive step is the calculation of the rock mass response based on these values of the parameters. If the latter are not deterministic, their uncertainty must be extended to the predicted behavior of the rock mass. In this paper, Random Set Theory is used to address two basic questions: (a) is it possible to conduct a reliable reliability analysis of a complex system such as a rock mass when a complex numerical model must be used? (b) is it possible to conduct a reliable reliability analysis that takes into account the whole amount of uncertainty experienced in data collection (i.e. both randomness and imprecision)?

It is shown that, if data are only affected by randomness, the proposed procedures allow the results of a Monte Carlo simulation to be efficiently bracketed, drastically reducing the number of calculations required. This allows reliability analyses to be performed even when complex, non-linear numerical methods are adopted.

If not only randomness but also imprecision affects input data, upper and lower bounds on the probability of predicted rock mass response are calculated with ease. The importance of imprecision (usually disregarded) turns out to be decisive in the prediction of the behavior of the rock mass.

Applications are presented with reference to slope stability, the convergence-confinement method and the Distinct Element Method.  相似文献   
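A compact sketch of random-set propagation of the kind described above, with a toy factor-of-safety model standing in for the complex numerical models used in the paper; each focal element (an interval box with a probability mass) is propagated by bounding the response over the box, giving lower and upper probabilities of failure. All input values are invented:

```python
import itertools
import numpy as np

# Toy stand-in for the rock-mass model: factor of safety from friction angle (deg)
# and a load factor; the paper uses complex numerical models instead.
def factor_of_safety(phi_deg, load):
    return np.tan(np.radians(phi_deg)) / load

# Random set on the inputs: focal elements are interval boxes with probability masses
# (capturing both randomness and imprecision in the site data) -- illustrative values.
focal_elements = [  # ((phi interval, load interval), mass)
    (((30.0, 36.0), (0.45, 0.55)), 0.5),
    (((27.0, 33.0), (0.50, 0.65)), 0.3),
    (((24.0, 30.0), (0.62, 0.75)), 0.2),
]

def image_bounds(box):
    """Min and max of the response over one focal box (vertex search is exact
    here because the toy model is monotonic in each input)."""
    values = [factor_of_safety(phi, q) for phi, q in itertools.product(*box)]
    return min(values), max(values)

# Lower/upper probability of the failure event {FS < 1}.
p_lower = sum(m for box, m in focal_elements if image_bounds(box)[1] < 1.0)
p_upper = sum(m for box, m in focal_elements if image_bounds(box)[0] < 1.0)
print(f"P(failure) bounded by [{p_lower:.2f}, {p_upper:.2f}]")
```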


18.
This paper develops a methodology to assess the validity of computational models when some quantities may be affected by epistemic uncertainty. Three types of epistemic uncertainty regarding input random variables - interval data, sparse point data, and probability distributions with parameter uncertainty - are considered. When the model inputs are described using sparse point data and/or interval data, a likelihood-based methodology is used to represent these variables as probability distributions. Two approaches - a parametric approach and a non-parametric approach - are pursued for this purpose. While the parametric approach leads to a family of distributions due to distribution parameter uncertainty, the principles of conditional probability and total probability can be used to integrate the family of distributions into a single distribution. The non-parametric approach directly yields a single probability distribution. The probabilistic model predictions are compared against experimental observations, which may again be point data or interval data. A generalized likelihood function is constructed for Bayesian updating, and the posterior distribution of the model output is estimated. The Bayes factor metric is extended to assess the validity of the model under both aleatory and epistemic uncertainty and to estimate the confidence in the model prediction. The proposed method is illustrated using a numerical example.  相似文献   
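The likelihood construction for mixed point and interval data in the parametric approach can be sketched as below, assuming a normal distribution for the input variable and maximizing the combined likelihood; the data are hypothetical, and the Bayesian updating and Bayes factor steps of the paper are not shown:

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

# Hypothetical epistemic data on a model input: a few point observations plus intervals.
points = np.array([10.2, 9.6, 10.9])
intervals = np.array([[9.0, 11.0], [10.0, 12.5], [8.5, 10.5]])

def neg_log_likelihood(theta):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    # Point data contribute densities; interval data contribute F(b) - F(a).
    ll = stats.norm.logpdf(points, mu, sigma).sum()
    probs = (stats.norm.cdf(intervals[:, 1], mu, sigma)
             - stats.norm.cdf(intervals[:, 0], mu, sigma))
    ll += np.log(np.clip(probs, 1e-300, None)).sum()
    return -ll

res = minimize(neg_log_likelihood, x0=[10.0, 0.0])
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"fitted input distribution: Normal(mu={mu_hat:.2f}, sigma={sigma_hat:.2f})")
```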

19.
In this article, the authors present a general methodology for age-dependent reliability analysis of degrading or ageing components, structures and systems. The methodology is based on Bayesian methods and inference, in particular their ability to incorporate prior information, and on the idea that ageing can be thought of as an age-dependent change of beliefs about reliability parameters (mainly the failure rate): beliefs change not only because new failure data or other information become available with time but also continuously, owing to the passage of time itself and the evolution of beliefs. The main objective of this article is to show clearly how practitioners can apply Bayesian methods to risk and reliability analysis in the presence of ageing phenomena. The methodology describes step-by-step failure rate analysis of ageing components, from Bayesian model building to its verification and generalization with Bayesian model averaging, which, as the authors suggest, could serve as an alternative to various goodness-of-fit assessment tools and as a universal tool for coping with various sources of uncertainty. The proposed methodology is able to deal with sparse and rare failure events, as is the case in electrical components, piping systems and various other systems with high reliability. In a case study of electrical instrumentation and control components, the proposed methodology was applied to analyse age-dependent failure rates together with the treatment of uncertainty due to age-dependent model selection. Copyright © 2013 John Wiley & Sons, Ltd.

20.
High temperature design methods rely on constitutive models for inelastic deformation and failure typically calibrated against the mean of experimental data without considering the associated scatter. Variability may arise from the experimental data acquisition process, from heat-to-heat material property variations, or both and need to be accurately captured to predict parameter bounds leading to efficient component design. Applying the Bayesian Markov Chain Monte Carlo (MCMC) method to produce statistical models capturing the underlying uncertainty in the experimental data is an area of ongoing research interest. This work varies aspects of the Bayesian MCMC method and explores their effect on the posterior parameter distributions for a uniaxial elasto-viscoplastic damage model using synthetically generated reference data. From our analysis with the uniaxial inelastic model we determine that an informed prior distribution including different types of test conditions results in more accurate posterior parameter distributions. The parameter posterior distributions, however, do not improve when increasing the number of similar experimental data. Additionally, changing the amount of scatter in the data affects the quality of the posterior distributions, especially for the less sensitive model parameters. Moreover, we perform a sensitivity study of the model parameters against the likelihood function prior to the Bayesian analysis. The results of the sensitivity analysis help to determine the reliability of the posterior distributions and reduce the dimensionality of the problem by fixing the insensitive parameters. The comprehensive study described in this work demonstrates how to efficiently apply the Bayesian MCMC methodology to capture parameter uncertainties in high temperature inelastic material models. Quantifying these uncertainties in inelastic models will improve high temperature engineering design practices and lead to safer, more effective component designs.  相似文献   
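A minimal random-walk Metropolis-Hastings sketch of the Bayesian MCMC calibration idea, applied to a toy power-law creep model with synthetic data rather than the uniaxial elasto-viscoplastic damage model of the paper; the priors, proposal widths, and data are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy stand-in for the material model: steady-state creep rate = A * stress**n.
stress = np.array([60.0, 80.0, 100.0, 120.0])
true_A, true_n, noise = 1e-9, 4.0, 0.15
data = np.log(true_A * stress**true_n) + rng.normal(0, noise, stress.size)  # synthetic reference data

def log_posterior(theta):
    logA, n = theta
    if not (-25 < logA < -10 and 1 < n < 8):      # flat prior over a plausible box
        return -np.inf
    pred = logA + n * np.log(stress)              # model prediction in log space
    return -0.5 * np.sum((data - pred) ** 2) / noise**2

# Random-walk Metropolis-Hastings sampling of the parameter posterior.
theta = np.array([-20.0, 4.0])
lp = log_posterior(theta)
samples = []
for _ in range(20000):
    proposal = theta + rng.normal(0, [0.3, 0.08])
    lp_new = log_posterior(proposal)
    if np.log(rng.random()) < lp_new - lp:        # accept/reject step
        theta, lp = proposal, lp_new
    samples.append(theta)

samples = np.array(samples[5000:])                # discard burn-in
print("posterior mean (log A, n):", samples.mean(axis=0))
print("posterior std  (log A, n):", samples.std(axis=0))
```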
