Similar Documents
20 similar documents retrieved.
1.
This paper deals with the use of Bayesian networks to compute system reliability. The reliability analysis problem is described and the usual methods for quantitative reliability analysis are presented within a case study. Some drawbacks that justify the use of Bayesian networks are identified. The basic concepts of applying Bayesian networks to reliability analysis are introduced, and a model to compute the reliability of the case study system is presented. Dempster-Shafer theory for treating epistemic uncertainty in reliability analysis is then discussed, and the basic concepts that can be applied through the Bayesian network inference algorithm are introduced. Finally, a numerical example shows how Bayesian network inference algorithms compute complex system reliability and what Dempster-Shafer theory can contribute to reliability analysis.
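As a minimal illustration of this kind of computation (not the paper's case study), the sketch below evaluates a hypothetical two-component series system whose failure probabilities depend on a shared environmental factor, by enumerating the joint states of a small Bayesian network; all node names and probabilities are invented.

```python
from itertools import product

# Hypothetical two-component series system with a shared environmental factor E.
# All structure and numbers are illustrative, not taken from the paper.
p_E = {1: 0.1, 0: 0.9}            # P(harsh environment)
p_A_fail = {1: 0.20, 0: 0.02}     # P(component A fails | E)
p_B_fail = {1: 0.15, 0: 0.01}     # P(component B fails | E)

p_system_works = 0.0
for e, a_fail, b_fail in product([0, 1], repeat=3):
    # Joint probability of one elementary configuration of the network.
    p = p_E[e]
    p *= p_A_fail[e] if a_fail else 1.0 - p_A_fail[e]
    p *= p_B_fail[e] if b_fail else 1.0 - p_B_fail[e]
    # Series system: it works only if neither component has failed.
    if not a_fail and not b_fail:
        p_system_works += p

print(f"System reliability: {p_system_works:.4f}")
```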

2.
Bayesian uncertainty analysis with applications to turbulence modeling
In this paper, we apply Bayesian uncertainty quantification techniques to the processes of calibrating complex mathematical models and predicting quantities of interest (QoIs) with such models. These techniques also enable the systematic comparison of competing model classes. The processes of calibration and comparison constitute the building blocks of a larger validation process, the goal of which is to accept or reject a given mathematical model for the prediction of a particular QoI for a particular scenario. In this work, we take the first step in this process by applying the methodology to the analysis of the Spalart-Allmaras turbulence model in the context of incompressible, boundary layer flows. Three competing model classes based on the Spalart-Allmaras model are formulated, calibrated against experimental data, and used to issue a prediction with quantified uncertainty. The model classes are compared in terms of their posterior probabilities and their predictions of QoIs. The model posterior probability represents the relative plausibility of a model class given the data; thus, it incorporates the model's ability to fit experimental observations. Alternatively, comparing models using the predicted QoI connects the process to the needs of decision makers who use the results of the model. We show that by using both the model plausibility and the predicted QoI, one has the opportunity to reject some model classes after calibration, before subjecting the remaining classes to additional validation challenges.
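To make the model-class comparison concrete, here is a small hedged sketch (not from the paper): given hypothetical log-evidences for three calibrated model classes and equal prior plausibilities, it computes the posterior plausibility of each class.

```python
import numpy as np

# Hypothetical log-evidences (log marginal likelihoods) for three competing
# model classes, as might come out of a Bayesian calibration; the numbers
# are invented for illustration only.
log_evidence = np.array([-102.3, -100.8, -104.1])
prior = np.array([1/3, 1/3, 1/3])            # equal prior plausibility

# Posterior plausibility of each model class (normalized in a stable way).
log_post = np.log(prior) + log_evidence
log_post -= log_post.max()
posterior = np.exp(log_post) / np.exp(log_post).sum()

for i, p in enumerate(posterior, start=1):
    print(f"Model class M{i}: posterior plausibility {p:.3f}")
```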

3.
Over the last two decades, uncertainty quantification (UQ) in engineering systems has been performed within the popular framework of probability theory. However, many scientific and engineering communities realize that there are limitations in using only one framework for quantifying the uncertainty experienced in engineering applications. Recently, evidence theory, also called Dempster–Shafer theory, was proposed to handle limited and imprecise data situations as an alternative to classical probability theory. Adaptation of this theory for large-scale engineering structures is a challenge due to the implicit nature of simulations and excessive computational costs. In this work, an approximation approach is developed to improve the practical utility of evidence theory in UQ analysis. The techniques are demonstrated on composite material structures and an airframe wing aeroelastic design problem.

4.
Advances in computational performance have led to the development of large-scale simulation tools for design. Systems generated using such simulation tools can fail in service if the uncertainty of the simulation tool's performance predictions is not accounted for. In this research, an investigation is undertaken of how uncertainty can be quantified in multidisciplinary systems analysis subject to epistemic uncertainty associated with the disciplinary design tools and input parameters. Evidence theory is used to quantify uncertainty in terms of the uncertain measures of belief and plausibility. To illustrate the methodology, multidisciplinary analysis problems are introduced as an extension to the epistemic uncertainty challenge problems identified by Sandia National Laboratories. After uncertainty has been characterized mathematically, the designer seeks the optimum design under uncertainty. The measures of uncertainty provided by evidence theory are discontinuous functions. Such non-smooth functions cannot be used in traditional gradient-based optimizers because the sensitivities of the uncertain measures are not properly defined. In this research, surrogate models are used to represent the uncertain measures as continuous functions. A sequential approximate optimization approach is used to drive the optimization process. The methodology is illustrated in application to multidisciplinary example problems.
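The belief and plausibility measures mentioned here can be illustrated with a small sketch (assumed structure and numbers, not the Sandia challenge problems): a single input described by a Dempster-Shafer structure of intervals is pushed through a monotone response function, and the belief and plausibility that the response exceeds a limit are accumulated.

```python
# Minimal sketch (assumed numbers): belief and plausibility that a response
# exceeds a limit, for one input described by a Dempster-Shafer structure of
# intervals with basic probability assignments.
bpa = [((1.0, 2.0), 0.5), ((1.5, 3.0), 0.3), ((2.5, 4.0), 0.2)]
limit = 4.5

def response(x):
    return 2.0 * x              # monotone placeholder response model

belief, plausibility = 0.0, 0.0
for (lo, hi), m in bpa:
    y_lo, y_hi = response(lo), response(hi)   # monotone, so endpoints bound the image
    if y_lo > limit:            # the whole focal element exceeds the limit
        belief += m
    if y_hi > limit:            # at least part of the focal element may exceed it
        plausibility += m

print(f"Bel(response > {limit}) = {belief:.2f}")
print(f"Pl(response > {limit})  = {plausibility:.2f}")
```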

5.
A probabilistic approach for representation of interval uncertainty
In this paper, we propose a probabilistic approach to represent interval data for input variables in reliability and uncertainty analysis problems, using flexible families of continuous Johnson distributions. Such a probabilistic representation of interval data facilitates a unified framework for handling aleatory and epistemic uncertainty. For fitting probability distributions, methods such as moment matching are commonly used in the literature. However, unlike point data, where single estimates for the moments of the data can be calculated, moments of interval data can only be computed in terms of upper and lower bounds. Finding bounds on the moments of interval data has generally been considered an NP-hard problem because it involves a search among combinations of multiple values of the variables, including interval endpoints. In this paper, we present efficient algorithms based on continuous optimization to find the bounds on second and higher moments of interval data. With numerical examples, we show that the proposed bounding algorithms scale polynomially with the number of intervals. Using the bounds on moments computed with the proposed approach, we fit a family of Johnson distributions to interval data. Furthermore, using an optimization approach based on percentiles, we find the bounding envelopes of the family of distributions, termed a Johnson p-box. The idea of bounding envelopes for the family of Johnson distributions is analogous to the notion of an empirical p-box in the literature. Several sets of interval data with different numbers of intervals and types of overlap are presented to demonstrate the proposed methods. In contrast to the computationally expensive nested analysis that is typically required in the presence of interval variables, the proposed probabilistic representation enables inexpensive optimization-based strategies to estimate bounds on an output quantity of interest.
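A simplified sketch of the moment-bounding idea (illustrative data; the paper's algorithms are more general and scalable): the mean bounds follow directly from the interval endpoints, the lower variance bound comes from bound-constrained continuous optimization, and the upper variance bound from endpoint enumeration, since the sample variance is convex in the data vector.

```python
import numpy as np
from itertools import product
from scipy.optimize import minimize

# Illustrative interval data (not from the paper): each observation is only
# known to lie in [lo, hi].
intervals = np.array([[2.0, 3.5], [2.8, 4.0], [1.5, 2.5], [3.0, 5.0]])
lo, hi = intervals[:, 0], intervals[:, 1]

# Bounds on the mean are simply the means of the endpoints.
mean_lo, mean_hi = lo.mean(), hi.mean()

# Lower bound on the variance: the sample variance is convex in the data
# vector, so a bound-constrained local optimizer finds the global minimum.
var = lambda x: np.var(x, ddof=1)
res = minimize(var, x0=intervals.mean(axis=1), bounds=list(map(tuple, intervals)))
var_lo = res.fun

# Upper bound on the variance: the maximum of a convex function over a box is
# attained at a vertex, so enumerate endpoint combinations (fine for small n;
# the paper develops more scalable algorithms).
var_hi = max(var(np.where(mask, hi, lo)) for mask in product([0, 1], repeat=len(lo)))

print(f"mean in [{mean_lo:.3f}, {mean_hi:.3f}]")
print(f"variance in [{var_lo:.3f}, {var_hi:.3f}]")
```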

6.
This paper develops a methodology to assess the reliability computation model validity using the concept of Bayesian hypothesis testing, by comparing the model prediction and experimental observation, when there is only one computational model available to evaluate system behavior. Time-independent and time-dependent problems are investigated, with consideration of both cases: with and without statistical uncertainty in the model. The case of time-independent failure probability prediction with no statistical uncertainty is a straightforward application of Bayesian hypothesis testing. However, for the life prediction (time-dependent reliability) problem, a new methodology is developed in this paper to make the same Bayesian hypothesis testing concept applicable. With the existence of statistical uncertainty in the model, in addition to the application of a predictor estimator of the Bayes factor, the uncertainty in the Bayes factor is explicitly quantified through treating it as a random variable and calculating the probability that it exceeds a specified value. The developed method provides a rational criterion to decision-makers for the acceptance or rejection of the computational model.
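For the time-independent case without statistical uncertainty, the Bayes-factor computation can be sketched as follows (hypothetical data, prediction, prior, and alternative hypothesis; not the paper's example): the likelihood of the observations under "model prediction is correct" is compared with the marginal likelihood under a diffuse alternative.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Hypothetical validation data and model prediction (illustrative numbers only).
y = np.array([9.6, 10.4, 10.1, 9.9, 10.7])   # experimental observations
mu_model = 10.0                               # model-predicted response
sigma = 0.5                                   # assumed measurement std. dev.

# H0: data are centred on the model prediction.
lik_H0 = np.prod(stats.norm.pdf(y, loc=mu_model, scale=sigma))

# H1: the true mean is unknown, with a diffuse normal prior around the model
# prediction; marginalize the likelihood over that prior.
prior = stats.norm(loc=mu_model, scale=2.0)
integrand = lambda mu: np.prod(stats.norm.pdf(y, loc=mu, scale=sigma)) * prior.pdf(mu)
lik_H1, _ = quad(integrand, mu_model - 10, mu_model + 10)

bayes_factor = lik_H0 / lik_H1
print(f"Bayes factor B = {bayes_factor:.2f} "
      f"({'supports' if bayes_factor > 1 else 'does not support'} the model)")
```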

7.
The challenge problems for the Epistemic Uncertainty Workshop at Sandia National Laboratories provide common ground for comparing different mathematical theories of uncertainty, referred to as General Information Theories (GITs). These problems also present the opportunity to discuss the use of expert knowledge as an important constituent of uncertainty quantification. More specifically, how do the principles and methods of eliciting and analyzing expert knowledge apply to these problems and similar ones encountered in complex technical problem solving and decision making? We will address this question, demonstrating how the elicitation issues and the knowledge that experts provide can be used to assess the uncertainty in outputs that emerge from a black box model or computational code represented by the challenge problems. In our experience, the rich collection of GITs provides an opportunity to capture the experts' knowledge and associated uncertainties consistent with their thinking, problem solving, and problem representation. The elicitation process is rightly treated as part of an overall analytical approach, and the information elicited is not simply a source of data. In this paper, we detail how the elicitation process itself impacts the analyst's ability to represent, aggregate, and propagate uncertainty, as well as how to interpret uncertainties in outputs. While this approach does not advocate a specific GIT, answers under uncertainty do result from the elicitation.

8.
To support personalization and intelligence in telecommunication services, the representation and reasoning of service context information are studied. An ontology modeling method is proposed that uniformly represents the syntactic structure and semantics of service context information as well as contextual meta-information (such as time and confidence). Using Bayesian network theory, a method is then proposed for constructing a service context cognition model that supports reasoning under uncertainty, and the soundness of the model and its conclusions is verified through simulation experiments.

9.
Computational simulation methods have advanced to a point where simulation can contribute substantially in many areas of systems analysis. One research challenge that has accompanied this transition involves the characterization of uncertainty in both computer model inputs and the resulting system response. This article addresses a subset of the ‘challenge problems’ posed in [Challenge problems: uncertainty in system response given uncertain parameters, 2001] where uncertainty or information is specified over intervals of the input parameters and inferences based on the response are required. The emphasis of the article is to describe and illustrate a method for performing tasks associated with this type of modeling ‘economically’, requiring relatively few evaluations of the system to get a precise estimate of the response. This ‘response-modeling approach’ is used to approximate a probability distribution for the system response. The distribution is then used: (1) to make inferences concerning probabilities associated with response intervals and (2) to guide the selection of further, informative system evaluations to perform.

10.
The Epistemic Uncertainty Project of Sandia National Laboratories (NM, USA) proposed two challenge problems intended to assess the applicability and the relevant merits of modern mathematical theories of uncertainty in reliability engineering and risk analysis. This paper proposes a solution to Problem B: the response of a mechanical system with uncertain parameters. Random Set Theory is used to cope with both imprecision and dissonance affecting the available information. Imprecision results in an envelope of CDFs of the system response bounded by an upper CDF and a lower CDF. Different types of parameter discretizations are introduced. It is shown that: (i) when the system response presents extrema in the range of parameters considered, it is better to increase the fineness of the discretization than to invoke a global optimization tool; (ii) the response expectation differed by less than 0.5% when the number of function calls was increased 15.7 times; (iii) larger differences (4–5%) were obtained for the lower tails of the CDFs of the response. Further research is necessary to investigate (i) parameter discretizations aimed at increasing the accuracy of the CDFs' (lower) tails; (ii) the role of correlation in combining information.
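A toy version of the random-set propagation described here (assumed focal elements and response function, unrelated to Problem B): each focal interval is mapped through the response, and the resulting lower and upper CDFs of the output are tabulated at a grid of thresholds.

```python
import numpy as np

# Minimal sketch of random-set propagation: one uncertain parameter described
# by focal intervals with probability masses (all numbers illustrative).
focal = [((0.0, 1.0), 0.4), ((0.5, 2.0), 0.4), ((1.5, 3.0), 0.2)]
g = lambda x: x**2 + 1.0          # monotone (for x >= 0) placeholder response

# Image of each focal element under the response; monotonicity lets us use the
# interval endpoints (otherwise an optimization over each focal element would
# be needed, as the paper discusses).
images = [((g(lo), g(hi)), m) for (lo, hi), m in focal]

thresholds = np.linspace(0.0, 11.0, 12)
lower_cdf = [sum(m for (ylo, yhi), m in images if yhi <= t) for t in thresholds]  # belief
upper_cdf = [sum(m for (ylo, yhi), m in images if ylo <= t) for t in thresholds]  # plausibility

for t, lo_c, up_c in zip(thresholds, lower_cdf, upper_cdf):
    print(f"t = {t:5.2f}   lower CDF = {lo_c:.2f}   upper CDF = {up_c:.2f}")
```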

11.
A building block approach to model validation may proceed through various levels, such as material to component to subsystem to system, comparing model predictions with experimental observations at each level. Usually, experimental data becomes scarce as one proceeds from lower to higher levels. This paper presents a structural equation modeling approach to make use of the lower-level data for higher-level model validation under uncertainty, integrating several components: lower-level data, higher-level data, computational model, and latent variables. The method proposed in this paper uses latent variables to model two sets of relationships, namely, the computational model to system-level data, and lower-level data to system-level data. A Bayesian network with Markov chain Monte Carlo simulation is applied to represent the two relationships and to estimate the influencing factors between them. Bayesian hypothesis testing is employed to quantify the confidence in the predictive model at the system level, and the role of lower-level data in the model validation assessment at the system level. The proposed methodology is implemented for hierarchical assessment of three validation problems, using discrete observations and time-series data.

12.
An evidence-based approach is developed for optimization of structural components under material parameter uncertainty. The approach is applied to evidence-based design optimization (EBDO) of externally stiffened circular tubes under axial impact load using an isotropic–elastic–plastic plasticity model to simulate dynamic material behaviour. Uncertainty modelling considers the changes in material parameters that are caused by variability in material properties as well as incertitude and errors in the experimental data and procedures used to determine the material parameters. Spatial variation of material parameters across the structural component is modelled using a field joint belief structure and propagated for the calculation of the evidence-based objective function and design constraints. Surrogate models are used in both uncertainty propagation and the solution of the optimization problem. The methodology and the solution to the EBDO example problem are presented and discussed.

13.
    
Three applications of sampling-based sensitivity analysis in conjunction with evidence theory representations for epistemic uncertainty in model inputs are described: (i) an initial exploratory analysis to assess model behavior and provide insights for additional analysis; (ii) a stepwise analysis showing the incremental effects of uncertain variables on complementary cumulative belief functions and complementary cumulative plausibility functions; and (iii) a summary analysis showing a spectrum of variance-based sensitivity analysis results that derive from probability spaces that are consistent with the evidence space under consideration.

14.
Bayesian risk-based decision method for model validation under uncertainty
This paper develops a decision-making methodology for computational model validation, considering the risk of using the current model, data support for the current model, and cost of acquiring new information to improve the model. A Bayesian decision theory-based method is developed for this purpose, using a likelihood ratio as the validation metric for model assessment. An expected risk or cost function is defined as a function of the decision costs, and the likelihood and prior of each hypothesis. The risk is minimized through correctly assigning experimental data to two decision regions based on the comparison of the likelihood ratio with a decision threshold. A Bayesian validation metric is derived based on the risk minimization criterion. Two types of validation tests are considered: pass/fail tests and system response value measurement tests. The methodology is illustrated for the validation of reliability prediction models in a tension bar and an engine blade subjected to high cycle fatigue. The proposed method can effectively integrate optimal experimental design into model validation to simultaneously reduce the cost and improve the accuracy of reliability model assessment.
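A hedged sketch of a likelihood-ratio decision rule of this kind (all data, hypotheses, costs, and priors are illustrative assumptions, not the paper's tension-bar or engine-blade applications):

```python
import numpy as np
from scipy import stats

# Illustrative numbers only.
y = np.array([0.98, 1.05, 1.02, 0.96])       # measured / predicted response ratios
sigma = 0.05

# H0: model valid (ratios centred on 1); H1: model biased (centre shifted).
lik_H0 = np.prod(stats.norm.pdf(y, loc=1.00, scale=sigma))
lik_H1 = np.prod(stats.norm.pdf(y, loc=1.10, scale=sigma))
likelihood_ratio = lik_H0 / lik_H1

# Bayes decision threshold from assumed decision costs and prior probabilities:
# accept H0 when LR exceeds (c01 - c11) P(H1) / ((c10 - c00) P(H0)),
# where c_ij is the cost of deciding H_i when H_j is true.
c00, c01, c10, c11 = 0.0, 10.0, 1.0, 0.0
p_H0, p_H1 = 0.5, 0.5
threshold = (c01 - c11) * p_H1 / ((c10 - c00) * p_H0)

decision = "accept" if likelihood_ratio > threshold else "reject"
print(f"LR = {likelihood_ratio:.3g}, threshold = {threshold:.3g} -> {decision} the model")
```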

15.
Type A evaluation of measurement uncertainty based on Bayesian theory
This article introduces Bayesian theory and applies it to the Type A evaluation of measurement uncertainty. Compared with Type A evaluation based on classical statistics, the Bayesian method makes full use of the information provided by historical measurement data, so more information is available during the evaluation and the result is more reasonable. Finally, a worked example demonstrates the soundness and advantages of Type A uncertainty evaluation based on Bayesian theory.
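A minimal sketch of the idea, assuming a known repeatability standard deviation and a conjugate normal update (illustrative data, not from the article): the historical measurements supply the prior, the current measurements update it, and the posterior standard deviation of the mean plays the role of the Type A standard uncertainty.

```python
import numpy as np

# Illustrative measurements of the same quantity; the Bayesian update combines
# historical and current data, treating the repeatability as known.
historical = np.array([10.02, 10.05, 9.98, 10.01, 10.03, 9.99])
current    = np.array([10.04, 10.00, 10.06])
sigma = 0.03                                  # assumed known repeatability

# Prior from the historical data: N(mean, sigma^2 / n_hist).
mu0, tau0_sq = historical.mean(), sigma**2 / len(historical)

# Posterior after the current measurements (standard conjugate-normal result).
n, ybar = len(current), current.mean()
tau_sq = 1.0 / (1.0 / tau0_sq + n / sigma**2)
mu_post = tau_sq * (mu0 / tau0_sq + n * ybar / sigma**2)

print(f"Classical Type A result (current data only): "
      f"{ybar:.4f} +/- {current.std(ddof=1) / np.sqrt(n):.4f}")
print(f"Bayesian result using historical data:       "
      f"{mu_post:.4f} +/- {np.sqrt(tau_sq):.4f}")
```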

16.
Uncertainty quantification and risk assessment in the optimal design of structural systems have always been critical considerations for engineers. When new technologies are developed or implemented and budgets for full-scale testing are limited, the result is insufficient datasets for the construction of probability distributions. Making assumptions about these probability distributions can introduce more uncertainty into the system than it quantifies. Evidence theory offers a way to handle epistemic uncertainty, which reflects a lack of knowledge or information, in the numerical optimization process. It is therefore a natural tool for uncertainty quantification and risk assessment, especially in the design optimization cycle for future aerospace structures where new technologies are being applied. For evidence theory to be recognized as a useful tool, it must be applied efficiently in a robust design optimization scheme. This article demonstrates a new method for projecting the reliability gradient, based on the measures of belief and plausibility, without gathering any information beyond what is required to determine these measures. This represents a large saving in computational time over other methods available in the current literature. The technique developed in this article is demonstrated with three optimization examples.

17.
Ranking a group of candidate sites and selecting from it the high-risk locations or hotspots for detailed engineering study and countermeasure evaluation is the first step in a transport safety improvement program. Past studies, however, have mainly focused on applying appropriate methods for ranking locations, with few addressing how to define selection methods or threshold rules for hotspot identification. The primary goal of this paper is to introduce a multiple-testing-based approach to the problem of selecting hotspots. Following recent developments in the literature, two testing procedures are studied under a Bayesian framework: a Bayesian test with weights (BTW) and a Bayesian test controlling for the posterior false discovery rate (FDR) or false negative rate (FNR). The hypothesis tests are implemented on the basis of two random-effect, or Bayesian, models, namely, the hierarchical Poisson/Gamma or Negative Binomial model and the hierarchical Poisson/Lognormal model. A dataset of highway–railway grade crossings is used as an application example to illustrate the proposed procedures, incorporating both the posterior distribution of accident frequency and the posterior distribution of ranks. Results on the effects of various decision parameters used in hotspot identification procedures are discussed.
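The following sketch illustrates one way a posterior-FDR selection rule can work under a Poisson/Gamma model (the counts, priors, threshold, and the specific selection rule are illustrative assumptions, not the paper's procedure or data):

```python
import numpy as np
from scipy import stats

# Illustrative Poisson/Gamma setting: accident counts at candidate sites over
# the same exposure period.
counts = np.array([1, 4, 0, 7, 2, 9, 3, 5])
alpha_prior, beta_prior = 2.0, 1.0            # Gamma prior on each site's accident rate
hot_rate = 3.0                                # call a site a "hotspot" if its rate exceeds 3
fdr_target = 0.10

# Posterior for each site rate is Gamma(alpha + count, beta + 1); the posterior
# probability of being a hotspot is P(rate > hot_rate).
post = stats.gamma(a=alpha_prior + counts, scale=1.0 / (beta_prior + 1.0))
p_hot = post.sf(hot_rate)

# Select the largest set of top-ranked sites whose average posterior
# probability of NOT being a hotspot stays below the target posterior FDR.
order = np.argsort(-p_hot)
selected = []
for k in range(1, len(counts) + 1):
    if np.mean(1.0 - p_hot[order[:k]]) <= fdr_target:
        selected = order[:k].tolist()

print("Posterior hotspot probabilities:", np.round(p_hot, 3))
print("Sites flagged as hotspots (0-indexed):", selected)
```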

18.
Epistemic uncertainty analysis is an essential feature of any model application subject to ‘state of knowledge’ uncertainties. Such analysis is usually carried out on the basis of a Monte Carlo simulation sampling the epistemic variables and performing the corresponding model runs. In situations, however, where aleatory uncertainties are also present in the model, an adequate treatment of both types of uncertainty would require a two-stage nested Monte Carlo simulation, i.e. sampling the epistemic variables (‘outer loop’) and nested sampling of the aleatory variables (‘inner loop’). It is clear that for complex and long-running codes the computational effort to perform all the resulting model runs may be prohibitive. Therefore, an approximate epistemic uncertainty analysis is suggested that is based solely on two simple Monte Carlo samples: (a) joint sampling of the epistemic and aleatory variables simultaneously, and (b) sampling of the aleatory variables alone with the epistemic variables held fixed at their reference values. The applications of this approach to dynamic reliability analyses presented in this paper look quite promising and suggest that performing such an approximate epistemic uncertainty analysis is preferable to the alternative of not performing any.
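A toy illustration of the suggested two-sample approach (the model, distributions, and reference value are invented): sample (a) draws the epistemic and aleatory variables jointly, sample (b) draws the aleatory variables with the epistemic parameter fixed at its reference value, and comparing the two output spreads indicates the epistemic contribution without a nested simulation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Toy model (illustrative only): response depends on an epistemic parameter e
# and an aleatory variable a.
f = lambda e, a: e * a + 0.5 * a**2

e_ref = 1.0                                   # reference value of the epistemic parameter
e_joint = rng.uniform(0.8, 1.2, n)            # epistemic uncertainty, sampled
a = rng.normal(loc=2.0, scale=0.3, size=n)    # aleatory variability

# Sample (a): epistemic and aleatory variables sampled jointly.
y_joint = f(e_joint, a)
# Sample (b): aleatory variables only, epistemic parameter fixed at its reference value.
y_aleatory = f(e_ref, a)

# The extra spread of the joint sample indicates how much epistemic uncertainty
# adds on top of the aleatory variability, without a nested (two-loop) simulation.
print(f"std (aleatory only):        {y_aleatory.std():.3f}")
print(f"std (epistemic + aleatory): {y_joint.std():.3f}")
```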

19.
The paper describes an approach to representing, aggregating and propagating aleatory and epistemic uncertainty through computational models. The framework for the approach employs the theory of imprecise coherent probabilities. The approach is exemplified by a simple algebraic system, the inputs of which are uncertain. Six different uncertainty situations are considered, including mixtures of epistemic and aleatory uncertainty.

20.
In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, “Quantification of Margins and Uncertainties: Conceptual and Computational Basis,” describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.
