Similar Documents
20 similar documents found (search time: 765 ms)
1.
This paper develops a methodology to assess the validity of computational models when some quantities may be affected by epistemic uncertainty. Three types of epistemic uncertainty regarding input random variables are considered: interval data, sparse point data, and probability distributions with parameter uncertainty. When the model inputs are described using sparse point data and/or interval data, a likelihood-based methodology is used to represent these variables as probability distributions. Two approaches, parametric and non-parametric, are pursued for this purpose. While the parametric approach leads to a family of distributions due to distribution parameter uncertainty, the principles of conditional probability and total probability can be used to integrate the family of distributions into a single distribution. The non-parametric approach directly yields a single probability distribution. The probabilistic model predictions are compared against experimental observations, which may again be point data or interval data. A generalized likelihood function is constructed for Bayesian updating, and the posterior distribution of the model output is estimated. The Bayes factor metric is extended to assess the validity of the model under both aleatory and epistemic uncertainty and to estimate the confidence in the model prediction. The proposed method is illustrated using a numerical example.
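The Bayes factor comparison at the heart of this abstract can be sketched in a few lines. The sketch below is a minimal illustration, not the paper's method: it assumes Gaussian likelihoods with a known standard deviation and compares a candidate model's prediction against a hypothetical alternative; all numbers are made up.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a normal distribution with mean mu and std sigma at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def bayes_factor(observations, mu_model, mu_alt, sigma):
    """Ratio of the data likelihood under the candidate model to the
    likelihood under an alternative; values above 1 favour the candidate."""
    like_model = math.prod(gaussian_pdf(y, mu_model, sigma) for y in observations)
    like_alt = math.prod(gaussian_pdf(y, mu_alt, sigma) for y in observations)
    return like_model / like_alt

# Hypothetical observations clustered near the candidate model's prediction of 10
obs = [9.8, 10.1, 10.3, 9.9]
b = bayes_factor(obs, mu_model=10.0, mu_alt=12.0, sigma=0.5)
```

In the paper's setting the likelihoods would themselves integrate over the epistemic uncertainty in the inputs; the point here is only the mechanics of the ratio.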

2.
The design of a control chart requires the specification of three decision variables, namely the sample size, n, the sampling interval, h, and the action limit beyond which the process must be stopped for potential repair. In this paper, the Bayesian attribute control chart, namely the np chart for short-run production, using a variable sample size is discussed. In a simulated experiment, optimal solutions of the static np chart, the basic Bayesian np chart, and the Bayesian scheme with adaptive sample size are presented. Results of the empirical study show that varying the sample size leads to more cost savings compared with the other two approaches. In order to detect how the input parameters affect the decision variables, a regression analysis is conducted. It shows that the benefits of using the basic Bayesian np chart and the Bayesian chart with adaptive sample size instead of the static scheme are affected by the length of the production run. Copyright © 2008 John Wiley & Sons, Ltd.
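For reference, the classical static np chart that the paper uses as a baseline signals when the count of nonconforming items leaves three-sigma limits. A minimal sketch with illustrative numbers (the in-control fraction p0 and sample size are assumptions, not from the paper):

```python
import math

def np_chart_limits(n, p0):
    """Three-sigma control limits for an np chart monitoring the number of
    nonconforming items in samples of size n, given an in-control
    fraction nonconforming p0. The count is binomial, so its standard
    deviation is sqrt(n * p0 * (1 - p0))."""
    center = n * p0
    spread = 3.0 * math.sqrt(n * p0 * (1.0 - p0))
    lcl = max(0.0, center - spread)  # counts cannot be negative
    ucl = center + spread
    return lcl, center, ucl

lcl, cl, ucl = np_chart_limits(n=100, p0=0.05)
```

The Bayesian and adaptive schemes in the paper replace these fixed limits with decisions updated from accumulating data.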

3.
An uncertainty-based sensitivity index represents the contribution that uncertainty in model input Xi makes to the uncertainty in model output Y. This paper addresses the situation where the uncertainties in the model inputs are expressed as closed convex sets of probability measures, a situation that exists when inputs are expressed as intervals or sets of intervals with no particular distribution specified over the intervals, or as probability distributions with interval-valued parameters. Three different approaches to measuring uncertainty, and hence uncertainty-based sensitivity, are explored. Variance-based sensitivity analysis (VBSA) estimates the contribution that each uncertain input, acting individually or in combination, makes to variance in the model output. The partial expected value of perfect information (partial EVPI) quantifies the (financial) value of learning the true numeric value of an input. For both of these sensitivity indices the generalization to closed convex sets of probability measures yields lower and upper sensitivity indices. Finally, the use of relative entropy as an uncertainty-based sensitivity index is introduced and extended to the imprecise setting, drawing upon recent work on entropy measures for imprecise information.
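The variance-based index S_i = Var(E[Y|X_i])/Var(Y) can be estimated for a single precise probability model by a pick-freeze Monte Carlo scheme; the imprecise extension discussed above would repeat this over the set of measures to obtain lower and upper indices. A sketch for a toy model with known indices (the model, inputs, and sample size are assumptions for illustration):

```python
import random

def model(x1, x2):
    """Toy additive model with known first-order Sobol indices
    S1 = 0.2 and S2 = 0.8 for independent uniform(0, 1) inputs."""
    return x1 + 2.0 * x2

def first_order_sobol(n=100_000, seed=1):
    """Pick-freeze estimate of the first-order indices:
    S_i = Cov(Y, Y_i') / Var(Y), where Y_i' shares only input i with Y."""
    rng = random.Random(seed)
    a = [(rng.random(), rng.random()) for _ in range(n)]
    b = [(rng.random(), rng.random()) for _ in range(n)]
    ya = [model(x1, x2) for x1, x2 in a]
    mean_a = sum(ya) / n
    var_a = sum((y - mean_a) ** 2 for y in ya) / n
    indices = []
    for i in range(2):
        # keep input i from sample A, redraw the other input from sample B
        ymix = [model(a[k][0] if i == 0 else b[k][0],
                      a[k][1] if i == 1 else b[k][1]) for k in range(n)]
        mean_mix = sum(ymix) / n
        cov = sum(ya[k] * ymix[k] for k in range(n)) / n - mean_a * mean_mix
        indices.append(cov / var_a)
    return indices

s1, s2 = first_order_sobol()
```

Because the toy model is additive, the two first-order indices should sum to approximately one.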

4.
This paper develops a novel computational framework to compute the Sobol indices that quantify the relative contributions of various uncertainty sources towards the system response prediction uncertainty. In the presence of both aleatory and epistemic uncertainty, two challenges are addressed in this paper for the model-based computation of the Sobol indices: due to data uncertainty, input distributions are not precisely known; and due to model uncertainty, the model output is uncertain even for a fixed realization of the input. An auxiliary variable method based on the probability integral transform is introduced to distinguish and represent each uncertainty source explicitly, whether aleatory or epistemic. The auxiliary variables facilitate building a deterministic relationship between the uncertainty sources and the output, which is needed in the Sobol indices computation. The proposed framework is developed for two types of model inputs: random variable input and time series input. A Bayesian autoregressive moving average (ARMA) approach is chosen to model the time series input due to its capability to represent both natural variability and epistemic uncertainty due to limited data. A novel controlled-seed computational technique based on pseudo-random number generation is proposed to efficiently represent the natural variability in the time series input. This controlled-seed method significantly accelerates the Sobol indices computation under time series input, and makes it computationally affordable.

5.
Over the last two decades, uncertainty quantification (UQ) in engineering systems has been performed within the popular framework of probability theory. However, many scientific and engineering communities realize that there are limitations in using only one framework for quantifying the uncertainty experienced in engineering applications. Recently evidence theory, also called Dempster–Shafer theory, was proposed to handle limited and imprecise data situations as an alternative to classical probability theory. Adapting this theory to large-scale engineering structures is a challenge due to the implicit nature of simulations and excessive computational costs. In this work, an approximation approach is developed to improve the practical utility of evidence theory in UQ analysis. The techniques are demonstrated on composite material structures and an airframe wing aeroelastic design problem.
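The basic evidence-theory quantities behind this abstract, belief and plausibility, can be computed directly from a body of evidence (a mass assignment over focal sets). The example below is a minimal sketch; the frame of discernment and masses are invented for illustration:

```python
def belief(mass, event):
    """Bel(A): total mass of focal elements entirely contained in A."""
    return sum(m for focal, m in mass.items() if set(focal) <= set(event))

def plausibility(mass, event):
    """Pl(A): total mass of focal elements that intersect A."""
    return sum(m for focal, m in mass.items() if set(focal) & set(event))

# Hypothetical evidence on a parameter taking values in {low, mid, high}
mass = {
    ("low",): 0.3,
    ("low", "mid"): 0.4,           # imprecise evidence: "low or mid"
    ("low", "mid", "high"): 0.3,   # vacuous mass on the whole frame
}
event = ("low", "mid")
bel = belief(mass, event)        # 0.3 + 0.4 = 0.7
pl = plausibility(mass, event)   # every focal set intersects the event
```

The interval [Bel, Pl] brackets the probability of the event under every distribution consistent with the evidence, which is exactly the kind of bound the paper's approximation approach seeks to compute cheaply for large structures.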

6.
Random vibration analysis aims to estimate the response statistics of dynamical systems subject to stochastic excitations. Stochastic differential equations (SDEs) that govern the response of general nonlinear systems are often complicated, and their analytical solutions are scarce. Thus, a range of approximate methods and simulation techniques have been developed. This paper develops a hybrid approach that approximates the governing SDE of nonlinear systems using a small number of response simulations and information available a priori. The main idea is to identify a set of surrogate linear systems such that their response probability distributions collectively estimate the response probability distribution of the original nonlinear system. To identify the surrogate linear systems, the proposed method integrates the simulated responses of the original nonlinear system with information available a priori about the number and parameters of the surrogate linear systems. Because of the limited data, there is epistemic uncertainty in the number and parameters of the surrogate linear systems. This paper proposes a Bayesian nonparametric approach, a Dirichlet process mixture model, to capture these uncertainties. The Dirichlet process models the uncertainty over an infinite-dimensional parameter space, representing an infinite number of potential surrogate linear systems. Specifically, the proposed method allows the number of surrogate linear systems to grow indefinitely as the nonlinear system's observed dynamics unveil new patterns. The quantified uncertainty in the estimates of the unknown model parameters propagates into the response probability distribution. The paper then shows that, under some mild conditions, the estimated probability distribution approaches, as closely as desired, the original nonlinear system's response probability distribution. As a measure of model accuracy, the paper provides the convergence rate of the response probability distribution. Because the posterior distribution of the unknown model parameters is often not analytically tractable, a Gibbs sampling algorithm is presented to draw samples from the posterior distribution. Variational Bayesian inference is also introduced to derive an approximate closed-form expression for the posterior distribution. The paper illustrates the proposed method through the random vibration analysis of a nonlinear elastic system and a nonlinear hysteretic system.
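The Gibbs sampling step mentioned above alternates draws from full conditional distributions. As a stand-in for the paper's posterior over mixture parameters, the sketch below runs a textbook Gibbs sampler on a target whose conditionals are available in closed form, a standard bivariate normal with correlation rho; everything here is illustrative:

```python
import random

def gibbs_bivariate_normal(rho, n_samples=20_000, burn_in=1_000, seed=7):
    """Gibbs sampler for a standard bivariate normal with correlation rho,
    alternating exact full-conditional draws
    X | Y=y ~ N(rho*y, 1 - rho^2) and Y | X=x ~ N(rho*x, 1 - rho^2)."""
    rng = random.Random(seed)
    sd = (1.0 - rho ** 2) ** 0.5
    x = y = 0.0
    xs, ys = [], []
    for t in range(n_samples + burn_in):
        x = rng.gauss(rho * y, sd)
        y = rng.gauss(rho * x, sd)
        if t >= burn_in:  # discard burn-in draws
            xs.append(x)
            ys.append(y)
    return xs, ys

xs, ys = gibbs_bivariate_normal(rho=0.8)

# sample correlation of the chain should recover rho
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / len(xs)
vx = sum((a - mx) ** 2 for a in xs) / len(xs)
vy = sum((b - my) ** 2 for b in ys) / len(ys)
corr = cov / (vx * vy) ** 0.5
```

In the paper the same alternating-conditional machinery runs over cluster assignments and linear-system parameters of the Dirichlet process mixture rather than over two Gaussian coordinates.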

7.
Stochastic simulation is a tool commonly used by practitioners for evaluating the performance of inventory policies. A typical inventory simulation starts with the determination of the best-fit input models (e.g. the probability distribution function of the demand random variable) and then obtains a performance measure estimate under these input models. However, this sequential approach ignores the uncertainty around the input models, leading to inaccurate performance measures, especially when there is limited historical input data. In this paper, we take an alternative approach and propose a simulation replication algorithm that jointly estimates the input models and the performance measure, leading to a credible interval for the performance measure under input-model uncertainty. Our approach builds on a nonparametric Bayesian input model and frees the inventory manager from making any restrictive assumptions on the functional form of the input models. Focusing on a single-product inventory simulation, we show that the proposed method improves the estimation of the service levels when compared to the traditional practice of using the best-fit or the empirical distribution as the unknown demand distribution.

8.
Three applications of sampling-based sensitivity analysis in conjunction with evidence theory representations for epistemic uncertainty in model inputs are described: (i) an initial exploratory analysis to assess model behavior and provide insights for additional analysis; (ii) a stepwise analysis showing the incremental effects of uncertain variables on complementary cumulative belief functions and complementary cumulative plausibility functions; and (iii) a summary analysis showing a spectrum of variance-based sensitivity analysis results that derive from probability spaces that are consistent with the evidence space under consideration.

9.
The Bayesian framework for statistical inference offers the possibility of taking expert opinions into account, and is therefore attractive in practical problems concerning the reliability of technical systems. Probability is the only language in which uncertainty can be consistently expressed, and this requires the use of prior distributions for reporting expert opinions. In this paper an extension of the standard Bayesian approach based on the theory of imprecise probabilities and intervals of measures is developed. It is shown that this is necessary to take the nature of experts' knowledge into account. The application of this approach in reliability theory is outlined. The concept of imprecise probabilities allows us to accept a range of possible probabilities from an expert for events of interest and thus makes the elicitation of prior information simpler and clearer. The method also provides a consistent way for combining the opinions of several experts.
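One standard construction in this spirit (a common textbook device, not necessarily the paper's exact model) is the imprecise beta model: instead of a single prior, the expert's opinion is represented by a whole set of Beta priors, and updating every member of the set yields bounds on the posterior mean of a failure probability:

```python
def posterior_mean_bounds(successes, trials, s=2.0):
    """Imprecise beta model: the prior set is {Beta(s*t, s*(1-t)) : 0 < t < 1},
    where s is the prior 'strength'. Updating with binomial data gives
    posterior means (successes + s*t) / (trials + s); the bounds follow
    from letting t approach 0 and 1."""
    lower = successes / (trials + s)
    upper = (successes + s) / (trials + s)
    return lower, upper

# Hypothetical reliability test: 9 successes in 10 trials
lo, hi = posterior_mean_bounds(successes=9, trials=10)
```

The width of the interval [lo, hi] shrinks as data accumulates, which mirrors how the imprecision elicited from experts is gradually dominated by observations.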

10.
The paper presents an application of the generalised likelihood uncertainty estimation methodology to the problem of estimating the uncertainty of predictions produced by environmental models. The methodology is placed in a wider context of different approaches to inverse modelling and, in particular, a comparison is made with Bayesian estimation techniques based on explicit structural assumptions about model error. Using a simple example of a rainfall-flow model, different evaluation measures and their influence on the prediction uncertainty and credibility intervals are demonstrated.
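A bare-bones sketch of the GLUE procedure follows; the toy model, informal likelihood measure, and behavioural threshold are all invented for illustration. The steps are: sample parameter sets, weight each by an informal likelihood, discard non-behavioural runs below the threshold, and read prediction bounds off the likelihood-weighted quantiles:

```python
import random

def glue_prediction_bounds(n_runs=5_000, threshold=0.3, seed=3):
    """GLUE-style sketch: Monte Carlo parameter sampling, an informal
    likelihood score per run, a behavioural cutoff, and likelihood-weighted
    5%/95% prediction bounds."""
    rng = random.Random(3)
    true_param = 2.0
    runs = []
    for _ in range(n_runs):
        theta = rng.uniform(0.0, 4.0)            # sampled parameter set
        prediction = theta * 1.5                 # toy 'model' output
        error = abs(theta - true_param)
        likelihood = max(0.0, 1.0 - error / 2.0)  # informal likelihood measure
        if likelihood > threshold:               # keep behavioural runs only
            runs.append((prediction, likelihood))
    runs.sort()                                  # sort by prediction
    total = sum(w for _, w in runs)
    bounds, acc = [], 0.0
    for pred, w in runs:                         # weighted quantile scan
        acc += w / total
        if not bounds and acc >= 0.05:
            bounds.append(pred)
        if len(bounds) == 1 and acc >= 0.95:
            bounds.append(pred)
            break
    return tuple(bounds)

lo, hi = glue_prediction_bounds()
```

The resulting interval is a credibility band conditional on the chosen likelihood measure and threshold, which is precisely the subjectivity the paper contrasts with formal Bayesian error models.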

11.
Traditional approaches toward modeling the availability of a system often do not formally take into account uncertainty over the parameter values of the model. Such models are then frequently criticized because the observed reliability of a system does not match that predicted by the model. This paper extends a recently published segregated failures model so that, rather than providing a single figure for the availability of a system, uncertainty over model parameter values is incorporated and a predictive probability distribution is given. This predictive distribution is generated in a practical way by displaying the uncertainties and dependencies of the parameters of the model through a Bayesian network (BN). Permitting uncertainty in the reliability model then allows the user to determine whether the predicted reliability was incorrect due to inherent variability in the system under study, or due to the use of an inappropriate model. Furthermore, it is demonstrated how the predictive distribution can be used when reliability predictions are employed within a formal decision-theoretic framework. Use of the model is illustrated with the example of a high-availability computer system with multiple recovery procedures. A BN is produced to display the relations between parameters of the model in this case and to generate a predictive probability distribution of the system's availability. This predictive distribution is then used to make two decisions under uncertainty concerning the offered warranty policies on the system: a qualitative decision and an optimization over a continuous decision space. Copyright © 2009 John Wiley & Sons, Ltd.

12.
13.
Composites Part B, 2007, 38(5-6): 651-673
Current design approaches for seismic retrofit use deterministic variables to describe the geometry, material properties, and applied loads on the bridge column. Using a mechanistic model that considers nonlinear material behavior, these deterministic input variables can be directly mapped to the design parameters. However, the results often give a false sense of reliability because they neglect uncertainties related to the input variables of the analysis (data uncertainty), unpredictable fluctuations of loads and natural variability of material properties, and/or the uncertainty in the analytical model itself (model uncertainty). While methods of reliability analysis can provide a means for designing so as not to exceed specific levels of “acceptable” risk, they do not consider the uncertainty in the assumed distribution functions for each of the input variables and are built on the basic assumption that the models used perfectly describe reality. This still leaves significant unknowns and often yields design models that are not truly validated across their response space. This paper describes the application of a fuzzy probabilistic approach to capture the inherent uncertainty in such applications. The application of the approach is demonstrated through an example, and the results are compared to those obtained from conventional deterministic analytical models. It is noted that the confidence in the achieved safety of the retrofit system based on the fuzzy probabilistic approach is much higher than that achieved using the deterministic approach. This is due to the consideration of uncertainty in the material parameters as well as in the assumed crack angle during the design process.

14.
In a risk analysis, a scenario can be defined as the propagating feature of a specific initiating event that can lead to a wide range of undesirable consequences. Taking various scenarios into consideration makes the risk analysis more complex than it would be without them. Many risk analyses have been performed to estimate a risk profile under both uncertain future states of hazard sources and undesirable scenarios. Unfortunately, for specific systems such as a radioactive waste disposal facility, the behaviour of future scenarios can hardly be predicted without a special reasoning process, so their risk cannot be estimated with a traditional risk analysis methodology alone. Moreover, we believe that the sources of uncertainty at future states can be reduced by setting up dependency relationships interrelating the geological, hydrological, and ecological aspects of the site with all the scenarios. The current methodology of uncertainty analysis for waste disposal facilities should therefore be revisited under this belief.

In order to consider the effects resulting from an evolution of the environmental conditions of waste disposal facilities, this paper proposes a quantitative assessment framework integrating the inference process of a Bayesian network into traditional probabilistic risk analysis. We developed and verified an approximate probabilistic inference program for the specific Bayesian network using a bounded-variance likelihood weighting algorithm. Ultimately, specific models, including a model for the uncertainty propagation of relevant parameters, were developed, with a comparison of variable-specific effects due to the occurrence of diverse altered evolution scenarios (AESs). After providing supporting information to obtain a variety of quantitative expectations about the dependency relationships between domain variables and AESs, we could connect the results of probabilistic inference from the Bayesian network with the consequence evaluation model. We obtained a number of practical results that improve the current knowledge base for prioritizing future risk-dominant variables at an actual site.
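Likelihood weighting, the sampling scheme underlying the paper's inference program (shown here in its plain form, not the bounded-variance variant), can be demonstrated on a toy two-node network; the structure and all probabilities below are illustrative, not from the paper:

```python
import random

def likelihood_weighting(n_samples=50_000, seed=5):
    """Likelihood-weighting inference on a toy network Scenario -> Release.
    Estimates P(Scenario = altered | Release observed), by sampling the
    non-evidence node from its prior and weighting each sample by the
    likelihood of the evidence."""
    rng = random.Random(seed)
    p_altered = 0.2                        # prior P(Scenario = altered)
    p_release = {True: 0.6, False: 0.05}   # P(Release | Scenario)
    num = den = 0.0
    for _ in range(n_samples):
        altered = rng.random() < p_altered  # sample the non-evidence node
        weight = p_release[altered]         # weight by evidence likelihood
        den += weight
        if altered:
            num += weight
    return num / den

posterior = likelihood_weighting()
```

The exact answer for these numbers is 0.12 / 0.16 = 0.75; the bounded-variance refinement used in the paper controls the spread of the weights so that such estimates converge reliably in larger networks.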

15.
We formulate and evaluate a Bayesian approach to probabilistic input modeling for simulation experiments that accounts for the parameter and stochastic uncertainties inherent in most simulations and that yields valid predictive inferences about outputs of interest. We use prior information to construct prior distributions on the parameters of the input processes driving the simulation. Using Bayes' rule, we combine this prior information with the likelihood function of sample data observed on the input processes to compute the posterior parameter distributions. In our Bayesian simulation replication algorithm, we estimate parameter uncertainty by independently sampling new values of the input-model parameters from their posterior distributions on selected simulation runs; and we estimate stochastic uncertainty by performing multiple (conditionally) independent runs with each set of parameter values. We formulate performance measures relevant to both Bayesian and frequentist input-modeling techniques, and we summarize an experimental performance evaluation demonstrating the advantages of the Bayesian approach.
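The two-level sampling scheme described above (posterior draws for parameter uncertainty, replications per draw for stochastic uncertainty) can be sketched for a single exponential input process with a conjugate Gamma posterior on its rate. The data, hyperparameters, and output measure below are all made up for illustration:

```python
import random

def bayesian_replication(demand_data, n_posterior_draws=200, n_reps=50, seed=11):
    """Bayesian replication sketch: for each posterior draw of the input-model
    rate (parameter uncertainty), run several independent replications
    (stochastic uncertainty) and summarize the mean simulated demand."""
    rng = random.Random(seed)
    # Gamma(a0, b0) prior on the exponential rate; conjugate update
    a0, b0 = 1.0, 1.0
    a_post = a0 + len(demand_data)
    b_post = b0 + sum(demand_data)
    means = []
    for _ in range(n_posterior_draws):
        rate = rng.gammavariate(a_post, 1.0 / b_post)  # posterior draw (scale = 1/b)
        reps = [rng.expovariate(rate) for _ in range(n_reps)]  # replications
        means.append(sum(reps) / n_reps)
    means.sort()
    # equal-tailed 90% credible interval for the mean output
    return means[int(0.05 * len(means))], means[int(0.95 * len(means)) - 1]

data = [0.9, 1.1, 1.3, 0.8, 1.0, 1.2, 0.7, 1.0]  # hypothetical demand sample
lo, hi = bayesian_replication(data)
```

The spread of the resulting interval reflects both sources of uncertainty at once, which is the property the paper's performance evaluation is designed to test.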

16.
In the fields of statistics and computer science a wide variety of methodologies exist for solving the traditional classification problem. This study compares the error rates of various methods under a variety of conditions when the explanatory variables are continuous. The methods under consideration are neural networks, classical discriminant analysis, and two different approaches to decision trees. Training and testing sets are used to estimate the error rates of these methods for different sample sizes, numbers of explanatory variables, and numbers of classes in the dependent variable. These error rates are used to draw generalized conclusions about the relative efficiencies of the techniques.
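A miniature version of such an error-rate comparison is sketched below, using a nearest-class-mean rule as a minimal stand-in for classical discriminant analysis on synthetic two-class Gaussian data; the data-generating settings and sample sizes are assumptions for illustration:

```python
import random

def make_data(n, rng):
    """Two Gaussian classes in 2-D, separated along the first axis."""
    data = []
    for _ in range(n):
        label = rng.random() < 0.5
        mu = 2.0 if label else -2.0
        data.append(((rng.gauss(mu, 1.0), rng.gauss(0.0, 1.0)), int(label)))
    return data

def nearest_mean_error(train, test):
    """Test error rate of a nearest-class-mean classifier, equivalent to
    linear discriminant analysis with equal spherical covariances."""
    means = {}
    for label in (0, 1):
        pts = [x for x, y in train if y == label]
        means[label] = tuple(sum(c) / len(pts) for c in zip(*pts))
    errors = 0
    for x, y in test:
        dists = {lab: sum((a - b) ** 2 for a, b in zip(x, m))
                 for lab, m in means.items()}
        if min(dists, key=dists.get) != y:
            errors += 1
    return errors / len(test)

rng = random.Random(42)
train, test = make_data(200, rng), make_data(1000, rng)
err = nearest_mean_error(train, test)
```

For class means at +/-2 with unit variance, the Bayes error is about 2.3%, so a well-behaved classifier's test error should land near that; the study's comparisons repeat this kind of estimate across methods, sample sizes, and dimensions.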

17.
18.
The challenge problems for the Epistemic Uncertainty Workshop at Sandia National Laboratories provide common ground for comparing different mathematical theories of uncertainty, referred to as General Information Theories (GITs). These problems also present the opportunity to discuss the use of expert knowledge as an important constituent of uncertainty quantification. More specifically, how do the principles and methods of eliciting and analyzing expert knowledge apply to these problems and similar ones encountered in complex technical problem solving and decision making? We will address this question, demonstrating how the elicitation issues and the knowledge that experts provide can be used to assess the uncertainty in outputs that emerge from a black box model or computational code represented by the challenge problems. In our experience, the rich collection of GITs provides an opportunity to capture the experts' knowledge and associated uncertainties consistent with their thinking, problem solving, and problem representation. The elicitation process is rightly treated as part of an overall analytical approach, and the information elicited is not simply a source of data. In this paper, we detail how the elicitation process itself impacts the analyst's ability to represent, aggregate, and propagate uncertainty, as well as how to interpret uncertainties in outputs. While this approach does not advocate a specific GIT, answers under uncertainty do result from the elicitation.

19.
The paper introduces two fundamental approaches for reliability improvement and risk reduction by using nontrivial algebraic inequalities: (a) proving an inequality derived or conjectured from a real system or process, and (b) creating a meaningful interpretation of an existing nontrivial abstract inequality relevant to a real system or process. A formidable advantage of algebraic inequalities is their capacity to produce tight bounds on reliability-critical design parameters in the absence of any knowledge about the variation of the controlling variables. The effectiveness of the first approach is demonstrated by examples related to decision-making under deep uncertainty and to ranking systems built on components whose reliabilities are unknown. To demonstrate the second approach, a meaningful interpretation is created for an inequality that is a special case of the Cauchy-Schwarz inequality. By varying the interpretation of the variables, the same inequality holds for elastic elements, resistors, and capacitors arranged in series and parallel. The paper also shows that meaningful interpretation of superadditive and subadditive inequalities can be used with success for optimizing various systems and processes. Such interpretations have been used for maximizing the stored elastic strain energy at a specified total displacement and for optimizing the profit from an investment. Finally, a meaningful interpretation of an algebraic inequality has been used for reducing uncertainty and the risk of incorrect predictions about the magnitude ranking of sequential random events.
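One Cauchy-Schwarz special case of the kind the abstract mentions states that for positive numbers, (sum of r_i) times (sum of 1/r_i) >= n^2. Interpreted for resistors, the series resistance of any n resistors is at least n^2 times their parallel resistance, with equality only when all resistances are equal. A quick numerical check (the resistor values are arbitrary):

```python
def series_resistance(rs):
    """Equivalent resistance of resistors connected in series."""
    return sum(rs)

def parallel_resistance(rs):
    """Equivalent resistance of resistors connected in parallel."""
    return 1.0 / sum(1.0 / r for r in rs)

# Cauchy-Schwarz special case: (sum r_i)(sum 1/r_i) >= n^2, i.e.
# R_series >= n^2 * R_parallel for any positive resistances.
rs = [1.0, 2.0, 5.0, 10.0]
n = len(rs)
assert series_resistance(rs) >= n * n * parallel_resistance(rs)
```

The same inequality holds verbatim for spring stiffnesses or capacitances once the series/parallel roles are swapped appropriately, which is the "varying the interpretation" device the paper exploits.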

20.
In robust design, uncertainty is commonly modelled with precise probability distributions. In reality, the distribution types and distribution parameters may not always be available owing to limited data. This research develops a robust design methodology to accommodate the mixture of both precise and imprecise random variables. By incorporating the Taguchi quality loss function and the minimax regret criterion, the methodology mitigates the effects of not only uncertain parameters but also uncertainties in the models of the uncertain parameters. Hydrokinetic turbine systems are a relatively new alternative energy technology, and both precise and imprecise random variables exist in the design of such systems. The developed methodology is applied to the robust design optimization of a hydrokinetic turbine system. The results demonstrate the effectiveness of the proposed methodology.
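The minimax regret criterion used above picks the design whose worst-case shortfall, relative to the best achievable outcome in each scenario, is smallest. A minimal sketch on a made-up payoff table (the design names, scenarios, and numbers are hypothetical, not from the paper's turbine case study):

```python
def minimax_regret(payoff):
    """Return the design minimizing maximum regret, plus the regret table.
    payoff[design][scenario] is a payoff (e.g. energy output); regret is the
    shortfall versus the best design under that scenario."""
    designs = list(payoff)
    scenarios = list(next(iter(payoff.values())))
    best = {s: max(payoff[d][s] for d in designs) for s in scenarios}
    regret = {d: max(best[s] - payoff[d][s] for s in scenarios)
              for d in designs}
    return min(regret, key=regret.get), regret

# Hypothetical annual energy output (MWh) under three river-flow scenarios
payoff = {
    "small_rotor": {"low_flow": 40, "mid_flow": 55, "high_flow": 60},
    "large_rotor": {"low_flow": 20, "mid_flow": 60, "high_flow": 90},
}
choice, regrets = minimax_regret(payoff)
```

Here the large rotor loses 20 MWh in the worst case versus 30 MWh for the small rotor, so minimax regret selects it even though the small rotor is safer in absolute worst-case output.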
