Similar Literature
20 similar documents retrieved
1.
2.
Computational simulation methods have advanced to a point where simulation can contribute substantially in many areas of systems analysis. One research challenge that has accompanied this transition involves the characterization of uncertainty in both computer model inputs and the resulting system response. This article addresses a subset of the ‘challenge problems’ posed in [Challenge problems: uncertainty in system response given uncertain parameters, 2001], where uncertainty or information is specified over intervals of the input parameters and inferences based on the response are required. The emphasis of the article is to describe and illustrate a method for performing tasks associated with this type of modeling ‘economically’, requiring relatively few evaluations of the system to obtain a precise estimate of the response. This ‘response-modeling approach’ is used to approximate a probability distribution for the system response. The distribution is then used (1) to make inferences concerning probabilities associated with response intervals and (2) to guide the selection of further, informative system evaluations to perform.
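As an illustration of the response-modeling idea, the sketch below fits a cheap quadratic surrogate to a handful of runs of a stand-in model and then samples the surrogate to approximate a probability over a response interval. The model, the interval bounds, the response interval, and the uniform sampling of the input intervals are hypothetical illustrative choices, not the paper's challenge-problem setup.

```python
import numpy as np

rng = np.random.default_rng(7)

def expensive_model(a, b):
    # Stand-in for the system model; imagine each call is costly.
    return a * np.exp(0.5 * b) + b**2

# Inputs known only as intervals.
a_int, b_int = (0.5, 1.0), (0.0, 0.6)

# A handful of "economical" model evaluations on a small grid over the intervals.
A, B = np.meshgrid(np.linspace(*a_int, 3), np.linspace(*b_int, 3))
X = np.column_stack([A.ravel(), B.ravel()])
y = expensive_model(X[:, 0], X[:, 1])

# Quadratic response surface fitted by least squares.
def basis(x):
    a, b = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(a), a, b, a * b, a**2, b**2])

coef, *_ = np.linalg.lstsq(basis(X), y, rcond=None)

# Cheap sampling of the surrogate (uniform over the intervals is an illustrative
# choice only; interval information alone does not dictate a distribution).
samples = np.column_stack([rng.uniform(*a_int, 20000), rng.uniform(*b_int, 20000)])
y_hat = basis(samples) @ coef
lo, hi = 0.8, 1.2
print(f"approx. P({lo} <= response <= {hi}) = {np.mean((y_hat >= lo) & (y_hat <= hi)):.3f}")
```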

3.
The ‘Epistemic Uncertainty Workshop’ sponsored by Sandia National Laboratories was held in Albuquerque, New Mexico, on 6–7 August 2002. The workshop was organized around a set of Challenge Problems involving both epistemic and aleatory uncertainty that the workshop participants were invited to solve and discuss. This concluding article in a special issue of Reliability Engineering and System Safety based on the workshop discusses the intent of the Challenge Problems, summarizes some discussions from the workshop, and provides a technical comparison among the papers in this special issue. The Challenge Problems were computationally simple models that were intended as vehicles for the illustration and comparison of conceptual and numerical techniques for use in analyses that involve: (i) epistemic uncertainty, (ii) aggregation of multiple characterizations of epistemic uncertainty, (iii) combination of epistemic and aleatory uncertainty, and (iv) models with repeated parameters. There was considerable diversity of opinion at the workshop about both methods and fundamental issues, and yet substantial consensus about what the answers to the problems were, and even about how each of the four issues should be addressed. Among the technical approaches advanced were probability theory, Dempster–Shafer evidence theory, random sets, sets of probability measures, imprecise coherent probabilities, coherent lower previsions, probability boxes, possibility theory, fuzzy sets, joint distribution tableaux, polynomial chaos expansions, and info-gap models. Although some participants maintained that a purely probabilistic approach is fully capable of accounting for all forms of uncertainty, most agreed that the treatment of epistemic uncertainty introduces important considerations and that the issues underlying the Challenge Problems are legitimate and significant. Topics identified as meriting additional research include elicitation of uncertainty representations, aggregation of multiple uncertainty representations, dependence and independence, model uncertainty, solution of black-box problems, efficient sampling strategies for computation, and communication of analysis results.

4.
We study the two-parameter maximum likelihood estimation (MLE) problem for the Weibull distribution with consideration of interval data. Without interval data, the problem can be solved easily by regular MLE methods because the restricted MLE of the scale parameter β for a given shape parameter α has an analytical form; thus, α can be efficiently solved from its profile score function by traditional numerical methods. In the presence of interval data, however, the analytical form for the restricted MLE of β does not exist, and directly applying regular MLE methods can be less efficient and effective. To improve efficiency and effectiveness in handling interval data in the MLE problem, a new approach is developed in this paper. The new approach combines the Weibull-to-exponential transformation technique and the equivalent failure and lifetime technique. The concept of equivalence is developed to estimate exponential failure rates from uncertain data including interval data. Since the definition of equivalent failures and lifetimes follows EM algorithms, convergence of failure rate estimation by applying equivalent failures and lifetimes is mathematically proved. The new approach is demonstrated and validated through two published examples, and its performance in different conditions is studied by Monte Carlo simulations. The simulations indicate that the profile score function for α has only one maximum in most cases; this characteristic enables an efficient search for the optimal value of α.
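For the complete-data case mentioned in the abstract (no interval data), the profile-likelihood idea can be sketched as below: the scale β is concentrated out with its closed-form restricted MLE and the shape α is found by a 1-D numerical search. The lifetimes are made-up illustrative values, and the interval-data extension of the paper is not implemented.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def profile_loglik(alpha, t):
    """Weibull profile log-likelihood in the shape parameter alpha, with the
    scale beta concentrated out via its closed-form restricted MLE."""
    beta_hat = np.mean(t**alpha) ** (1.0 / alpha)
    n = len(t)
    return (n * np.log(alpha) - n * alpha * np.log(beta_hat)
            + (alpha - 1.0) * np.sum(np.log(t)) - np.sum((t / beta_hat)**alpha))

# Hypothetical complete (exactly observed) lifetimes.
t = np.array([72.0, 81.0, 95.0, 110.0, 130.0, 151.0, 183.0])

res = minimize_scalar(lambda a: -profile_loglik(a, t), bounds=(0.05, 20.0), method="bounded")
alpha_hat = res.x
beta_hat = np.mean(t**alpha_hat) ** (1.0 / alpha_hat)
print(f"alpha_hat = {alpha_hat:.3f}, beta_hat = {beta_hat:.2f}")
```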

5.
Uncertainty quantification (UQ) is the process of determining the effect of input uncertainties on response metrics of interest. These input uncertainties may be characterized as either aleatory uncertainties, which are irreducible variabilities inherent in nature, or epistemic uncertainties, which are reducible uncertainties resulting from a lack of knowledge. When aleatory and epistemic uncertainties are mixed, it is desirable to maintain a segregation between aleatory and epistemic sources such that it is easy to separate and identify their contributions to the total uncertainty. Current production analyses for mixed UQ employ nested sampling, where each sample taken from the epistemic distributions in the outer loop results in an inner-loop sampling over the aleatory probability distributions. This paper demonstrates new algorithmic capabilities for mixed UQ in which the analysis procedures are more closely tailored to the requirements of aleatory and epistemic propagation. Through the combination of stochastic expansions for computing statistics and interval optimization for computing bounds, interval-valued probability, second-order probability, and Dempster–Shafer evidence theory approaches to mixed UQ are shown to be more accurate and efficient than previously achievable.
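The nested (two-loop) sampling baseline that the paper improves upon can be sketched as below: the outer loop samples an epistemic parameter and the inner loop samples the aleatory input, yielding an interval of exceedance probabilities in the spirit of second-order probability. The response model, the epistemic interval, the aleatory distribution, and the threshold are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def response(x, theta):
    # Hypothetical response model: aleatory input x, epistemic parameter theta.
    return theta * x**2 + x

threshold, n_outer, n_inner = 5.0, 50, 2000

# Epistemic parameter known only to lie in an interval.
theta_samples = rng.uniform(0.8, 1.2, size=n_outer)        # outer (epistemic) loop
exceed_probs = []
for theta in theta_samples:
    x = rng.normal(1.0, 0.5, size=n_inner)                 # inner (aleatory) loop
    exceed_probs.append(np.mean(response(x, theta) > threshold))

print(f"P(response > {threshold}) ranges over "
      f"[{min(exceed_probs):.3f}, {max(exceed_probs):.3f}] as theta varies")
```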

6.
Epistemic uncertainty analysis is an essential feature of any model application subject to ‘state of knowledge’ uncertainties. Such analysis is usually carried out on the basis of a Monte Carlo simulation sampling the epistemic variables and performing the corresponding model runs. In situations, however, where aleatory uncertainties are also present in the model, an adequate treatment of both types of uncertainties would require a two-stage nested Monte Carlo simulation, i.e. sampling the epistemic variables (‘outer loop’) and nested sampling of the aleatory variables (‘inner loop’). It is clear that for complex and long-running codes the computational effort to perform all the resulting model runs may be prohibitive. Therefore, an approximate epistemic uncertainty analysis is suggested that is based solely on two simple Monte Carlo samples: (a) joint sampling of both epistemic and aleatory variables simultaneously, and (b) sampling of the aleatory variables alone with the epistemic variables held fixed at their reference values. The applications of this approach to dynamic reliability analyses presented in this paper look quite promising and suggest that performing such an approximate epistemic uncertainty analysis is preferable to the alternative of not performing any.
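One way to read the two samples is sketched below: comparing the output spread of the joint sample (a) with that of the aleatory-only sample (b) gives a crude indication of the epistemic contribution. The model and the variance-difference diagnostic are illustrative assumptions only, not the estimator proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def model(aleatory, epistemic):
    # Hypothetical model with one aleatory and one epistemic input.
    return epistemic * np.exp(0.3 * aleatory)

n, epistemic_ref = 5000, 1.0

# Sample (a): joint sampling of epistemic and aleatory variables.
e_joint = rng.uniform(0.8, 1.2, size=n)
a_joint = rng.normal(0.0, 1.0, size=n)
y_joint = model(a_joint, e_joint)

# Sample (b): aleatory variables alone, epistemic variables at reference values.
a_only = rng.normal(0.0, 1.0, size=n)
y_aleatory = model(a_only, epistemic_ref)

var_joint, var_aleatory = np.var(y_joint), np.var(y_aleatory)
print(f"variance of joint sample        : {var_joint:.4f}")
print(f"variance of aleatory-only sample: {var_aleatory:.4f}")
print(f"crude epistemic contribution    : {var_joint - var_aleatory:.4f}")
```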

7.
Over the last two decades, uncertainty quantification (UQ) in engineering systems has been performed within the popular framework of probability theory. However, many scientific and engineering communities realize that there are limitations in using only one framework for quantifying the uncertainty experienced in engineering applications. Recently, evidence theory, also called Dempster–Shafer theory, has been proposed to handle limited and imprecise data as an alternative to classical probability theory. Adapting this theory to large-scale engineering structures is a challenge due to the implicit nature of the simulations and excessive computational costs. In this work, an approximation approach is developed to improve the practical utility of evidence theory in UQ analysis. The techniques are demonstrated on composite material structures and an airframe wing aeroelastic design problem.

8.
In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, “Quantification of Margins and Uncertainties: Conceptual and Computational Basis,” describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.

9.
In robust design, it is common to estimate empirical models that relate an output response variable to controllable input variables and uncontrollable noise variables from experimental data. However, when determining the optimal input settings that minimise output variability, parameter uncertainties in the noise factors and response models are typically neglected. This article presents an interval robust design approach that takes parameter uncertainties into account through the confidence regions for these unknown parameters. To avoid obtaining an overly conservative design, the worst and best cases of mean squared error are both adopted to build the optimisation approach. The midpoint and radius of the interval are used to measure the location and dispersion performances, respectively. Meanwhile, a data-driven method is applied to obtain the relative weights of the location and dispersion performances in the optimisation approach. A simulation example and a case study using automobile manufacturing data from the dimensional tolerance design process are used to demonstrate the effectiveness of the proposed approach. The proposed approach, which accounts for both sources of parameter uncertainty, is shown to perform better than competing approaches.

10.
This paper compares Evidence Theory (ET) and Bayesian Theory (BT) for uncertainty modeling and decision under uncertainty, when the evidence about uncertainty is imprecise. The basic concepts of ET and BT are introduced and the ways these theories model uncertainties, propagate them through systems and assess the safety of these systems are presented. ET and BT approaches are demonstrated and compared on challenge problems involving an algebraic function whose input variables are uncertain. The evidence about the input variables consists of intervals provided by experts. It is recommended that a decision-maker compute both the Bayesian probabilities of the outcomes of alternative actions and their plausibility and belief measures when evidence about uncertainty is imprecise, because this helps assess the importance of imprecision and the value of additional information. Finally, the paper presents and demonstrates a method for testing approaches for decision under uncertainty in terms of their effectiveness in making decisions.
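A minimal sketch of the evidence-theory side follows: expert intervals with basic probability assignments are propagated through a simple monotone algebraic function, and belief and plausibility of an exceedance event are accumulated over the joint focal elements. The function, intervals, masses, threshold, and independence assumption are hypothetical and are not the challenge-problem data.

```python
from itertools import product

# Expert intervals (focal elements) and masses for two inputs of a simple
# algebraic function y = a * b (a, b > 0, so endpoint products bound the image).
bpa_a = [((0.4, 0.55), 0.4), ((0.6, 1.0), 0.6)]
bpa_b = [((1.0, 1.2), 0.5), ((1.2, 2.0), 0.5)]
threshold = 0.68                        # event of interest: y > threshold

belief = plausibility = 0.0
for ((a_lo, a_hi), m_a), ((b_lo, b_hi), m_b) in product(bpa_a, bpa_b):
    m = m_a * m_b                       # independence assumed for the joint BPA
    y_lo, y_hi = a_lo * b_lo, a_hi * b_hi
    if y_lo > threshold:                # focal element lies entirely in the event
        belief += m
    if y_hi > threshold:                # focal element intersects the event
        plausibility += m

print(f"Bel(y > {threshold}) = {belief:.2f},  Pl(y > {threshold}) = {plausibility:.2f}")
```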

11.
This paper presents an effective univariate Chebyshev polynomials method (UCM) for interval bound estimation of uncertain structures with unknown-but-bounded parameters. The interpolation points required by conventional collocation methods to generate the surrogate model are the tensor product of the one-dimensional (1D) interpolation points, so the computational cost becomes high for uncertain structures containing many interval parameters. To deal with this issue, a univariate decomposition is derived through a higher-order Taylor expansion. The structural system is decomposed into a sum of univariate subsystems, each of which involves only one uncertain parameter, with the other parameters fixed at their midpoint values. Chebyshev polynomials are then used to fit the subsystems, whose coefficients are determined using only a linear combination of the 1D interpolation points. Next, a surrogate model of the actual structural system, composed of explicit univariate Chebyshev functions, is established. Finally, the extrema of the univariate functions, obtained by the scanning method, are substituted into the surrogate model to determine the interval ranges of the uncertain structures. Numerical analysis is conducted to validate the accuracy and effectiveness of the proposed method.
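The sketch below illustrates a univariate decomposition with Chebyshev fits and a scanning step for a toy two-parameter function; the function, interval bounds, polynomial degree, node placement, and number of scan points are assumptions for illustration, not the structural systems of the paper.

```python
import numpy as np
from numpy.polynomial import Chebyshev

def f(x1, x2):
    # Hypothetical response with two unknown-but-bounded parameters.
    return np.sin(x1) * x2 + 0.1 * x1**2

bounds = np.array([[0.8, 1.2],      # interval of parameter 1
                   [1.5, 2.5]])     # interval of parameter 2
mid = bounds.mean(axis=1)
f_mid = f(*mid)

deg, n_scan = 6, 2001
lo_total = hi_total = f_mid
for i, (a, b) in enumerate(bounds):
    nodes = np.linspace(a, b, 2 * deg + 1)            # 1D interpolation points
    args = [np.full_like(nodes, m) for m in mid]
    args[i] = nodes                                    # vary one parameter only
    g = Chebyshev.fit(nodes, f(*args) - f_mid, deg)    # univariate subsystem
    scan = g(np.linspace(a, b, n_scan))                # scanning method for extrema
    lo_total += scan.min()
    hi_total += scan.max()

print(f"estimated response interval: [{lo_total:.4f}, {hi_total:.4f}]")
```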

12.
The risk assessment community has begun to make a clear distinction between aleatory and epistemic uncertainty in theory and in practice. Aleatory uncertainty is also referred to in the literature as variability, irreducible uncertainty, inherent uncertainty, and stochastic uncertainty. Epistemic uncertainty is also termed reducible uncertainty, subjective uncertainty, and state-of-knowledge uncertainty. Methods to efficiently represent, aggregate, and propagate different types of uncertainty through computational models are clearly of vital importance. The most widely known and developed methods are available within the mathematics of probability theory, whether frequentist or subjectivist. Newer mathematical approaches, which extend or otherwise depart from probability theory, are also available, and are sometimes referred to as generalized information theory (GIT). For example, possibility theory, fuzzy set theory, and evidence theory are three components of GIT. To try to develop a better understanding of the relative advantages and disadvantages of traditional and newer methods and encourage a dialog between the risk assessment, reliability engineering, and GIT communities, a workshop was held. To focus discussion and debate at the workshop, a set of prototype problems, generally referred to as challenge problems, was constructed. The challenge problems concentrate on the representation, aggregation, and propagation of epistemic uncertainty and mixtures of epistemic and aleatory uncertainty through two simple model systems. This paper describes the challenge problems and gives numerical values for the different input parameters so that results from different investigators can be directly compared.

13.
Availability models of complex processes and phenomena inevitably contain simplifying assumptions and idealizations. These simplifications and idealizations generate uncertainties, which can be classified as aleatory (arising from randomness) and/or epistemic (arising from lack of knowledge). Acknowledging and treating uncertainty is vital for the practical usability of reliability analysis results. Distinguishing the two kinds of uncertainty is useful for making reliability/risk-informed decisions with confidence and for effective management of uncertainty. In level-1 probabilistic safety assessment (PSA) of nuclear power plants (NPP), the current practice is to carry out epistemic uncertainty analysis on the basis of a simple Monte Carlo simulation that samples the epistemic variables in the model. However, the aleatory uncertainty is neglected, and point estimates of the aleatory variables, viz. time to failure and time to repair, are used. Treatment of both types of uncertainty requires a two-phase Monte Carlo simulation in which the outer loop samples the epistemic variables and the inner loop samples the aleatory variables. A methodology based on two-phase Monte Carlo simulation is presented for distinguishing both kinds of uncertainty in the context of availability/reliability evaluation in level-1 PSA studies of NPP.
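A minimal two-phase Monte Carlo sketch in this spirit is given below: the outer loop samples uncertain failure and repair rates (epistemic), and the inner loop simulates alternating exponential up/down histories (aleatory) to estimate availability. The rate distributions, mission time, and sample sizes are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_availability(lam, mu, mission_time):
    """One aleatory history: alternating exponential up (mean 1/lam) and down (mean 1/mu) times."""
    t = up_time = 0.0
    state_up = True
    while t < mission_time:
        dur = rng.exponential(1.0 / lam if state_up else 1.0 / mu)
        dur = min(dur, mission_time - t)
        if state_up:
            up_time += dur
        t += dur
        state_up = not state_up
    return up_time / mission_time

mission_time, n_outer, n_inner = 1000.0, 30, 200
mean_avail = []
for _ in range(n_outer):                           # outer loop: epistemic variables
    lam = rng.lognormal(np.log(1e-2), 0.3)         # uncertain failure rate [1/h]
    mu = rng.lognormal(np.log(1e-1), 0.3)          # uncertain repair rate [1/h]
    inner = [simulate_availability(lam, mu, mission_time) for _ in range(n_inner)]
    mean_avail.append(np.mean(inner))              # inner loop: aleatory histories

print(f"5th-95th percentile of mean availability: "
      f"[{np.percentile(mean_avail, 5):.3f}, {np.percentile(mean_avail, 95):.3f}]")
```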

14.
Error and uncertainty in modeling and simulation
This article develops a general framework for identifying error and uncertainty in computational simulations that deal with the numerical solution of a set of partial differential equations (PDEs). A comprehensive, new view of the general phases of modeling and simulation is proposed, consisting of the following phases: conceptual modeling of the physical system, mathematical modeling of the conceptual model, discretization and algorithm selection for the mathematical model, computer programming of the discrete model, numerical solution of the computer program model, and representation of the numerical solution. Our view incorporates the modeling and simulation phases that are recognized in the systems engineering and operations research communities, but it adds phases that are specific to the numerical solution of PDEs. In each of these phases, general sources of error and of uncertainty, both aleatory and epistemic, are identified. Our general framework is applicable to any numerical discretization procedure for solving ODEs or PDEs. To demonstrate this framework, we describe a system-level example: the flight of an unguided, rocket-boosted, aircraft-launched missile. This example is discussed in detail at each of the six phases of modeling and simulation. Two alternative models of the flight dynamics are considered, along with aleatory uncertainty in the initial mass of the missile and epistemic uncertainty in the thrust of the rocket motor. We also investigate the interaction of modeling uncertainties and numerical integration error in the solution of the ordinary differential equations for the flight dynamics.
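A highly simplified stand-in for the flight-dynamics example is sketched below: a 1-D boost-and-coast trajectory is integrated with aleatory initial mass (sampled) and epistemic thrust (interval endpoints). The dynamics, parameter values, and distributions are toy assumptions and not the paper's missile model.

```python
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(3)
g, burn_time, mdot = 9.81, 5.0, 1.2     # gravity [m/s^2], burn time [s], burn rate [kg/s]

def rhs(t, y, m0, thrust):
    # Simplified 1-D ascent: state y = [altitude, velocity]; drag neglected.
    m = m0 - mdot * min(t, burn_time)
    a = (thrust if t < burn_time else 0.0) / m - g
    return [y[1], a]

def peak_altitude(m0, thrust):
    sol = solve_ivp(rhs, (0.0, 60.0), [0.0, 0.0], args=(m0, thrust), max_step=0.1)
    return sol.y[0].max()

# Epistemic thrust: interval endpoints; aleatory initial mass: sampled distribution.
for thrust in (380.0, 420.0):
    masses = rng.normal(20.0, 0.5, size=100)
    peaks = [peak_altitude(m, thrust) for m in masses]
    print(f"thrust = {thrust:.0f} N: mean peak altitude {np.mean(peaks):6.1f} m, "
          f"5th-95th pct [{np.percentile(peaks, 5):.1f}, {np.percentile(peaks, 95):.1f}] m")
```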

15.
A very general and robust approach to solving optimization problems involving probabilistic uncertainty is through the use of Probabilistic Ordinal Optimization. At each step in the optimization problem, improvement is based only on a relative ranking of the probabilistic merits of local design alternatives, rather than on precise quantification of the alternatives. Thus, we simply ask the question: “Is that alternative better or worse than this one?” to some level of statistical confidence we require, not: “How much better or worse is that alternative than this one?” In this paper we illustrate an elementary application of probabilistic ordinal concepts in a 2-D optimization problem. Two uncertain variables contribute to uncertainty in the response function. We use a simple Coordinate Pattern Search non-gradient-based optimizer to step toward the statistical optimum in the design space. We also discuss more sophisticated implementations, and some of the advantages and disadvantages versus other approaches to optimization under uncertainty.
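A sketch of the ordinal idea combined with a coordinate pattern search appears below: a candidate design replaces the incumbent only when a rank-based test says it is better at the required confidence. The noisy objective, the Mann-Whitney comparison, sample sizes, and step schedule are illustrative assumptions, not the implementation used in the paper.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(4)

def noisy_objective(x, n):
    # Hypothetical 2-D objective; two uncertain variables perturb the response.
    u1 = rng.normal(0.0, 0.3, size=n)
    u2 = rng.uniform(-0.5, 0.5, size=n)
    return (x[0] - 1.0 + u1) ** 2 + (x[1] + 2.0 + u2) ** 2

def ordinally_better(candidate, incumbent, n=60, alpha=0.05):
    """Ordinal comparison: is the candidate better than the incumbent at the required confidence?"""
    _, p = mannwhitneyu(noisy_objective(candidate, n), noisy_objective(incumbent, n),
                        alternative="less")
    return p < alpha

x, step = np.array([3.0, 3.0]), 1.0
for _ in range(400):                      # iteration cap for this sketch
    if step <= 1e-2:
        break
    for d in (np.array([step, 0.0]), np.array([-step, 0.0]),
              np.array([0.0, step]), np.array([0.0, -step])):
        if ordinally_better(x + d, x):
            x = x + d
            break
    else:
        step *= 0.5                       # shrink the pattern when no ordinal improvement is found
print("design found near:", np.round(x, 2))
```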

16.
17.
Many safety assessments depend upon models that rely on probabilistic characterizations about which there is incomplete knowledge. For example, a system model may depend upon the time to failure of a piece of equipment for which no failures have actually been observed. The analysts in this case are faced with the task of developing a failure model for the equipment in question, having very limited knowledge about either the correct form of the failure distribution or the statistical parameters that characterize the distribution. They may assume that the process conforms to a Weibull or log-normal distribution or that it can be characterized by a particular mean or variance, but those assumptions impart more knowledge to the analysis than is actually available. To address this challenge, we propose a method in which random variables comprising equivalence classes constrained by the available information are approximated using polynomial chaos expansions (PCEs). The PCE approximations are based on rigorous mathematical concepts developed from functional analysis and measure theory. The method has been codified in a computational tool, AVOCET, and has been applied successfully to example problems. Results indicate that it should be applicable to a broad range of engineering problems that are characterized by both irreducible and reducible uncertainty.

18.
This paper proposes an efficient metamodeling approach for uncertainty quantification of complex systems based on the Gaussian process model (GPM). The proposed GPM-based method efficiently and accurately calculates the mean and variance of model outputs with uncertain parameters specified by arbitrary probability distributions. Because a GPM is used, closed-form expressions for the mean and variance can be derived by decomposing high-dimensional integrals into one-dimensional integrals. The paper details how to efficiently compute these one-dimensional integrals. When the parameters are either uniformly or normally distributed, the one-dimensional integrals can be evaluated analytically; when the parameters follow neither normal nor uniform distributions, Gaussian quadrature is adopted for fast computation of the one-dimensional integrals. As a result, the developed GPM method can calculate the mean and variance of model outputs efficiently, independent of the parameter distributions. The proposed GPM method is applied to a collection of examples, and its accuracy and efficiency are compared with Monte Carlo simulation, which is used as the benchmark solution. Results show that the proposed GPM method is feasible and reliable for efficient uncertainty quantification of complex systems in terms of computational accuracy and efficiency.
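The sketch below illustrates the key computational step on a toy 1-D problem: a small GP surrogate is fitted to a few model runs, and the mean of its prediction under a normally distributed input is reduced to a one-dimensional integral evaluated by Gauss-Hermite quadrature, with a Monte Carlo cross-check. The kernel, training design, and input distribution are illustrative assumptions, and only the output mean (not the variance) is computed.

```python
import numpy as np

def kernel(a, b, ell=0.4, sf=1.0):
    # Squared-exponential covariance between 1-D input sets a and b.
    return sf**2 * np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def model(x):
    return np.sin(3.0 * x) + 0.5 * x                   # stand-in for an expensive model

# Fit a zero-mean GP (with a small nugget) to a handful of model runs.
x_train = np.linspace(-1.0, 2.0, 8)
y_train = model(x_train)
K = kernel(x_train, x_train) + 1e-8 * np.eye(len(x_train))
alpha = np.linalg.solve(K, y_train)

def gp_mean(x):
    return kernel(np.atleast_1d(x), x_train) @ alpha

# Input X ~ N(mu, sigma^2): the output mean is a 1-D integral, computed here by
# Gauss-Hermite quadrature: E[f(X)] = sum_i w_i f(mu + sqrt(2) sigma z_i) / sqrt(pi).
mu, sigma = 0.5, 0.3
z, w = np.polynomial.hermite.hermgauss(20)
mean_quad = (w @ gp_mean(mu + np.sqrt(2.0) * sigma * z)) / np.sqrt(np.pi)

# Crude Monte Carlo check of the same quantity.
mean_mc = gp_mean(np.random.default_rng(5).normal(mu, sigma, 20000)).mean()
print(f"quadrature mean = {mean_quad:.4f},  Monte Carlo mean = {mean_mc:.4f}")
```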

19.
Principal Component Analysis (PCA) is a well-known technique whose aim is to synthesize large amounts of numerical data by means of a small number of unobserved variables, called components. In this paper, an extension of PCA to interval-valued data is proposed. The method, called Midpoint Radius Principal Component Analysis (MR-PCA), recovers the underlying structure of interval-valued data by using both the midpoint (or center) and the radius (a measure of the interval width) information. To analyze how MR-PCA works, the results of a simulation study and two applications to chemical data are presented.
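A simplified illustration of the midpoint/radius ingredients is sketched below: PCA (via SVD) is applied to the centred midpoint matrix, and interval scores are obtained by propagating the radii through the absolute loadings. This is only a rough stand-in under those assumptions, not the MR-PCA algorithm itself, and the interval data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic interval-valued data: n observations x p variables as lower/upper bounds.
n, p = 30, 4
centers = rng.normal(size=(n, p)) @ rng.normal(size=(p, p))
half_widths = rng.uniform(0.1, 0.6, size=(n, p))
lower, upper = centers - half_widths, centers + half_widths

# Midpoint and radius matrices -- the two ingredients MR-PCA works with.
M = (lower + upper) / 2.0
R = (upper - lower) / 2.0

# PCA of the column-centred midpoints via SVD.
Mc = M - M.mean(axis=0)
U, s, Vt = np.linalg.svd(Mc, full_matrices=False)
k = 2
loadings = Vt[:k].T                     # p x k component loadings
mid_scores = Mc @ loadings              # midpoint scores on the first k components

# Interval scores: radii propagated through the absolute loadings.
score_radius = R @ np.abs(loadings)
explained = s[:k] ** 2 / np.sum(s**2)
print("variance explained by first 2 components:", np.round(explained, 3))
print("score interval of observation 0:",
      np.round(np.c_[mid_scores[0] - score_radius[0], mid_scores[0] + score_radius[0]], 3))
```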

20.
Y. R. Fan & Y. P. Li, 《工程优选》 (Engineering Optimization), 2013, 45(11): 1321-1336
In this study, a robust interval linear programming (RILP) method is developed for dealing with uncertainties expressed as intervals with deterministic boundaries. An enhanced two-step method (ETSM) is also advanced to solve the RILP model. The developed RILP improves upon the conventional interval linear programming (ILP) method because it can generate solution intervals within a larger feasible zone. The decision space based on the ETSM contains all feasible solutions, so that no useful information is neglected. Moreover, the RILP guarantees the stability of the optimization model because the best-case constraints are never violated. The results also suggest that the RILP is effective for practical environmental and engineering problems that involve uncertainties.
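The general idea behind two-step interval LP, bounding the objective by solving a most-favourable and a least-favourable deterministic sub-problem, can be sketched as below. The toy problem, the sign assumptions that make the bound selection valid (nonnegative A and x, intervals only in c and b), and the use of scipy.optimize.linprog are illustrative; this is not the ETSM or RILP formulation of the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Toy interval LP: maximize c x  s.t.  A x <= b, x >= 0, with interval c and b.
# A is deterministic and nonnegative and x >= 0, so larger c and larger b give
# the most favourable case and smaller c and b the least favourable one.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
c_lo, c_hi = np.array([3.0, 2.0]), np.array([4.0, 3.0])
b_lo, b_hi = np.array([10.0, 12.0]), np.array([12.0, 15.0])

def solve(c, b):
    # linprog minimizes, so negate c to maximize.
    res = linprog(-c, A_ub=A, b_ub=b, bounds=[(0, None)] * len(c), method="highs")
    return -res.fun, res.x

f_best, x_best = solve(c_hi, b_hi)      # step 1: most favourable sub-problem
f_worst, x_worst = solve(c_lo, b_lo)    # step 2: least favourable sub-problem

print(f"objective value interval: [{f_worst:.2f}, {f_best:.2f}]")
print("worst-case solution:", np.round(x_worst, 3), " best-case solution:", np.round(x_best, 3))
```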
