Similar Documents
20 similar documents found.
1.
Error and uncertainty in modeling and simulation
This article develops a general framework for identifying error and uncertainty in computational simulations that deal with the numerical solution of a set of partial differential equations (PDEs). A comprehensive, new view of the general phases of modeling and simulation is proposed, consisting of the following phases: conceptual modeling of the physical system, mathematical modeling of the conceptual model, discretization and algorithm selection for the mathematical model, computer programming of the discrete model, numerical solution of the computer program model, and representation of the numerical solution. Our view incorporates the modeling and simulation phases that are recognized in the systems engineering and operations research communities, but it adds phases that are specific to the numerical solution of PDEs. In each of these phases, general sources of uncertainty, both aleatory and epistemic, and error are identified. Our general framework is applicable to any numerical discretization procedure for solving ODEs or PDEs. To demonstrate this framework, we describe a system-level example: the flight of an unguided, rocket-boosted, aircraft-launched missile. This example is discussed in detail at each of the six phases of modeling and simulation. Two alternative models of the flight dynamics are considered, along with aleatory uncertainty of the initial mass of the missile and epistemic uncertainty in the thrust of the rocket motor. We also investigate the interaction of modeling uncertainties and numerical integration error in the solution of the ordinary differential equations for the flight dynamics.
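As a toy illustration of the interaction the authors study between parameter uncertainty and numerical integration error, the sketch below integrates a hypothetical one-dimensional boost phase with an aleatory initial mass and an interval-valued thrust, at two solver tolerances. The model, distributions, and all numbers are invented for illustration and are not taken from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)
g = 9.81                       # m/s^2

def burnout_velocity(m0, thrust, rtol):
    """Integrate dv/dt = T/m(t) - g over a 5 s boost (hypothetical 1-D model)."""
    mdot = 2.0                 # kg/s propellant flow (assumed)
    def rhs(t, v):
        return thrust / (m0 - mdot * t) - g
    sol = solve_ivp(rhs, (0.0, 5.0), [0.0], rtol=rtol, atol=1e-9)
    return sol.y[0, -1]

# Aleatory: initial mass varies unit to unit; epistemic: thrust known only to an interval.
masses  = rng.normal(100.0, 2.0, size=200)        # kg
thrusts = rng.uniform(1900.0, 2100.0, size=200)   # N, sampled over the interval

v_loose = [burnout_velocity(m, T, rtol=1e-3) for m, T in zip(masses, thrusts)]
v_tight = [burnout_velocity(m, T, rtol=1e-10) for m, T in zip(masses, thrusts)]

print("spread from parameter uncertainty:", np.std(v_tight))
print("max integration-error shift:      ", np.max(np.abs(np.array(v_loose) - v_tight)))
```

Comparing the two printed numbers indicates whether integration error is negligible relative to the uncertainty being propagated, which is the kind of interaction the abstract describes.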

2.
The problem of accounting for epistemic uncertainty in risk management decisions is conceptually straightforward, but is riddled with practical difficulties. Simple approximations are often used whereby future variations in epistemic uncertainty are ignored or worst-case scenarios are postulated. These strategies tend to produce sub-optimal decisions. We develop a general framework based on Bayesian decision theory and exemplify it for the case of seismic design of buildings. When temporal fluctuations of the epistemic uncertainties and regulatory safety constraints are included, the optimal level of seismic protection exceeds the normative level at the time of construction. Optimal Bayesian decisions do not depend on the aleatory or epistemic nature of the uncertainties, but only on the total (epistemic plus aleatory) uncertainty and how that total uncertainty varies randomly during the lifetime of the project.
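A minimal caricature of the Bayesian decision calculation might look as follows: pool epistemic and aleatory draws into one total-uncertainty sample, then pick the design level minimizing expected cost. The demand model, cost coefficients, and distributions are invented assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)

# Epistemic: uncertain mean demand (posterior draws); aleatory: demand scatter.
mu = np.maximum(rng.normal(0.30, 0.05, size=5000), 0.01)  # guard against negative draws
demand = rng.lognormal(np.log(mu), 0.4)                   # pooled total-uncertainty sample

def expected_cost(level):
    build = 1.0 + 4.0 * level                 # construction cost rises with protection level
    p_fail = np.mean(demand > level)          # failure probability under total uncertainty
    return build + 100.0 * p_fail             # 100 = assumed loss given failure

levels = np.linspace(0.2, 1.5, 27)
best = min(levels, key=expected_cost)
print(f"optimal protection level: {best:.2f}")
# Note: only the pooled (total) sample enters the decision, mirroring the
# abstract's point that the aleatory/epistemic split is immaterial here.
```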

3.
The ‘Epistemic Uncertainty Workshop’ sponsored by Sandia National Laboratories was held in Albuquerque, New Mexico, on 6–7 August 2002. The workshop was organized around a set of Challenge Problems involving both epistemic and aleatory uncertainty that the workshop participants were invited to solve and discuss. This concluding article in a special issue of Reliability Engineering and System Safety based on the workshop discusses the intent of the Challenge Problems, summarizes some discussions from the workshop, and provides a technical comparison among the papers in this special issue. The Challenge Problems were computationally simple models that were intended as vehicles for the illustration and comparison of conceptual and numerical techniques for use in analyses that involve: (i) epistemic uncertainty, (ii) aggregation of multiple characterizations of epistemic uncertainty, (iii) combination of epistemic and aleatory uncertainty, and (iv) models with repeated parameters. There was considerable diversity of opinion at the workshop about both methods and fundamental issues, and yet substantial consensus about what the answers to the problems were, and even about how each of the four issues should be addressed. Among the technical approaches advanced were probability theory, Dempster–Shafer evidence theory, random sets, sets of probability measures, imprecise coherent probabilities, coherent lower previsions, probability boxes, possibility theory, fuzzy sets, joint distribution tableaux, polynomial chaos expansions, and info-gap models. Although some participants maintained that a purely probabilistic approach is fully capable of accounting for all forms of uncertainty, most agreed that the treatment of epistemic uncertainty introduces important considerations and that the issues underlying the Challenge Problems are legitimate and significant. Topics identified as meriting additional research include elicitation of uncertainty representations, aggregation of multiple uncertainty representations, dependence and independence, model uncertainty, solution of black-box problems, efficient sampling strategies for computation, and communication of analysis results.

4.
Epistemic uncertainty analysis is an essential feature of any model application subject to ‘state of knowledge’ uncertainties. Such analysis is usually carried out on the basis of a Monte Carlo simulation, sampling the epistemic variables and performing the corresponding model runs. In situations, however, where aleatory uncertainties are also present in the model, an adequate treatment of both types of uncertainties would require a two-stage nested Monte Carlo simulation, i.e. sampling the epistemic variables (‘outer loop’) and nested sampling of the aleatory variables (‘inner loop’). It is clear that for complex and long-running codes the computational effort to perform all the resulting model runs may be prohibitive. Therefore, an approach to approximate epistemic uncertainty analysis is suggested which is based solely on two simple Monte Carlo samples: (a) joint sampling of both epistemic and aleatory variables simultaneously, and (b) sampling of the aleatory variables alone with the epistemic variables held fixed at their reference values. The applications of this approach to dynamic reliability analyses presented in this paper look quite promising and suggest that performing such an approximate epistemic uncertainty analysis is preferable to the alternative of not performing any.
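A schematic contrast between the full two-stage nested simulation and the two-sample shortcut, on an invented toy model (the variance-subtraction step shown is one plausible way to combine the two samples; the paper's actual estimator may differ):

```python
import numpy as np

rng = np.random.default_rng(2)

def model(theta, x):
    """Toy code: theta is the epistemic variable, x the aleatory one."""
    return theta * x + x ** 2

theta_ref = 1.0   # reference value of the epistemic variable

# Full nested analysis: outer epistemic loop, inner aleatory loop.
outer = rng.normal(theta_ref, 0.2, size=100)
inner_means = [model(th, rng.exponential(1.0, 500)).mean() for th in outer]
print("nested: epistemic spread of the mean output =", np.std(inner_means))

# Approximate analysis from just two simple samples:
n = 500
joint = model(rng.normal(theta_ref, 0.2, n), rng.exponential(1.0, n))  # (a) joint sample
aleat = model(theta_ref, rng.exponential(1.0, n))                      # (b) aleatory only
epi_var = max(joint.var() - aleat.var(), 0.0)   # variance attributable to epistemics
print("approximate epistemic std of the output =", np.sqrt(epi_var))
```

The nested version costs 100 x 500 model runs here; the approximation costs 1,000, which is the trade the abstract motivates.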

5.
The following techniques for uncertainty and sensitivity analysis are briefly summarized: Monte Carlo analysis, differential analysis, response surface methodology, Fourier amplitude sensitivity test, Sobol' variance decomposition, and fast probability integration. Desirable features of Monte Carlo analysis in conjunction with Latin hypercube sampling are described in discussions of the following topics: (i) properties of random, stratified and Latin hypercube sampling, (ii) comparisons of random and Latin hypercube sampling, (iii) operations involving Latin hypercube sampling (i.e. correlation control, reweighting of samples to incorporate changed distributions, replicated sampling to test reproducibility of results), (iv) uncertainty analysis (i.e. cumulative distribution functions, complementary cumulative distribution functions, box plots), (v) sensitivity analysis (i.e. scatterplots, regression analysis, correlation analysis, rank transformations, searches for nonrandom patterns), and (vi) analyses involving stochastic (i.e. aleatory) and subjective (i.e. epistemic) uncertainty.
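A minimal sketch of Latin hypercube sampling and the variance reduction it typically buys over simple random sampling (toy integrand, uniform inputs; not tied to any specific study in this listing):

```python
import numpy as np

rng = np.random.default_rng(3)

def latin_hypercube(n, d):
    """n-point Latin hypercube sample on [0,1]^d: each margin is stratified
    into n equal-probability intervals, with exactly one point per interval."""
    samples = np.empty((n, d))
    for j in range(d):
        perm = rng.permutation(n)
        samples[:, j] = (perm + rng.random(n)) / n
    return samples

# Compare estimator variability for E[f(U1, U2)] under random vs LHS designs.
f = lambda u: np.exp(u[:, 0]) + 3.0 * u[:, 1] ** 2
reps = 200
mc  = [f(rng.random((50, 2))).mean() for _ in range(reps)]
lhs = [f(latin_hypercube(50, 2)).mean() for _ in range(reps)]
print("std of plain MC estimate:", np.std(mc))
print("std of LHS estimate     :", np.std(lhs))   # typically noticeably smaller
```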

6.
There will be simplifying assumptions and idealizations in the availability models of complex processes and phenomena. These simplifications and idealizations generate uncertainties which can be classified as aleatory (arising from randomness) and/or epistemic (arising from lack of knowledge). The problem of acknowledging and treating uncertainty is vital for the practical usability of reliability analysis results, and the distinction between the two kinds is useful both for making reliability/risk-informed decisions with confidence and for effective management of uncertainty. In level-1 probabilistic safety assessment (PSA) of nuclear power plants (NPP), the current practice is to carry out epistemic uncertainty analysis on the basis of a simple Monte Carlo simulation that samples the epistemic variables in the model, while aleatory uncertainty is neglected and point estimates of the aleatory variables, viz. time to failure and time to repair, are used. Treatment of both types of uncertainty requires a two-phase Monte Carlo simulation in which an outer loop samples the epistemic variables and an inner loop samples the aleatory variables. A methodology based on two-phase Monte Carlo simulation is presented for distinguishing both kinds of uncertainty in the context of availability/reliability evaluation in level-1 PSA studies of NPP.
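A sketch of the two-phase simulation in this availability setting, with an inner aleatory loop over exponential failure/repair histories and an outer epistemic loop over lognormally distributed rates; all rates and distribution choices are illustrative assumptions, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(4)

def sim_availability(lam, mu, horizon=10_000.0):
    """One aleatory history: alternating exponential up/down times."""
    t, up = 0.0, 0.0
    while t < horizon:
        ttf = rng.exponential(1.0 / lam)        # time to failure
        up += min(ttf, horizon - t)
        t += ttf + rng.exponential(1.0 / mu)    # plus time to repair
    return up / horizon

# Outer (epistemic) loop: rates known only through state-of-knowledge distributions.
lams = rng.lognormal(np.log(1.0e-3), 0.3, 100)   # failure rates [1/h]
mus  = rng.lognormal(np.log(1.0e-1), 0.3, 100)   # repair rates  [1/h]
mean_avail = [np.mean([sim_availability(l, m) for _ in range(20)])  # inner aleatory loop
              for l, m in zip(lams, mus)]
print("epistemic 5th-95th percentile of availability:",
      np.round(np.quantile(mean_avail, [0.05, 0.95]), 4))
```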

7.
In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, “Quantification of Margins and Uncertainties: Conceptual and Computational Basis,” describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.
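Of the alternatives named above, interval analysis is the simplest to sketch: with no distribution assumed over the epistemic box, the output is itself an interval obtained by optimization. The response function and bounds below are invented, and for non-monotone responses a global or vertex-based search would be needed rather than a local optimizer:

```python
import numpy as np
from scipy.optimize import minimize

def response(x):
    """Hypothetical performance measure with two epistemic parameters."""
    return 10.0 - 2.0 * x[0] + x[0] * x[1]

bounds = [(0.5, 1.5), (0.0, 2.0)]    # epistemic intervals, no distributions assumed
lo = minimize(response, x0=[1.0, 1.0], bounds=bounds).fun
hi = -minimize(lambda x: -response(x), x0=[1.0, 1.0], bounds=bounds).fun
print(f"response interval: [{lo:.3f}, {hi:.3f}]")   # all that can be claimed epistemically
```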

8.
The 1996 performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP) maintains a separation between stochastic (i.e. aleatory) and subjective (i.e. epistemic) uncertainty, with stochastic uncertainty arising from the possible disruptions that could occur at the WIPP over the 10,000 yr regulatory period specified by the US Environmental Protection Agency (40 CFR 191, 40 CFR 194) and subjective uncertainty arising from an inability to uniquely characterize many of the inputs required in the 1996 WIPP PA. The characterization of subjective uncertainty is discussed, including assignment of distributions, uncertain variables selected for inclusion in analysis, correlation control, sample size, statistical confidence on mean complementary cumulative distribution functions, generation of Latin hypercube samples, sensitivity analysis techniques, and scenarios involving stochastic and subjective uncertainty.

9.
In 2001, the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy (DOE) in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories) initiated development of a process designated quantification of margins and uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, “Quantification of Margins and Uncertainties: Conceptual and Computational Basis,” describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples. The basic ideas and challenges that underlie NNSA's mandate for QMU are present, and have been successfully addressed, in a number of past analyses for complex systems. To provide perspective on the implementation of a requirement for QMU in the analysis of a complex system, three past analyses are presented as examples: (i) the probabilistic risk assessment carried out for the Surry Nuclear Power Station as part of the U.S. Nuclear Regulatory Commission's (NRC's) reassessment of the risk from commercial nuclear power in the United States (i.e., the NUREG-1150 study), (ii) the performance assessment for the Waste Isolation Pilot Plant carried out by the DOE in support of a successful compliance certification application to the U.S. Environmental Protection Agency, and (iii) the performance assessment for the proposed high-level radioactive waste repository at Yucca Mountain, Nevada, carried out by the DOE in support of a license application to the NRC. Each of the preceding analyses involved a detailed treatment of uncertainty and produced results used to establish compliance with specific numerical requirements on the performance of the system under study. As a result, these studies illustrate the determination of both margins and the uncertainty in margins in real analyses.

10.
The 1996 performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP) maintains a separation between stochastic (i.e. aleatory) and subjective (i.e. epistemic) uncertainty, with stochastic uncertainty arising from the possible disruptions that could occur at the WIPP over the 10,000-yr regulatory period specified by the US Environmental Protection Agency (40 CFR 191, 40 CFR 194) and subjective uncertainty arising from an inability to uniquely characterize many of the inputs required in the 1996 WIPP PA. The characterization of stochastic uncertainty is discussed, including drilling intrusion time, drilling location, penetration of excavated/nonexcavated areas of the repository, penetration of pressurized brine beneath the repository, borehole plugging patterns, activity level of waste, and occurrence of potash mining. Additional topics discussed include sampling procedures, generation of individual 10,000-yr futures for the WIPP, construction of complementary cumulative distribution functions (CCDFs), mechanistic calculations carried out to support CCDF construction, the Kaplan/Garrick ordered triple representation for risk, and determination of scenarios and scenario probabilities.
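CCDF construction from sampled futures is essentially a sort; a minimal sketch with fabricated release values standing in for the mechanistic calculations:

```python
import numpy as np

rng = np.random.default_rng(5)

# Stand-ins for normalized releases computed for 1000 sampled 10,000-yr futures.
releases = rng.lognormal(mean=-3.0, sigma=1.2, size=1000)

def ccdf(values):
    """Empirical complementary CDF: P(Release > r) at each observed r."""
    r = np.sort(values)
    return r, 1.0 - np.arange(1, r.size + 1) / r.size

r, p = ccdf(releases)
for level in (0.1, 1.0):
    print(f"P(Release > {level}) ~ {np.interp(level, r, p):.4f}")
```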

11.
Distribution Envelope Determination (DEnv) is a method for computing the CDFs of random variables whose samples are a function of samples of other random variable(s), termed inputs. DEnv computes envelopes around these CDFs when there is uncertainty about the precise form of the probability distribution describing any input. For example, inputs whose distribution functions have means and variances known only to within intervals can be handled. More generally, inputs can be handled if the set of all plausible cumulative distributions describing each input can be enclosed between left and right envelopes. Results will typically be in the form of envelopes when inputs are envelopes, when the dependency relationship of the inputs is unspecified, or both. For example in the case of specific input distribution functions with unspecified dependency relationships, each of the infinite number of possible dependency relationships would imply some specific output distribution, and the set of all such output distributions can be bounded with envelopes. The DEnv algorithm is a way to obtain the bounding envelopes. DEnv is implemented in a tool which is used to solve problems from a benchmark set.
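The envelope idea can be sketched for a single input whose distribution is known only up to an interval-valued mean; for a monotone output function the input envelopes carry over directly. This is a simplified stand-in for intuition, not the DEnv algorithm itself:

```python
import numpy as np
from scipy.stats import norm

# Input X ~ Normal(mu, 1), with mu known only to lie in [0, 1].
x = np.linspace(-4.0, 6.0, 400)
cdfs = np.array([norm.cdf(x, loc=mu) for mu in np.linspace(0.0, 1.0, 21)])
upper, lower = cdfs.max(axis=0), cdfs.min(axis=0)   # left/right envelopes of F_X

# Y = exp(X) is monotone, so the envelopes transfer to y = exp(x) directly.
y = np.exp(x)
print("P(Y <= 1) lies in [%.3f, %.3f]"
      % (np.interp(1.0, y, lower), np.interp(1.0, y, upper)))
```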

12.
The paper describes an approach to representing, aggregating and propagating aleatory and epistemic uncertainty through computational models. The framework for the approach employs the theory of imprecise coherent probabilities. The approach is exemplified by a simple algebraic system, the inputs of which are uncertain. Six different uncertainty situations are considered, including mixtures of epistemic and aleatory uncertainty.

13.
Effects of uncertainties in gas damping models, geometry and mechanical properties on the dynamics of a micro-electro-mechanical systems (MEMS) capacitive switch are studied. A sample of typical capacitive switches was fabricated and characterized at Purdue University. High-fidelity simulations of gas damping on planar microbeams are developed and verified under relevant conditions. This and other gas damping models are then applied to study the dynamics of a single closing event for switches with experimentally measured properties. It is demonstrated that although all damping models considered predict similar damping quality factors and agree well in their predictions of closing time, the models differ by a factor of two or more in predicting the impact velocity and acceleration at contact. Implications of parameter uncertainties for key reliability-related parameters such as the pull-in voltage, closing time and impact velocity are discussed. A notable effect of uncertainty is that the nominal switch, i.e. the switch with average properties, does not actuate at the mean actuation voltage. Additionally, device-to-device variability leads to significant differences in dynamics. For example, the mean impact velocity for switches actuated at the 90%-actuation voltage (about 150 V), i.e. the voltage required to actuate 90% of the sample, is about 129 cm/s and increases to 173 cm/s at the 99%-actuation voltage (about 173 V). Response surfaces of impact velocity and closing time as functions of five input variables were constructed using the Smolyak sparse grid algorithm. The sensitivity analysis showed that the impact velocity is most sensitive to the damping coefficient, whereas the closing time is most affected by geometric parameters such as gap and beam thickness.

14.
The Epistemic Uncertainty Project of Sandia National Laboratories (NM, USA) proposed two challenge problems intended to assess the applicability and the relevant merits of modern mathematical theories of uncertainty in reliability engineering and risk analysis. This paper proposes a solution to Problem B: the response of a mechanical system with uncertain parameters. Random Set Theory is used to cope with both imprecision and dissonance affecting the available information. Imprecision results in an envelope of CDFs of the system response bounded by an upper CDF and a lower CDF. Different types of parameter discretizations are introduced. It is shown that: (i) when the system response presents extrema in the range of parameters considered, it is better to increase the fineness of the discretization than to invoke a global optimization tool; (ii) the response expectation differed by less than 0.5% when the number of function calls was increased 15.7 times; (iii) larger differences (4–5%) were obtained for the lower tails of the CDFs of the response. Further research is necessary to investigate (i) parameter discretizations aimed at increasing the accuracy of the CDFs (lower) tails; (ii) the role of correlation in combining information.

15.
In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. This presentation discusses and illustrates the conceptual and computational basis of QMU in analyses that use computational models to predict the behavior of complex systems. The following topics are considered: (i) the role of aleatory and epistemic uncertainty in QMU, (ii) the representation of uncertainty with probability, (iii) the probabilistic representation of uncertainty in QMU analyses involving only epistemic uncertainty, and (iv) the probabilistic representation of uncertainty in QMU analyses involving aleatory and epistemic uncertainty.
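The core QMU bookkeeping, a margin between the best-estimate performance and a requirement divided by a measure of its uncertainty, can be caricatured in a few lines. The threshold, distribution, and the 3-sigma choice of U below are illustrative assumptions, not the presentation's definitions:

```python
import numpy as np

rng = np.random.default_rng(6)

threshold = 100.0                             # assumed requirement: performance must stay below this
perf = rng.normal(80.0, 5.0, size=10_000)     # sampled performance (aleatory + epistemic pooled)

margin = threshold - np.mean(perf)            # M: distance from best estimate to requirement
uncertainty = 3.0 * np.std(perf)              # U: an illustrative 3-sigma width
print(f"M = {margin:.1f}, U = {uncertainty:.1f}, M/U = {margin / uncertainty:.2f}")
# M/U > 1 is commonly read as the margin exceeding its uncertainty.
```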

16.
In order to overcome the disadvantages of traditional deterministic models, a probabilistic bond strength model of reinforcement bar in concrete was presented. According to the partly cracked thick-walled cylinder model, a deterministic bond strength model was first developed by taking into account the influences of various important factors. The analytical expression of the probabilistic bond strength model was then derived by taking into consideration both aleatory and epistemic uncertainties. Subsequently, the probabilistic model was completed by determining the statistical characteristics of its parameters based on the Markov Chain Monte Carlo method and Bayesian theory. Finally, the applicability of the proposed probabilistic model was validated by comparison with 400 sets of experimental data and four typical deterministic bond strength models. The analysis shows that the probabilistic model provides an efficient approach to describing the probabilistic characteristics of bond strength and to calibrating traditional deterministic bond strength models.
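The calibration step, MCMC sampling of a probabilistic correction to a deterministic strength model, can be sketched with a one-parameter random-walk Metropolis sampler; the deterministic model, data, and noise level are synthetic, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(7)

def det_model(x):
    """Stand-in deterministic bond-strength model (MPa)."""
    return 2.0 + 1.5 * x

# Synthetic 'experiments': true multiplicative bias 1.2, lognormal scatter.
x_obs = rng.uniform(0.5, 3.0, 60)
y_obs = 1.2 * det_model(x_obs) * rng.lognormal(0.0, 0.1, 60)

def log_post(theta):
    """Posterior of the bias factor theta with a flat prior on (0, inf)."""
    if theta <= 0:
        return -np.inf
    resid = np.log(y_obs) - np.log(theta * det_model(x_obs))
    return -0.5 * np.sum((resid / 0.1) ** 2)

chain, theta = [], 1.0
for _ in range(5000):                      # random-walk Metropolis
    prop = theta + 0.02 * rng.normal()
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)
post = np.array(chain[1000:])              # discard burn-in
print(f"bias factor: mean {post.mean():.3f}, 95% interval "
      f"[{np.quantile(post, .025):.3f}, {np.quantile(post, .975):.3f}]")
```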

17.
Vibration control of high-order systems with small-interval parameter uncertainties
To address the difficult problem of robust H∞ vibration control for complex high-order uncertain systems, and building on research into model-order reduction for systems with small-interval parameter uncertainties, a complete research strategy and analysis method is presented, covering construction of the reduced-order plant, order reduction of the uncertain system, solution of the robust controller, and verification of robustness. To keep the whole analysis and solution process tractable, the root cause of the difficulty in computing the first-order partial derivatives of the high-order matrix exponential function is analyzed, a solution strategy based on numerical differentiation is given, and a decomposition method for a class of low-dimensional uncertainty matrices is provided. Finally, a numerical example demonstrates the reasonableness and feasibility of the whole analysis process and solution method.
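The numerical-differentiation workaround mentioned in the abstract, approximating the parameter derivative of a matrix exponential by finite differences rather than deriving it analytically, can be sketched as follows (toy 2×2 state matrix; the paper's actual scheme may differ):

```python
import numpy as np
from scipy.linalg import expm

def A(p):
    """Toy state matrix depending on an uncertain parameter p."""
    return np.array([[0.0, 1.0],
                     [-4.0 - p, -0.4]])

def d_expm_dp(p, t, h=1e-6):
    """Central-difference approximation of d/dp exp(A(p) t)."""
    return (expm(A(p + h) * t) - expm(A(p - h) * t)) / (2.0 * h)

print(d_expm_dp(p=0.5, t=1.0))
```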

18.
Three applications of sampling-based sensitivity analysis in conjunction with evidence theory representations for epistemic uncertainty in model inputs are described: (i) an initial exploratory analysis to assess model behavior and provide insights for additional analysis; (ii) a stepwise analysis showing the incremental effects of uncertain variables on complementary cumulative belief functions and complementary cumulative plausibility functions; and (iii) a summary analysis showing a spectrum of variance-based sensitivity analysis results that derive from probability spaces that are consistent with the evidence space under consideration.
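For orientation, belief and plausibility of an event are the total mass of the focal elements contained in it and intersecting it, respectively; a minimal sketch with invented focal elements:

```python
# Evidence space for an uncertain input: focal elements (intervals) with BPAs.
focal = [((0.0, 2.0), 0.5), ((1.0, 3.0), 0.3), ((2.5, 4.0), 0.2)]

def bel_pl(lo, hi):
    """Belief/plausibility that the input lies in [lo, hi]."""
    bel = sum(m for (a, b), m in focal if lo <= a and b <= hi)   # focal element contained
    pl  = sum(m for (a, b), m in focal if b >= lo and a <= hi)   # focal element intersects
    return bel, pl

print(bel_pl(0.0, 3.0))   # -> (0.8, 1.0)
```

Sweeping the event over a grid of thresholds yields exactly the complementary cumulative belief and plausibility functions the abstract refers to.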

19.
This paper develops a novel computational framework to compute the Sobol indices that quantify the relative contributions of various uncertainty sources towards the system response prediction uncertainty. In the presence of both aleatory and epistemic uncertainty, two challenges are addressed in this paper for the model-based computation of the Sobol indices: due to data uncertainty, input distributions are not precisely known; and due to model uncertainty, the model output is uncertain even for a fixed realization of the input. An auxiliary variable method based on the probability integral transform is introduced to distinguish and represent each uncertainty source explicitly, whether aleatory or epistemic. The auxiliary variables facilitate building a deterministic relationship between the uncertainty sources and the output, which is needed in the Sobol indices computation. The proposed framework is developed for two types of model inputs: random variable input and time series input. A Bayesian autoregressive moving average (ARMA) approach is chosen to model the time series input due to its capability to represent both natural variability and epistemic uncertainty due to limited data. A novel controlled-seed computational technique based on pseudo-random number generation is proposed to efficiently represent the natural variability in the time series input. This controlled-seed method significantly accelerates the Sobol indices computation under time series input, and makes it computationally affordable.
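The paper's auxiliary-variable framework is more elaborate, but the basic pick-freeze Sobol estimator it builds on can be sketched on the standard Ishigami test function:

```python
import numpy as np

rng = np.random.default_rng(8)

def model(X):
    """Ishigami test function of three independent inputs on [-pi, pi]."""
    return (np.sin(X[:, 0]) + 7.0 * np.sin(X[:, 1]) ** 2
            + 0.1 * X[:, 2] ** 4 * np.sin(X[:, 0]))

n, d = 100_000, 3
A = rng.uniform(-np.pi, np.pi, (n, d))
B = rng.uniform(-np.pi, np.pi, (n, d))
yA, yB = model(A), model(B)
var = yA.var()

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                          # 'pick-freeze': replace only column i
    Si = np.mean(yB * (model(ABi) - yA)) / var   # first-order Sobol index estimate
    print(f"S_{i + 1} ~ {Si:.3f}")               # analytic values: 0.314, 0.442, 0.0
```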

20.
This paper presents the results of a study on the response of structures with uncertain properties such as mass, stiffness and damping. Both the effect of the uncertain parameters on the response and the effect of how the uncertainties are modelled are investigated. In particular, two types of uncertainties are distinguished, random and fuzzy, and two kinds of models are studied, probabilistic and fuzzy set models. The two approaches to uncertainty modelling are compared with regard to their impact on the analysis and on the uncertain structural response obtained. The study considers free vibration, forced vibration with deterministic excitation, and forced vibration with Gaussian white noise excitation. It is concluded that, in general, fuzzy models are much easier to implement, and the associated analysis easier to perform, than their probabilistic counterparts. When the available data on the structural parameters are crude and do not support a rigorous probabilistic model, the fuzzy set approach should be considered in view of its simplicity.
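The fuzzy-set route the paper favors can be illustrated on a single-degree-of-freedom oscillator: triangular fuzzy mass and stiffness, alpha-cut interval arithmetic, and a fuzzy natural frequency. All numbers are invented:

```python
import numpy as np
from itertools import product

def tri_cut(a, b, c, alpha):
    """Alpha-cut interval of a triangular fuzzy number (a, b, c)."""
    return a + alpha * (b - a), c - alpha * (c - b)

# Fuzzy stiffness [N/m] and mass [kg]; natural frequency w = sqrt(k/m).
for alpha in (0.0, 0.5, 1.0):
    k_lo, k_hi = tri_cut(900.0, 1000.0, 1100.0, alpha)
    m_lo, m_hi = tri_cut(1.8, 2.0, 2.2, alpha)
    # Vertex method: evaluate at all interval endpoints (valid here by monotonicity).
    w = [np.sqrt(k / m) for k, m in product((k_lo, k_hi), (m_lo, m_hi))]
    print(f"alpha={alpha:.1f}: omega in [{min(w):.2f}, {max(w):.2f}] rad/s")
```

Each alpha-cut requires only a handful of deterministic analyses, which is the simplicity advantage the conclusion points to.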
