Similar Documents
20 similar documents found (search time: 15 ms)
1.
The ‘Epistemic Uncertainty Workshop’ sponsored by Sandia National Laboratories was held in Albuquerque, New Mexico, on 6–7 August 2002. The workshop was organized around a set of Challenge Problems involving both epistemic and aleatory uncertainty that the workshop participants were invited to solve and discuss. This concluding article in a special issue of Reliability Engineering and System Safety based on the workshop discusses the intent of the Challenge Problems, summarizes some discussions from the workshop, and provides a technical comparison among the papers in this special issue. The Challenge Problems were computationally simple models that were intended as vehicles for the illustration and comparison of conceptual and numerical techniques for use in analyses that involve: (i) epistemic uncertainty, (ii) aggregation of multiple characterizations of epistemic uncertainty, (iii) combination of epistemic and aleatory uncertainty, and (iv) models with repeated parameters. There was considerable diversity of opinion at the workshop about both methods and fundamental issues, and yet substantial consensus about what the answers to the problems were, and even about how each of the four issues should be addressed. Among the technical approaches advanced were probability theory, Dempster–Shafer evidence theory, random sets, sets of probability measures, imprecise coherent probabilities, coherent lower previsions, probability boxes, possibility theory, fuzzy sets, joint distribution tableaux, polynomial chaos expansions, and info-gap models. Although some participants maintained that a purely probabilistic approach is fully capable of accounting for all forms of uncertainty, most agreed that the treatment of epistemic uncertainty introduces important considerations and that the issues underlying the Challenge Problems are legitimate and significant. 
Topics identified as meriting additional research include elicitation of uncertainty representations, aggregation of multiple uncertainty representations, dependence and independence, model uncertainty, solution of black-box problems, efficient sampling strategies for computation, and communication of analysis results.

2.
Performance assessment of complex systems is ideally done through full system-level testing, which is seldom available for high-consequence systems. Further, a reality of engineering practice is that some features of system behavior are known not from experimental data but only from expert assessment. On the other hand, data on the individual components that make up the full system are more readily available. The lack of system-level data and the complexity of the system lead to a need to build computational models of the system in a hierarchical, or building-block, approach (from simple components to the full system). The models are then used for performance prediction in lieu of experiments, to estimate the confidence in the performance of these systems. Central to this is the need to quantify the uncertainties present in the system and to compare the system response to an expected performance measure. This is the basic idea behind Quantification of Margins and Uncertainties (QMU). QMU is applied in decision making, where many uncertainties arise from inherent variability (aleatory) in materials, configurations, environments, etc., and from lack of information (epistemic) in the models for the deterministic and random variables that influence system behavior and performance. This paper proposes a methodology to quantify margins and uncertainty in the presence of both aleatory and epistemic uncertainty. It presents a framework based on Bayes networks to use available data at multiple levels of complexity (i.e. components, subsystems, etc.) and demonstrates a method to incorporate epistemic uncertainty given in terms of intervals on a model parameter.
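The margin-and-uncertainty metric at the heart of QMU can be sketched in a few lines. The following is a minimal illustration, not the paper's Bayes-network framework: all numbers and the function name `qmu_confidence_factor` are hypothetical, and the particular way of combining aleatory spread with an epistemic interval half-width is just one common convention.

```python
def qmu_confidence_factor(threshold, mean_response, aleatory_std,
                          epistemic_halfwidth, k=3.0):
    """Confidence factor M/U for a 'smaller is better' requirement.

    M = threshold - mean_response                (design margin)
    U = k*aleatory_std + epistemic_halfwidth     (one simple, conservative
        way to combine the two uncertainty types; conventions vary)
    """
    margin = threshold - mean_response
    uncertainty = k * aleatory_std + epistemic_halfwidth
    return margin / uncertainty

# Hypothetical system: response must stay below 100 units.
cf = qmu_confidence_factor(threshold=100.0, mean_response=70.0,
                           aleatory_std=5.0, epistemic_halfwidth=5.0)
print(f"M/U = {cf:.2f}")   # margin 30, uncertainty 20, so M/U = 1.5
```

A confidence factor above 1 indicates that the margin exceeds the combined uncertainty; in practice the component uncertainties would come from the hierarchical model rather than being supplied directly.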

3.
Effects of uncertainties in gas damping models, geometry and mechanical properties on the dynamics of micro-electro-mechanical systems (MEMS) capacitive switches are studied. A sample of typical capacitive switches has been fabricated and characterized at Purdue University. High-fidelity simulations of gas damping on planar microbeams are developed and verified under relevant conditions. This and other gas damping models are then applied to study the dynamics of a single closing event for switches with experimentally measured properties. It is demonstrated that although all damping models considered predict a similar damping quality factor and agree well in their predictions of closing time, they differ by a factor of two or more in predicting the impact velocity and acceleration at contact. Implications of parameter uncertainties for key reliability-related parameters such as the pull-in voltage, closing time and impact velocity are discussed. A notable effect of uncertainty is that the nominal switch, i.e. the switch with the average properties, does not actuate at the mean actuation voltage. Additionally, device-to-device variability leads to significant differences in dynamics. For example, the mean impact velocity for switches actuated at the 90%-actuation voltage (about 150 V), i.e. the voltage required to actuate 90% of the sample, is about 129 cm/s and increases to 173 cm/s at the 99%-actuation voltage (about 173 V). Response surfaces of impact velocity and closing time with respect to five input variables were constructed using the Smolyak sparse grid algorithm. The sensitivity analysis showed that impact velocity is most sensitive to the damping coefficient, whereas closing time is most affected by geometric parameters such as the gap and beam thickness.

4.
Stochastic models that incorporate the actual uncertainty occurring in product cataloging are considered for choosing primary cataloging objects for federal government purposes. __________ Translated from Izmeritel’naya Tekhnika, No. 2, pp. 19–22, February, 2007.

5.
The risk assessment community has begun to make a clear distinction between aleatory and epistemic uncertainty in theory and in practice. Aleatory uncertainty is also referred to in the literature as variability, irreducible uncertainty, inherent uncertainty, and stochastic uncertainty. Epistemic uncertainty is also termed reducible uncertainty, subjective uncertainty, and state-of-knowledge uncertainty. Methods to efficiently represent, aggregate, and propagate different types of uncertainty through computational models are clearly of vital importance. The most widely known and developed methods are available within the mathematics of probability theory, whether frequentist or subjectivist. Newer mathematical approaches, which extend or otherwise depart from probability theory, are also available, and are sometimes referred to as generalized information theory (GIT). For example, possibility theory, fuzzy set theory, and evidence theory are three components of GIT. To try to develop a better understanding of the relative advantages and disadvantages of traditional and newer methods and encourage a dialog between the risk assessment, reliability engineering, and GIT communities, a workshop was held. To focus discussion and debate at the workshop, a set of prototype problems, generally referred to as challenge problems, was constructed. The challenge problems concentrate on the representation, aggregation, and propagation of epistemic uncertainty and mixtures of epistemic and aleatory uncertainty through two simple model systems. This paper describes the challenge problems and gives numerical values for the different input parameters so that results from different investigators can be directly compared.

6.
The indication error of a materials testing machine is first analyzed, and two mathematical models for error analysis are constructed according to the different verification methods employed. Based on one of these models, the sources of uncertainty are identified; the principal sources are found to be three: the standard force-measuring instrument, the non-repeatability of the measured data, and the resolution of the indicating device. Each source is then quantified, and the combined uncertainty is synthesized, yielding a rational evaluation method. Finally, the method is illustrated with a worked example.

7.
Model structure uncertainty, originating from the assumptions and idealizations made during modelling, is a form of uncertainty that is often hard to quantify. In this article, the authors propose and demonstrate a method, the inductive design exploration method (IDEM), which facilitates robust design in the presence of model structure uncertainty. The method achieves robustness by compromising between the degree of system performance and the degree of reliability, based on the structure uncertainty associated with the system models (i.e. the models for performances and constraints). The main strategies in the IDEM are: (i) identifying feasible ranged sets of the design space, rather than a single (or optimized) design solution, considering all types of quantifiable uncertainty and (ii) systematically compromising target achievement with provision for potential uncertainty. The IDEM is successfully demonstrated on a clay-filled polyethylene cantilever beam design example, a simple but representative instance of integrated materials and product design problems.

8.
Bayesian uncertainty analysis with applications to turbulence modeling
In this paper, we apply Bayesian uncertainty quantification techniques to the processes of calibrating complex mathematical models and predicting quantities of interest (QoIs) with such models. These techniques also enable the systematic comparison of competing model classes. The processes of calibration and comparison constitute the building blocks of a larger validation process, the goal of which is to accept or reject a given mathematical model for the prediction of a particular QoI for a particular scenario. In this work, we take the first step in this process by applying the methodology to the analysis of the Spalart-Allmaras turbulence model in the context of incompressible, boundary layer flows. Three competing model classes based on the Spalart-Allmaras model are formulated, calibrated against experimental data, and used to issue a prediction with quantified uncertainty. The model classes are compared in terms of their posterior probabilities and their predictions of QoIs. The model posterior probability represents the relative plausibility of a model class given the data; thus, it incorporates the model's ability to fit experimental observations. Alternatively, comparing models using the predicted QoI connects the process to the needs of decision makers who use the results of the model. We show that by using both the model plausibility and the predicted QoI, one has the opportunity to reject some model classes after calibration, before subjecting the remaining classes to additional validation challenges.
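The core of such a model-class comparison is Bayes' theorem over a discrete set of classes: P(M_j | D) is proportional to the evidence P(D | M_j) times the prior P(M_j). A minimal sketch follows; the function name and the example log-evidence values are hypothetical, not taken from the paper.

```python
import math

def posterior_model_probs(log_evidences, log_priors=None):
    """Posterior probability of each model class via Bayes' theorem:
    P(M_j | D) ∝ P(D | M_j) P(M_j), normalized stably in log space."""
    n = len(log_evidences)
    if log_priors is None:                       # uniform prior over classes
        log_priors = [math.log(1.0 / n)] * n
    log_post = [le + lp for le, lp in zip(log_evidences, log_priors)]
    m = max(log_post)                            # log-sum-exp trick
    z = sum(math.exp(lp - m) for lp in log_post)
    return [math.exp(lp - m) / z for lp in log_post]

# Three hypothetical model classes whose log-evidences differ by 1 and 3.
probs = posterior_model_probs([-10.0, -11.0, -13.0])
print([round(p, 3) for p in probs])
```

In a real calibration the log-evidences would come from integrating the likelihood over each class's parameter prior, which is typically the expensive step.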

9.
Microstructural models of soft-tissue deformation are important in applications including artificial tissue design and surgical planning. The basis of these models, and their advantage over their phenomenological counterparts, is that they incorporate parameters that are directly linked to the tissue’s microscale structure and constitutive behaviour and can therefore be used to predict the effects of structural changes to the tissue. Although studies have attempted to determine such parameters using diverse, state-of-the-art, experimental techniques, values ranging over several orders of magnitude have been reported, leading to uncertainty in the true parameter values and creating a need for models that can handle such uncertainty. We derive a new microstructural, hyperelastic model for transversely isotropic soft tissues and use it to model the mechanical behaviour of tendons. To account for parameter uncertainty, we employ a Bayesian approach and apply an adaptive Markov chain Monte Carlo algorithm to determine posterior probability distributions for the model parameters. The obtained posterior distributions are consistent with parameter measurements previously reported and enable us to quantify the uncertainty in their values for each tendon sample that was modelled. This approach could serve as a prototype for quantifying parameter uncertainty in other soft tissues.
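The workhorse behind such posterior estimation is Markov chain Monte Carlo. The sketch below is a generic random-walk Metropolis sampler with a crude step-size adaptation during burn-in, standing in for the paper's (unspecified here) adaptive algorithm; the toy Gaussian target and all tuning constants are illustrative assumptions.

```python
import math
import random

def metropolis(logpost, x0, n=20000, step=1.0, adapt=True, seed=1):
    """Random-walk Metropolis with simple step-size adaptation.
    Adaptation runs only during the first half (burn-in), so the
    retained second-half chain uses a fixed proposal."""
    rng = random.Random(seed)
    x, lp = x0, logpost(x0)
    samples = []
    for i in range(n):
        y = x + rng.gauss(0.0, step)
        lq = logpost(y)
        if math.log(rng.random()) < lq - lp:   # Metropolis accept test
            x, lp = y, lq
            if adapt and i < n // 2:
                step *= 1.02                   # accepted: widen proposals
        elif adapt and i < n // 2:
            step *= 0.98                       # rejected: narrow proposals
        if i >= n // 2:                        # keep post burn-in draws only
            samples.append(x)
    return samples

# Toy 1-D posterior: Gaussian with mean 2.0 and unit variance.
draws = metropolis(lambda t: -0.5 * (t - 2.0) ** 2, x0=0.0)
mean = sum(draws) / len(draws)
print(f"posterior mean estimate: {mean:.2f}")
```

For a real hyperelastic model, `logpost` would combine the data likelihood under the constitutive model with priors on the microstructural parameters, and the chain would run over a parameter vector rather than a scalar.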

10.
The paper describes an approach to representing, aggregating and propagating aleatory and epistemic uncertainty through computational models. The framework for the approach employs the theory of imprecise coherent probabilities. The approach is exemplified by a simple algebraic system, the inputs of which are uncertain. Six different uncertainty situations are considered, including mixtures of epistemic and aleatory uncertainty.

11.
Error and uncertainty in modeling and simulation
This article develops a general framework for identifying error and uncertainty in computational simulations that deal with the numerical solution of a set of partial differential equations (PDEs). A comprehensive, new view of the general phases of modeling and simulation is proposed, consisting of the following phases: conceptual modeling of the physical system, mathematical modeling of the conceptual model, discretization and algorithm selection for the mathematical model, computer programming of the discrete model, numerical solution of the computer program model, and representation of the numerical solution. Our view incorporates the modeling and simulation phases that are recognized in the systems engineering and operations research communities, but it adds phases that are specific to the numerical solution of PDEs. In each of these phases, general sources of uncertainty, both aleatory and epistemic, and error are identified. Our general framework is applicable to any numerical discretization procedure for solving ODEs or PDEs. To demonstrate this framework, we describe a system-level example: the flight of an unguided, rocket-boosted, aircraft-launched missile. This example is discussed in detail at each of the six phases of modeling and simulation. Two alternative models of the flight dynamics are considered, along with aleatory uncertainty of the initial mass of the missile and epistemic uncertainty in the thrust of the rocket motor. We also investigate the interaction of modeling uncertainties and numerical integration error in the solution of the ordinary differential equations for the flight dynamics.

12.
Decision-makers have been shown to rely on probabilistic models for perception and action. However, these models can be incorrect or partially wrong in which case the decision-maker has to cope with model uncertainty. Model uncertainty has recently also been shown to be an important determinant of sensorimotor behaviour in humans that can lead to risk-sensitive deviations from Bayes optimal behaviour towards worst-case or best-case outcomes. Here, we investigate the effect of model uncertainty on cooperation in sensorimotor interactions similar to the stag-hunt game, where players develop models about the other player and decide between a pay-off-dominant cooperative solution and a risk-dominant, non-cooperative solution. In simulations, we show that players who allow for optimistic deviations from their opponent model are much more likely to converge to cooperative outcomes. We also implemented this agent model in a virtual reality environment, and let human subjects play against a virtual player. In this game, subjects' pay-offs were experienced as forces opposing their movements. During the experiment, we manipulated the risk sensitivity of the computer player and observed human responses. We found not only that humans adaptively changed their level of cooperation depending on the risk sensitivity of the computer player but also that their initial play exhibited characteristic risk-sensitive biases. Our results suggest that model uncertainty is an important determinant of cooperation in two-player sensorimotor interactions.

13.
Estimating uncertainty in model predictions is a central task in quantitative biology. Biological models at the single-cell level are intrinsically stochastic and nonlinear, creating formidable challenges for their statistical estimation which inevitably has to rely on approximations that trade accuracy for tractability. Despite intensive interest, a sweet spot in this trade-off has not been found yet. We propose a flexible procedure for uncertainty quantification in a wide class of reaction networks describing stochastic gene expression including those with feedback. The method is based on creating a tractable coarse-graining of the model that is learned from simulations, a synthetic model, to approximate the likelihood function. We demonstrate that synthetic models can substantially outperform state-of-the-art approaches on a number of non-trivial systems and datasets, yielding an accurate and computationally viable solution to uncertainty quantification in stochastic models of gene expression.

14.
Uncertainty, probability and information-gaps
This paper discusses two main ideas. First, we focus on info-gap uncertainty, as distinct from probability. Info-gap theory is especially suited for modelling and managing uncertainty in system models: we invest all our knowledge in formulating the best possible model; this leaves the modeller with very faulty and fragmentary information about the variation of reality around that optimal model. Second, we examine the interdependence between uncertainty modelling and decision-making. Good uncertainty modelling requires contact with the end-use, namely, with the decision-making application of the uncertainty model. The most important avenue of uncertainty-propagation is from initial data- and model-uncertainties into uncertainty in the decision-domain. Two questions arise. Is the decision robust to the initial uncertainties? Is the decision prone to opportune windfall success? We apply info-gap robustness and opportunity functions to the analysis of representation and propagation of uncertainty in several of the Sandia Challenge Problems.
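The info-gap robustness question ("how wrong can my model be before the decision fails?") can be sketched numerically. Below is an illustrative toy, not the paper's analysis: the linear performance function, the nominal coefficient and the critical value are all hypothetical, and the bisection assumes the worst case sits at an endpoint of the uncertainty interval (true here because the performance function is monotone).

```python
def robustness(perf, nominal_u, critical, h_max=10.0, tol=1e-6):
    """Info-gap robustness: the largest horizon h such that the worst
    case of perf over u in [nominal_u - h, nominal_u + h] still
    satisfies perf <= critical. Found by bisection on h."""
    def worst(h):
        return max(perf(nominal_u - h), perf(nominal_u + h))
    if worst(0.0) > critical:        # even the nominal model fails
        return 0.0
    lo, hi = 0.0, h_max
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if worst(mid) <= critical:   # requirement still met: push h up
            lo = mid
        else:
            hi = mid
    return lo

# Hypothetical model: response y = 3u, requirement y <= 12, best
# estimate u~ = 2.  Analytically the robustness is 12/3 - 2 = 2.
h_hat = robustness(lambda u: 3.0 * u, nominal_u=2.0, critical=12.0)
print(f"robustness h^ = {h_hat:.3f}")
```

A mirror-image "opportunity" function (the smallest horizon at which a windfall outcome becomes possible) can be computed the same way with the inequality reversed.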

15.
Two squeeze-film gas damping models are proposed to quantify uncertainties associated with the gap size and the ambient pressure. Modeling of gas damping has become a subject of increased interest in recent years due to its importance in micro-electro-mechanical systems (MEMS). In addition to the need for gas damping models for the design of MEMS with movable micro-structures, knowledge of parameter dependence in gas damping contributes to the understanding of device-level reliability. In this work, two damping models quantifying the uncertainty in parameters are generated based on rarefied flow simulations. One is a generalized polynomial chaos (gPC) model, which is a general strategy for uncertainty quantification, and the other is a compact model developed specifically for this problem in an earlier work. Convergence and statistical analyses have been conducted to verify both models. By taking the gap size and ambient pressure as random fields with known probability distribution functions (PDFs), the output PDF for the damping coefficient can be obtained. The first four central moments are used in comparisons of the resulting non-parametric distributions. A good agreement has been found, within 1% relative difference for the damping coefficient mean values. In the study of geometric uncertainty, it is found that the average damping coefficient can deviate by up to 13% from the damping coefficient corresponding to the average gap size. The difference is significant in the nonlinear region where the flow is in the slip or transitional rarefied regime. Copyright © 2010 John Wiley & Sons, Ltd.
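The gPC idea named above is to expand a random output in polynomials orthogonal with respect to the input distribution and read the output statistics off the coefficients. A self-contained toy (one standard-normal input, probabilists' Hermite basis, hardcoded 3-point Gauss-Hermite quadrature; the quadratic "response" is an illustrative stand-in, not the papers' damping model):

```python
import math

# Probabilists' Hermite polynomials He_0..He_2, orthogonal under N(0,1).
HERMITE = [lambda x: 1.0, lambda x: x, lambda x: x * x - 1.0]

# 3-point Gauss-Hermite rule for N(0,1): exact for polynomials of degree <= 5.
NODES = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
WEIGHTS = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

def gpc_coefficients(f):
    """Project f(X), X ~ N(0,1), onto He_k:  a_k = E[f(X) He_k(X)] / k!."""
    coeffs = []
    for k, he in enumerate(HERMITE):
        ek = sum(w * f(x) * he(x) for x, w in zip(NODES, WEIGHTS))
        coeffs.append(ek / math.factorial(k))
    return coeffs

def gpc_mean_var(coeffs):
    """By orthogonality: mean = a_0, variance = sum_{k>=1} a_k^2 * k!."""
    mean = coeffs[0]
    var = sum(a * a * math.factorial(k) for k, a in enumerate(coeffs) if k > 0)
    return mean, var

# Toy response y = x^2 of a standard-normal input.
a = gpc_coefficients(lambda x: x * x)
mean, var = gpc_mean_var(a)
print(mean, var)   # exact values for X^2, X ~ N(0,1): mean 1, variance 2
```

For two inputs (gap size and pressure) the basis becomes a tensor product of one-dimensional polynomials matched to each input's distribution, but the projection-and-moments machinery is the same.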

16.
Non-deterministic lot-sizing models are considered which serve for an explicit determination of lot sizes in an uncertain environment. Taxonomy components for such models are suggested and a bibliography structured according to these components is presented. The taxonomy components are numeric characteristics of a lot-sizing problem, names of uncertain parameters and names of approaches to model the uncertainty. The bibliography covers more than 300 publications since the year 2000.

17.
To reasonably estimate, before product inspection, the effect of measurement uncertainty on the inspection results for batch products, the risks to both the supplier and the customer are quantified in terms of misjudgment rates, for both 100% inspection and sampling inspection. Misjudgment-risk models for 100% inspection are established on the basis of absolute probability and conditional probability, and from these a formula for the misjudgment rate in sampling inspection is derived. A worked example shows that the proposed models comprehensively reflect the misjudgment risk caused by measurement uncertainty. The misjudgment rate computed from the absolute-probability model can serve as a basis for selecting the measurement scheme for product inspection, while the rate computed from the conditional-probability model reflects the supplier's and customer's risks more directly and prompts inspectors to make conformity decisions more prudently.

18.
After the catastrophic disaster brought by Typhoon Morakot in 2009, enhancing flood-warning technology became an urgent need in Taiwan. In recent years, ensemble flood warning has exhibited advantages in extending lead time, quantifying uncertainty and raising confidence in issuing warnings. Unlike most ensembles, which aim to integrate meteorological variations, this study generates the ensemble by combining multiple conceptually different hydrological models, in order to avoid the bias possibly introduced by applying a single model to a flood forecast. Taking Typhoon Morakot as the study case, townships in Chiayi City/County are selected as the study areas to compare the performance of the ensemble warning with that of the individual models. The results indicate that the ensemble warning is more accurate than the individual models, giving higher overall correctness and revealing that a hydrological ensemble is no less important than a meteorological ensemble in achieving better flood-warning performance.
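One simple way to combine conceptually different hydrological models into a single warning decision is a threshold-exceedance vote. This is a minimal sketch of that idea only; the function, the majority rule and the stage values are hypothetical and not taken from the study.

```python
def ensemble_warning(forecasts, threshold, min_agree=None):
    """Issue a flood warning when enough member models agree.
    forecasts: predicted stage (m) from each hydrological model; a model
    'votes' for a warning when its forecast reaches the threshold."""
    votes = sum(1 for f in forecasts if f >= threshold)
    if min_agree is None:
        min_agree = len(forecasts) // 2 + 1    # simple majority rule
    return votes >= min_agree

# Three hypothetical model forecasts against a 4.5 m warning stage:
# two of three members exceed the threshold, so a warning is issued.
print(ensemble_warning([4.8, 4.6, 4.2], threshold=4.5))
```

The agreement level `min_agree` is the natural tuning knob: raising it trades fewer false alarms for more missed events, which is exactly the accuracy comparison the study performs between the ensemble and the individual models.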

19.
20.
Traditional flutter analysis models for aeroelastic systems are mostly established under deterministic parameters; when uncertain factors are present in the system, an aeroelastic system designed by deterministic methods carries a risk of flutter failure. Based on probabilistic and non-probabilistic interval models, a flutter reliability analysis model under single-source uncertainty is established. On this basis, for the flutter reliability analysis of aeroelastic systems containing both random and interval multi-source uncertain parameters, a novel hybrid ... based on a stepwise solution strategy is proposed.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号