Similar Documents
20 similar documents found (search time: 15 ms).
1.
The first motivation of this work is to take model uncertainty into account in sensitivity analysis (SA). We present, with some examples, a methodology to treat uncertainty due to a mutation of the studied model. Development of this methodology has highlighted an important problem frequently encountered in SA: how should sensitivity indices be interpreted when the random inputs are non-independent? This paper suggests a strategy for the SA of models with non-independent random inputs. We propose a new application of the multidimensional generalization of the classical sensitivity indices, resulting from group sensitivities (the sensitivity of the model output to a group of inputs), and describe an estimation method based on Monte Carlo simulations. Practical and theoretical applications illustrate the usefulness of this method.
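As a hedged illustration of the kind of group-sensitivity estimation the abstract describes (a standard pick-freeze Monte Carlo scheme, not necessarily the authors' exact estimator), the closed first-order Sobol index of a group of inputs can be sketched as:

```python
import numpy as np

def closed_sobol_index(model, sampler, group, dim, n=100_000, seed=0):
    """Pick-freeze Monte Carlo estimate of the closed first-order Sobol
    index of a group of inputs: S_u = Var(E[Y | X_u]) / Var(Y)."""
    rng = np.random.default_rng(seed)
    a = sampler(rng, n, dim)          # first independent input sample
    b = sampler(rng, n, dim)          # second independent input sample
    c = b.copy()
    c[:, group] = a[:, group]         # "freeze" the group columns of `a`
    ya, yc = model(a), model(c)
    # Cov(Y_a, Y_c) estimates Var(E[Y | X_u]) because the two runs share
    # exactly the inputs in `group`.
    return np.cov(ya, yc)[0, 1] / np.var(ya, ddof=1)

# Toy linear model Y = X1 + 2*X2 with independent standard normal inputs:
# the exact closed index of X2 is 4/5.
model = lambda x: x[:, 0] + 2.0 * x[:, 1]
sampler = lambda rng, n, d: rng.standard_normal((n, d))
s2 = closed_sobol_index(model, sampler, [1], dim=2)
```

The same call with `group=[0, 1]` returns the closed index of the whole group, which is how group sensitivities extend the one-dimensional indices.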

2.
Numerical simulators are widely used to model physical phenomena, and global sensitivity analysis (GSA) aims at studying the global impact of the input uncertainties on the simulator output. To perform GSA, statistical tools based on input/output dependence measures are commonly used. We focus here on the Hilbert–Schmidt independence criterion (HSIC). Sometimes, the probability distributions modeling the uncertainty of the inputs may themselves be uncertain, and it is important to quantify their impact on GSA results. We call this second-level global sensitivity analysis (GSA2). However, GSA2 performed with a Monte Carlo double loop requires a large number of model evaluations, which is intractable for CPU-time-expensive simulators. To cope with this limitation, we propose a new statistical methodology based on a Monte Carlo single loop with a limited calculation budget. First, we build a unique sample of inputs and simulator outputs from a well-chosen probability distribution of the inputs. From this sample, we perform GSA for various assumed probability distributions of the inputs by using weighted HSIC estimators, whose statistical properties are demonstrated. Subsequently, we define second-level HSIC-based measures between the distributions of the inputs and the GSA results, which constitute the GSA2 indices. The efficiency of our GSA2 methodology is illustrated on an analytical example, comparing several technical options. Finally, an application to a test case simulating a severe accident scenario on a nuclear reactor is provided.
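For readers unfamiliar with the dependence measure involved, the standard (biased) empirical HSIC estimator with Gaussian kernels can be sketched as follows; the bandwidths `sx` and `sy` are illustrative choices, and this is the generic estimator rather than the paper's weighted variant:

```python
import numpy as np

def hsic(x, y, sx=1.0, sy=1.0):
    """Biased empirical HSIC with Gaussian kernels:
    HSIC = trace(K H L H) / (n - 1)^2, with centering H = I - 11^T / n.
    Larger values indicate stronger input/output dependence."""
    n = len(x)
    K = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * sx ** 2))
    L = np.exp(-((y[:, None] - y[None, :]) ** 2) / (2 * sy ** 2))
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(0)
x = rng.standard_normal(300)
h_dep = hsic(x, x ** 2)                       # dependent pair
h_ind = hsic(x, rng.standard_normal(300))     # independent pair
```

Under independence the estimator concentrates near zero, which is what makes HSIC usable as a screening statistic in GSA.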

3.
Moment-independent regional sensitivity analysis (RSA) is a very useful tool for assessing the effect of a specific range of an individual input on the uncertainty of the model output, but performing RSA involves a large computational burden, which certainly limits its engineering application. The main tasks in performing RSA are to estimate the probability density function (PDF) of the model output and the joint PDF of the model output and the input variable by suitable techniques. First, a method based on the concepts of maximum entropy, fractional moments and sparse-grid integration is used to estimate the PDF of the model output. Second, the Nataf transformation is applied to obtain the joint PDF of the model output and the input variable. Finally, according to an integral transformation, the regional sensitivity indices can easily be computed by a Monte Carlo procedure without extra function evaluations. Because all the PDFs can be estimated with great efficiency and only a small number of function evaluations are involved in the whole process, the proposed method greatly decreases the computational burden. Several examples with explicit or implicit input–output relations demonstrate the accuracy and efficiency of the proposed method. Copyright © 2015 John Wiley & Sons, Ltd.

4.
This paper develops a novel computational framework to compute the Sobol indices that quantify the relative contributions of various uncertainty sources towards the system response prediction uncertainty. In the presence of both aleatory and epistemic uncertainty, two challenges are addressed in this paper for the model-based computation of the Sobol indices: due to data uncertainty, input distributions are not precisely known; and due to model uncertainty, the model output is uncertain even for a fixed realization of the input. An auxiliary variable method based on the probability integral transform is introduced to distinguish and represent each uncertainty source explicitly, whether aleatory or epistemic. The auxiliary variables facilitate building a deterministic relationship between the uncertainty sources and the output, which is needed in the Sobol indices computation. The proposed framework is developed for two types of model inputs: random variable input and time series input. A Bayesian autoregressive moving average (ARMA) approach is chosen to model the time series input due to its capability to represent both natural variability and epistemic uncertainty due to limited data. A novel controlled-seed computational technique based on pseudo-random number generation is proposed to efficiently represent the natural variability in the time series input. This controlled-seed method significantly accelerates the Sobol indices computation under time series input, and makes it computationally affordable.

5.
Global sensitivity analysis using polynomial chaos expansions (total citations: 13; self-citations: 0; citations by others: 13)
Global sensitivity analysis (SA) aims at quantifying the respective effects of input random variables (or combinations thereof) on the variance of the response of a physical or mathematical model. Among the abundant literature on sensitivity measures, the Sobol' indices have received much attention since they provide accurate information for most models. The paper introduces generalized polynomial chaos expansions (PCE) to build surrogate models that allow one to compute the Sobol' indices analytically as a post-processing of the PCE coefficients. The computational cost of the sensitivity indices thus practically reduces to that of estimating the PCE coefficients. An original non-intrusive regression-based approach is proposed, together with an experimental design of minimal size. Various application examples illustrate the approach, both from the field of global SA (i.e. well-known benchmark problems) and from the field of stochastic mechanics. The proposed method gives accurate results for various examples involving up to eight input random variables, at a computational cost 2–3 orders of magnitude smaller than the traditional Monte Carlo-based evaluation of the Sobol' indices.
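The post-processing step the abstract describes can be made concrete: assuming an orthonormal PCE basis, each Sobol' index is a ratio of sums of squared coefficients. The helper below and its multi-index layout are an illustrative sketch, not the paper's code:

```python
def pce_sobol(coeffs):
    """First-order and total Sobol' indices from the coefficients of a PCE
    in an orthonormal basis.  `coeffs` maps multi-index tuples (one entry
    per input, giving the polynomial degree in that input) to coefficients;
    the all-zero multi-index is the mean and carries no variance."""
    var = sum(c * c for a, c in coeffs.items() if any(a))
    dim = len(next(iter(coeffs)))
    first, total = [], []
    for i in range(dim):
        # First-order: terms involving ONLY input i.
        fi = sum(c * c for a, c in coeffs.items()
                 if a[i] and all(a[j] == 0 for j in range(dim) if j != i))
        # Total: every term involving input i at all.
        ti = sum(c * c for a, c in coeffs.items() if a[i])
        first.append(fi / var)
        total.append(ti / var)
    return first, total

# Hypothetical 2-input expansion Y = 1 + 2*H1(x1) + 3*H1(x2) + 6*H1(x1)H1(x2)
# in orthonormal Hermite polynomials: Var(Y) = 4 + 9 + 36 = 49.
coeffs = {(0, 0): 1.0, (1, 0): 2.0, (0, 1): 3.0, (1, 1): 6.0}
first, total = pce_sobol(coeffs)
```

Because the variance decomposition is exact in the orthonormal basis, no extra model runs are needed once the coefficients are known, which is the source of the cost reduction claimed above.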

6.
7.
Process capability indices such as Cp, Cpk, Cpmk and Cpm are widely used in manufacturing industries to provide a quantitative measure of product performance. In this article, we derive generalized confidence intervals for the difference between the process capability indices of two processes under a one-way random-effect model. As shown via simulation, our method provides coverage probability close to the nominal value in almost all cases. An example from an industrial context is given to illustrate the results. Copyright © 2012 John Wiley & Sons, Ltd.

8.
This paper compares Evidence Theory (ET) and Bayesian Theory (BT) for uncertainty modeling and decision under uncertainty, when the evidence about uncertainty is imprecise. The basic concepts of ET and BT are introduced and the ways these theories model uncertainties, propagate them through systems and assess the safety of these systems are presented. ET and BT approaches are demonstrated and compared on challenge problems involving an algebraic function whose input variables are uncertain. The evidence about the input variables consists of intervals provided by experts. It is recommended that a decision-maker compute both the Bayesian probabilities of the outcomes of alternative actions and their plausibility and belief measures when evidence about uncertainty is imprecise, because this helps assess the importance of imprecision and the value of additional information. Finally, the paper presents and demonstrates a method for testing approaches for decision under uncertainty in terms of their effectiveness in making decisions.

9.
10.
Non-probabilistic convex models require only the variation bounds of the parameters rather than their exact probability distributions; such models can therefore be applied to the uncertainty analysis of complex structures when experimental information is lacking. The interval model and the ellipsoidal model are the two most commonly used approaches in non-probabilistic convex modeling. However, the former can deal only with independent variables, while the latter can deal only with dependent variables. This paper presents a more general non-probabilistic convex model, the multidimensional parallelepiped model. This model includes independent and dependent uncertain variables in a unified framework and can effectively handle complex 'multi-source uncertainty' problems in which dependent and independent variables coexist. For any two parameters, the concepts of the correlation angle and the correlation coefficient are defined. From the marginal intervals of all the parameters and their correlation coefficients, a multidimensional parallelepiped can easily be built as the uncertainty domain for the parameters. Through the introduction of affine coordinates, the parallelepiped model in the original parameter space is converted to an interval model in the affine space, greatly facilitating subsequent structural uncertainty analysis. The parallelepiped model is applied to structural uncertainty propagation analysis, and the response interval of the structure is obtained in the case of uncertain initial parameters. Finally, the method is applied to several numerical examples. Copyright © 2015 John Wiley & Sons, Ltd.
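A minimal sketch of the affine-coordinate idea for two parameters follows. The shear construction below is a hypothetical illustration of how a correlation coefficient can skew an interval box into a parallelepiped; the paper's exact mapping via correlation angles differs:

```python
import numpy as np
from itertools import product

def parallelepiped_vertices(center, half_width, corr):
    """Vertices of a 2-D parallelepiped uncertainty domain built from
    marginal intervals (center +/- half_width) and a correlation
    coefficient `corr` in (-1, 1).  In affine coordinates s in [-1, 1]^2
    the domain is a plain interval box; a shear matrix maps it back to
    the original parameter space."""
    shear = np.array([[1.0, corr],
                      [corr, 1.0]])
    verts = []
    for s in product([-1.0, 1.0], repeat=2):
        verts.append(center + half_width * (shear @ np.array(s)))
    return np.array(verts)

# With independent parameters (corr = 0) the construction degenerates to
# the plain interval model: corners at (+/-1, +/-2).
box = parallelepiped_vertices(np.array([0.0, 0.0]), np.array([1.0, 2.0]), 0.0)
```

The practical point mirrors the abstract: uncertainty propagation is performed on the simple box in affine coordinates, and only the linear map depends on the correlations.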

11.
A technique to perform design calculations on imprecise representations of parameters using the calculus of fuzzy sets has previously been developed [25]. An analogous approach to representing and manipulating uncertainty in choosing among alternatives (design imprecision) using probability calculus is presented and compared with the fuzzy-calculus technique. Examples using both approaches are presented, representing a progression from simple operations to more complex design equations. Results of the fuzzy-sets and probability methods for the examples are shown graphically. We find that the fuzzy calculus is well suited to representing and manipulating the imprecision aspect of uncertainty in design, and that probability is best used to represent stochastic uncertainty.

12.
Simple algorithms are presented to compute both the exact probability of M or more out of N independent input events occurring, given unequal probabilities, and, when there is uncertainty in the input event probabilities, the associated variance in the M-out-of-N probability. The M-out-of-N probability algorithm runs in O(N²) time and O(N) space; the variance algorithm in O(N³) time and O(N²) space. The algorithms are not based on cut-set methodology and, consequently, are not limited by the combinatorial explosion associated with cut-set manipulation for the M-out-of-N gate. The algorithms are most useful when N exceeds the limitations of cut-set manipulation techniques or the M-out-of-N probability lies between 0.1 and 1, where approximate quantification methods are inaccurate. In addition, the M-out-of-N probability is extended to permit the quantification of standard sensitivity and uncertainty importance measures for both individual input events and groups of input events. Example calculations illustrate the capabilities of the algorithms.
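A standard Poisson-binomial dynamic program achieves exactly the stated O(N²) time and O(N) space bounds for the first computation; this is a generic illustration, not necessarily the paper's exact recursion:

```python
def m_out_of_n(probs, m):
    """Exact probability that at least `m` of the independent events with
    probabilities `probs` occur.  O(N^2) time, O(N) space: p[k] holds the
    probability of exactly k successes after each event is processed."""
    p = [1.0] + [0.0] * len(probs)
    for q in probs:
        for k in range(len(p) - 1, 0, -1):   # update in place, high k first
            p[k] = p[k] * (1 - q) + p[k - 1] * q
        p[0] *= (1 - q)
    return sum(p[m:])

# Three events with probability 0.5 each: P(at least 2) = 0.5.
at_least_2 = m_out_of_n([0.5, 0.5, 0.5], 2)
# Unequal probabilities: P(at least 1) = 1 - (0.9 * 0.8 * 0.7) = 0.496.
at_least_1 = m_out_of_n([0.1, 0.2, 0.3], 1)
```

Because the recursion never enumerates cut sets, its cost grows polynomially in N, which is the point the abstract makes about avoiding combinatorial explosion.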

13.
The ‘Epistemic Uncertainty Workshop’ sponsored by Sandia National Laboratories was held in Albuquerque, New Mexico, on 6–7 August 2002. The workshop was organized around a set of Challenge Problems involving both epistemic and aleatory uncertainty that the workshop participants were invited to solve and discuss. This concluding article in a special issue of Reliability Engineering and System Safety based on the workshop discusses the intent of the Challenge Problems, summarizes some discussions from the workshop, and provides a technical comparison among the papers in this special issue. The Challenge Problems were computationally simple models that were intended as vehicles for the illustration and comparison of conceptual and numerical techniques for use in analyses that involve: (i) epistemic uncertainty, (ii) aggregation of multiple characterizations of epistemic uncertainty, (iii) combination of epistemic and aleatory uncertainty, and (iv) models with repeated parameters. There was considerable diversity of opinion at the workshop about both methods and fundamental issues, and yet substantial consensus about what the answers to the problems were, and even about how each of the four issues should be addressed. Among the technical approaches advanced were probability theory, Dempster–Shafer evidence theory, random sets, sets of probability measures, imprecise coherent probabilities, coherent lower previsions, probability boxes, possibility theory, fuzzy sets, joint distribution tableaux, polynomial chaos expansions, and info-gap models. Although some participants maintained that a purely probabilistic approach is fully capable of accounting for all forms of uncertainty, most agreed that the treatment of epistemic uncertainty introduces important considerations and that the issues underlying the Challenge Problems are legitimate and significant. 
Topics identified as meriting additional research include elicitation of uncertainty representations, aggregation of multiple uncertainty representations, dependence and independence, model uncertainty, solution of black-box problems, efficient sampling strategies for computation, and communication of analysis results.

14.
15.
For a risk assessment model, the uncertainty in input parameters propagates through the model and leads to uncertainty in the model output. Studying how the uncertainty in the output of a model can be apportioned to the uncertainty in the model inputs is the job of sensitivity analysis. Saltelli [Sensitivity analysis for importance assessment. Risk Analysis 2002;22(3):579–90] pointed out that a good sensitivity indicator should be global, quantitative and model-free. Borgonovo [A new uncertainty importance measure. Reliability Engineering and System Safety 2007;92(6):771–84] extended these three requirements with a fourth feature, moment-independence, and proposed a new sensitivity measure, δi, which evaluates the influence of the input uncertainty on the entire output distribution without reference to any specific moment of the model output. In this paper, a new computational method for δi is proposed; it is conceptually simple and easy to implement. The feasibility of the new method is demonstrated by applying it to two examples.
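To make the definition concrete, here is a rough histogram-based estimator of δi = (1/2) E_Xi[∫ |f_Y(y) − f_Y|Xi(y)| dy]; this is one plausible brute-force sketch, not the computational method the paper proposes:

```python
import numpy as np

def delta_index(xi, y, n_xbins=10, n_ybins=30):
    """Histogram estimate of Borgonovo's delta for one input.  The integral
    of |density difference| over each y-bin is approximated by the absolute
    difference of bin probabilities; the outer expectation over Xi is
    approximated by partitioning xi into equal-probability bins."""
    edges = np.quantile(y, np.linspace(0, 1, n_ybins + 1))
    py, _ = np.histogram(y, bins=edges)
    py = py / len(y)
    xedges = np.quantile(xi, np.linspace(0, 1, n_xbins + 1))
    delta = 0.0
    for a, b in zip(xedges[:-1], xedges[1:]):
        mask = (xi >= a) & (xi < b) if b < xedges[-1] else (xi >= a)
        if mask.sum() == 0:
            continue
        pc, _ = np.histogram(y[mask], bins=edges)
        pc = pc / mask.sum()
        delta += (mask.sum() / len(y)) * 0.5 * np.abs(py - pc).sum()
    return delta

rng = np.random.default_rng(1)
x1, x2 = rng.standard_normal(20_000), rng.standard_normal(20_000)
y = x1 + 0.1 * x2          # x1 dominates the output
d1, d2 = delta_index(x1, y), delta_index(x2, y)
```

By construction the estimate lies in [0, 1], and an input that barely shifts the output distribution receives a value near zero, matching the moment-independent interpretation above.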

16.
A simple measure of uncertainty importance using the entire change of cumulative distribution functions (CDFs) has been developed for use in probabilistic safety assessments (PSAs). The entire change of CDFs is quantified in terms of the metric distance between two CDFs. The metric distance measure developed in this study reflects the relative impact of distributional changes of inputs on the change of the output distribution, while most existing uncertainty importance measures reflect the magnitude of the relative contribution of input uncertainties to the output uncertainty. The present measure has been evaluated analytically for various analytical distributions to examine its characteristics. To illustrate the applicability and strength of the present measure, two examples are provided: the first is an application to a typical system fault tree analysis problem, and the second concerns a hypothetical non-linear model. Comparisons of the present results with those obtained by existing uncertainty importance measures show that the metric distance measure is a useful tool for expressing uncertainty importance in terms of the relative impact of distributional changes of inputs on the change of the output distribution.
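One plausible realization of such a metric distance is an L2 distance between empirical CDFs (the paper's exact metric may differ; the shift example below is purely illustrative):

```python
import numpy as np

def cdf_metric_distance(y_base, y_changed):
    """L2 metric distance between two empirical CDFs, evaluated on the
    pooled sample grid: D = sqrt( sum (F1 - F2)^2 * dy )."""
    grid = np.sort(np.concatenate([y_base, y_changed]))
    F1 = np.searchsorted(np.sort(y_base), grid, side="right") / len(y_base)
    F2 = np.searchsorted(np.sort(y_changed), grid, side="right") / len(y_changed)
    dy = np.diff(grid)
    return float(np.sqrt(np.sum((F1[:-1] - F2[:-1]) ** 2 * dy)))

rng = np.random.default_rng(2)
base = rng.standard_normal(5_000)
# A larger change in an input distribution shifts the output CDF further,
# and the metric distance grows accordingly.
d_small = cdf_metric_distance(base, base + 0.1)
d_large = cdf_metric_distance(base, base + 1.0)
```

The distance is zero exactly when the two samples coincide, so ranking inputs by the distance their perturbation induces gives the importance ordering the abstract describes.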

17.
Multivariate capability analysis has been a focus of study in recent years, during which many authors have proposed different multivariate capability indices. In the operative context, capability indices are used as measures of the ability of a process to operate according to specifications. Because the numerical value of the index is used to draw conclusions about the capability of the process, it is essential to bear in mind that this value is almost always obtained from a sample of process units. It is therefore necessary to know the properties of the indices when they are calculated from sampling information, in order to assess the goodness of the inferences made from them. In this work, we conduct a simulation study to investigate the distributional properties of two existing indices: the NMCpm index, based on a ratio of volumes, and the Mp2 index, based on principal component analysis. We analyze the relative bias and mean square error of the estimators of the indices, and we also obtain their empirical distributions, which are used to estimate the probability that the indices correctly classify a process as capable or incapable. The results allow us to recommend the use of one of these indices, as it has shown better properties. Copyright © 2016 John Wiley & Sons, Ltd.

18.
An efficient method is proposed to estimate the first-order global sensitivity indices based on failure probability and variance, using maximum entropy theory and the Nataf transformation. The computational cost of the proposed method is quite small, and the method can efficiently overcome the 'curse of dimensionality' thanks to a dimension-reduction technique. Ideas for the estimation of higher-order sensitivity indices are discussed. Copyright © 2016 John Wiley & Sons, Ltd.

19.
A new uncertainty importance measure (total citations: 19; self-citations: 0; citations by others: 19)
Uncertainty in parameters is present in many risk assessment problems and leads to uncertainty in model predictions. In this work, we introduce a global sensitivity indicator that looks at the influence of input uncertainty on the entire output distribution, without reference to a specific moment of the output (moment independence), and that can also be defined in the presence of correlations among the parameters. We discuss its mathematical properties and highlight the differences between the present indicator, variance-based uncertainty importance measures and a moment-independent sensitivity indicator previously introduced in the literature. Numerical results are discussed with application to the probabilistic risk assessment model on which Iman [A matrix-based approach to uncertainty and sensitivity analysis for fault trees. Risk Anal 1987;7(1):22–33] first introduced uncertainty importance measures.

20.
Global sensitivity analysis is used to quantify the influence of uncertain model inputs on the response variability of a numerical model. The common quantitative methods are appropriate for computer codes with scalar model inputs. This paper illustrates different variance-based sensitivity analysis techniques, based on the so-called Sobol indices, when some model inputs are functional, such as stochastic processes or random spatial fields. We focus on computer codes with large CPU times, which require a preliminary metamodeling step before the sensitivity analysis can be performed. We propose the use of the joint modeling approach, i.e., modeling the mean and the dispersion of the code outputs simultaneously using two interlinked generalized linear models (GLMs) or generalized additive models (GAMs). The "mean model" allows the sensitivity indices of each scalar model input to be estimated, while the "dispersion model" allows the total sensitivity index of the functional model inputs to be derived. The proposed approach is compared with some classical sensitivity analysis methodologies on an analytical function. Lastly, the new methodology is applied to an industrial computer code that simulates nuclear fuel irradiation.

