Similar Literature
20 similar documents retrieved (search time: 31 ms)
1.
For a risk assessment model, the uncertainty in the input parameters is propagated through the model and leads to uncertainty in the model output. The study of how the uncertainty in the output of a model can be apportioned to the uncertainty in the model inputs is the task of sensitivity analysis. Saltelli [Sensitivity analysis for importance assessment. Risk Analysis 2002;22(3):579-90] pointed out that a good sensitivity indicator should be global, quantitative and model-free. Borgonovo [A new uncertainty importance measure. Reliability Engineering and System Safety 2007;92(6):771-84] extended these three requirements with a fourth, moment-independence, and proposed a new sensitivity measure, δi. It evaluates the influence of the input uncertainty on the entire output distribution without reference to any specific moment of the model output. In this paper, a new computational method for δi is proposed. It is conceptually simple and easy to implement. The feasibility of the new method is demonstrated by applying it to two examples.
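
A minimal sketch of the quantity in question, not the paper's computational method: δi is estimated by binning Xi and comparing a kernel density estimate of the unconditional output density with the conditional density in each bin. The test model, sample size, and bin count are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

def delta_i(x, y, n_bins=20, n_grid=512):
    """Estimate delta_i = 0.5 * E_Xi[ integral |f_Y - f_{Y|Xi}| dy ]."""
    grid = np.linspace(y.min(), y.max(), n_grid)
    f_y = gaussian_kde(y)(grid)                      # unconditional density of Y
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    shift = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (x >= lo) & (x <= hi)
        f_cond = gaussian_kde(y[mask])(grid)         # density of Y given Xi in bin
        # L1 distance between densities, weighted by P(Xi in bin)
        shift += np.trapz(np.abs(f_y - f_cond), grid) * mask.mean()
    return 0.5 * shift

rng = np.random.default_rng(0)
x1, x2 = rng.normal(size=(2, 20000))
y = x1 + 0.3 * x2**2                                  # hypothetical test model
print(delta_i(x1, y), delta_i(x2, y))
```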

2.
A simple measure of uncertainty importance using the entire change of cumulative distribution functions (CDFs) has been developed for use in probabilistic safety assessments (PSAs). The entire change of CDFs is quantified in terms of the metric distance between two CDFs. The metric distance measure developed in this study reflects the relative impact of distributional changes of inputs on the change of the output distribution, while most existing uncertainty importance measures reflect the magnitude of the relative contribution of input uncertainties to the output uncertainty. The present measure has been evaluated analytically for various analytical distributions to examine its characteristics. To illustrate the applicability and strength of the present measure, two examples are provided. The first is an application of the measure to a typical system fault tree analysis problem, and the second is a hypothetical non-linear model. Comparisons of the present results with those obtained by existing uncertainty importance measures show that the metric distance measure is a useful tool for expressing uncertainty importance in terms of the relative impact of distributional changes of inputs on the change of the output distribution.
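
A minimal sketch of the idea under assumed toy distributions: change one input's distribution (here, shrink its variance), and take a metric (here L2) distance between the base and changed empirical output CDFs. The paper's specific metric and distributional changes may differ.

```python
import numpy as np

def ecdf(sample, grid):
    return np.searchsorted(np.sort(sample), grid, side="right") / sample.size

def cdf_distance(y_base, y_changed, n_grid=1000):
    grid = np.linspace(min(y_base.min(), y_changed.min()),
                       max(y_base.max(), y_changed.max()), n_grid)
    f1, f2 = ecdf(y_base, grid), ecdf(y_changed, grid)
    return np.sqrt(np.trapz((f1 - f2) ** 2, grid))   # L2 distance between CDFs

rng = np.random.default_rng(1)
model = lambda x1, x2: x1 * x2                        # hypothetical model
n = 50000
y_base = model(rng.normal(1, 0.5, n), rng.normal(1, 0.5, n))
# shrink the uncertainty of one input at a time; the bigger the CDF shift,
# the more important the input's distributional change
y_x1 = model(rng.normal(1, 0.1, n), rng.normal(1, 0.5, n))
y_x2 = model(rng.normal(1, 0.5, n), rng.normal(1, 0.1, n))
print(cdf_distance(y_base, y_x1), cdf_distance(y_base, y_x2))
```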

3.
This paper discusses the application and results of global sensitivity analysis techniques for probabilistic safety assessment (PSA) models, and their comparison to importance measures. This comparison allows one to understand whether PSA elements that are important to the risk, as revealed by importance measures, are also important contributors to the model uncertainty, as revealed by global sensitivity analysis. We show that, due to epistemic dependence, uncertainty and global sensitivity analysis of PSA models must be performed at the parameter level. A difficulty arises, since standard codes produce the calculations at the basic event level. We discuss both the indirect comparison through importance measures computed for basic events, and the direct comparison performed using the differential importance measure and the Fussell–Vesely importance at the parameter level. Results are discussed for the large LOCA sequence of the Advanced Test Reactor PSA.

4.
5.
A cumulative distribution function (CDF)-based method has been used to perform sensitivity analysis on a computer model that conducts total-system performance assessment of the proposed high-level nuclear waste repository at Yucca Mountain, and to identify the input parameters with the most influence on the output of the model. The performance assessment computer model, referred to as the TPA code, was recently developed by the US Nuclear Regulatory Commission (NRC) and the Center for Nuclear Waste Regulatory Analyses (CNWRA) to evaluate the performance assessments conducted by the US Department of Energy (DOE) in support of their license application. The model uses a probabilistic framework implemented through Monte Carlo or Latin hypercube sampling (LHS) to permit the propagation of uncertainties associated with model parameters, conceptual models, and future system states. The problem involves more than 246 uncertain parameters (also referred to as random variables), of which the ones that have significant influence on the response or on the uncertainty of the response must be identified and ranked. The CDF-based approach identifies and ranks important parameters based on the sensitivity of the response CDF to the input parameter distributions. Based on a reliability sensitivity concept [AIAA Journal 32 (1994) 1717], the response CDF is defined as the integral of the joint probability density function of the input parameters, with a domain of integration defined by a subset of the samples. The sensitivity analysis does not require explicit knowledge of any specific relationship between the response and the input parameters, and the sensitivity depends on the magnitude of the response. The method allows sensitivity to be calculated over a wide range of the response and is not limited to the mean value.
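
A minimal sketch in the spirit of the approach, not the TPA analysis itself: the response CDF at a threshold is defined by the subset of samples with Y ≤ y0, and an input's influence at that response level is read from how much its distribution inside the subset shifts relative to the full sample. The toy model and the Kolmogorov-Smirnov statistic are illustrative assumptions.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(2)
n = 20000
X = rng.normal(size=(n, 3))
Y = 2 * X[:, 0] + X[:, 1] ** 2 + 0.1 * X[:, 2]       # hypothetical model

for q in (0.1, 0.5, 0.9):                             # several response levels
    y0 = np.quantile(Y, q)
    subset = Y <= y0                                  # samples defining the CDF value
    # distributional shift of each input inside the subset vs. overall
    ranks = [ks_2samp(X[subset, i], X[:, i]).statistic for i in range(3)]
    print(f"P(Y <= y0) = {q}:", np.round(ranks, 3))
```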

6.
In this paper we present a number of recent applications in which an emulator of a computer code is created using a Gaussian process model. Tools are then applied to the emulator to perform sensitivity analysis and uncertainty analysis. Sensitivity analysis is used both as an aid to model improvement and as a guide to how much the output uncertainty might be reduced by learning about specific inputs. Uncertainty analysis allows us to reflect output uncertainty due to unknown input parameters when the finished code is used for prediction. The computer codes themselves are currently being developed within the UK Centre for Terrestrial Carbon Dynamics.
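
A minimal sketch of the workflow with an assumed toy "code": fit a Gaussian process emulator on a small design, then run cheap Monte Carlo on the emulator to read off main-effect sensitivities. The kernel, design size, and binning estimator are illustrative choices, not those of the UK Centre for Terrestrial Carbon Dynamics codes.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def code(x):                                          # stands in for the slow simulator
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(3)
X_train = rng.uniform(0, 1, size=(60, 2))             # small design: code is "expensive"
gp = GaussianProcessRegressor(ConstantKernel() * RBF([0.2, 0.2]),
                              normalize_y=True).fit(X_train, code(X_train))

# cheap Monte Carlo on the emulator: variance of E[Y | X_i] as a main effect
X_mc = rng.uniform(0, 1, size=(100000, 2))
y_mc = gp.predict(X_mc)
for i in range(2):
    bins = np.digitize(X_mc[:, i], np.linspace(0, 1, 21))
    cond_means = [y_mc[bins == b].mean() for b in range(1, 21)]
    print(f"main effect of x{i}:", np.var(cond_means) / np.var(y_mc))
```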

7.
Uncertainty and sensitivity analysis for models with correlated parameters
When conducting sensitivity and uncertainty analysis, most global sensitivity techniques assume parameter independence. However, it is common for the parameters to be correlated with each other. For models with correlated inputs, we propose that the contribution of an individual parameter to the uncertainty in the model output be divided into two parts: the correlated contribution (by the correlated variations, i.e. variations of a parameter which are correlated with other parameters) and the uncorrelated contribution (by the uncorrelated variations, i.e. the unique variations of a parameter which cannot be explained by any other parameters). So far, only a few studies have been conducted to obtain sensitivity indices for models with correlated inputs, and these studies do not distinguish between the correlated and uncorrelated contributions of a parameter. In this study, we propose a regression-based method to quantitatively decompose the total uncertainty in the model output into partial variances contributed by the correlated variations and partial variances contributed by the uncorrelated variations. The proposed regression-based method is then applied in three test cases. Results show that the regression-based method can successfully measure the uncertainty contributions in cases where the relationship between response and parameters is approximately linear.
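
A minimal sketch of the decomposition for a linear toy model: regress Xi on the remaining inputs, use the residual (the unique variation of Xi) for the uncorrelated contribution, and take the difference from the full contribution as the correlated part. The R²-based indices below are a simplified stand-in for the paper's regression-based partial variances.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100000
cov = [[1.0, 0.8], [0.8, 1.0]]                        # x0 and x1 strongly correlated
X = rng.multivariate_normal([0, 0], cov, size=n)
Y = 2 * X[:, 0] + X[:, 1] + rng.normal(0, 0.1, n)     # hypothetical linear model

for i in range(2):
    others = np.delete(X, i, axis=1)
    A = np.column_stack([np.ones(n), others])
    coef, *_ = np.linalg.lstsq(A, X[:, i], rcond=None)
    resid = X[:, i] - A @ coef                        # unique variation of X_i
    total = np.corrcoef(X[:, i], Y)[0, 1] ** 2        # R^2: full contribution
    uncorr = np.corrcoef(resid, Y)[0, 1] ** 2         # R^2 of the unique part
    print(f"x{i}: total {total:.3f}, uncorrelated {uncorr:.3f}, "
          f"correlated {total - uncorr:.3f}")
```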

8.
This paper focuses on sensitivity analysis of results from computer models in which both epistemic and aleatory uncertainties are present. Sensitivity is defined in the sense of “uncertainty importance” in order to identify and to rank the principal sources of epistemic uncertainty. A natural and consistent way to arrive at sensitivity results in such cases would be a two-dimensional or double-loop nested Monte Carlo sampling strategy in which the epistemic parameters are sampled in the outer loop and the aleatory variables are sampled in the nested inner loop. However, the computational effort of this procedure may be prohibitive for complex and time-demanding codes. This paper therefore suggests an approximate method for sensitivity analysis based on particular one-dimensional or single-loop sampling procedures, which require substantially less computational effort. From the results of such sampling one can obtain approximate estimates of several standard uncertainty importance measures for the aleatory probability distributions and related probabilistic quantities of the model outcomes of interest. The reliability of the approximate sensitivity results depends on the effect of all epistemic uncertainties on the total joint epistemic and aleatory uncertainty of the outcome. The magnitude of this effect can be expressed quantitatively and estimated from the same single-loop samples. The higher it is the more accurate the approximate sensitivity results will be. A case study, which shows that the results from the proposed approximate method are comparable to those obtained with the full two-dimensional approach, is provided.
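
For reference, a minimal sketch of the full two-dimensional strategy that the paper seeks to approximate: epistemic parameters sampled in the outer loop, aleatory variables in the inner loop. The toy model, loop sizes, and the exceedance probability as outcome are illustrative assumptions; the paper's single-loop procedure avoids exactly this nesting.

```python
import numpy as np

rng = np.random.default_rng(5)
def model(theta, x):                                  # theta epistemic, x aleatory
    return theta * x

n_outer, n_inner, threshold = 200, 2000, 2.0
probs = []
for _ in range(n_outer):                              # outer loop: epistemic draw
    theta = rng.uniform(0.5, 1.5)                     # epistemically uncertain rate
    x = rng.exponential(1.0, n_inner)                 # inner loop: aleatory draws
    probs.append(np.mean(model(theta, x) > threshold))
probs = np.array(probs)
# epistemic uncertainty about the aleatory exceedance probability
print("mean", probs.mean(), "5-95% band", np.percentile(probs, [5, 95]))
```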

9.
Importance measures are integral parts of risk assessment for risk-informed decision making. Because the parameters of a risk model, such as the component failure rates, are functions of time and a perturbation (change) in their values can occur during the mission time, time dependence must be considered in the evaluation of the importance measures. In this paper, it is shown that the change in system performance at time t, and consequently the importance of the parameters at time t, depends on the parameters' perturbation time and their value functions during the system mission time. We consider a nonhomogeneous continuous-time Markov model of a series-parallel system to develop the mathematical proofs and simulations, while the ideas are also shown to be consistent with general models having non-exponential failure rates. Two new measures of importance and a simulation scheme for their computation are introduced to account for the effect of perturbation time and time-varying parameters.

10.
Uncertainty analysis (UA) is the process of quantitatively identifying and characterizing output uncertainty, and it has crucial implications in engineering applications. Efficient estimation of the moments of a structural output in probability space plays an important part in UA and has great engineering significance. With this in mind, a new UA method is proposed based on a Kriging surrogate model with closed-form expressions for the estimates of the output mean and variance. The closed-form expressions directly reflect the prediction uncertainty of the metamodel's output moments and thereby quantify the accuracy level of the estimates. The estimation can be completed by directly evaluating the closed-form expressions for the model's output mean and variance, avoiding extra post-processing costs and errors. Furthermore, a framework of adaptive Kriging estimation of the mean (AKEM) is presented to reduce the uncertainty in the moment estimates more efficiently. In the adaptive strategy of AKEM, a new learning function based on the closed-form expression is proposed. Because the closed-form expression corrects the computational error caused by metamodeling uncertainty, the proposed learning function enables the metamodel to be updated so as to reduce prediction uncertainty efficiently and lower computational cost. Several applications demonstrate the effectiveness and efficiency of AKEM compared with a universal adaptive Kriging method, and its good performance suggests its potential in engineering applications.
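
A minimal sketch of an adaptive-Kriging enrichment loop for an output mean, with a generic density-weighted predictive standard deviation as the learning function; the paper's closed-form expressions and its specific learning function are not reproduced here, and the performance function is an illustrative assumption.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def g(x):                                             # hypothetical performance function
    return np.sin(2 * x[:, 0]) + 0.2 * x[:, 1]

rng = np.random.default_rng(6)
X = rng.normal(size=(8, 2))                           # small initial design
cand = rng.normal(size=(5000, 2))                     # MC population (standard normal inputs)
w = norm.pdf(cand).prod(axis=1)                       # input density at candidates

for _ in range(20):                                   # adaptive enrichment loop
    gp = GaussianProcessRegressor(RBF(1.0), normalize_y=True).fit(X, g(X))
    mu, sd = gp.predict(cand, return_std=True)
    best = np.argmax(w * sd)                          # generic stand-in learning function
    X = np.vstack([X, cand[best]])                    # enrich the design

gp = GaussianProcessRegressor(RBF(1.0), normalize_y=True).fit(X, g(X))
print("Kriging mean estimate:", gp.predict(cand).mean(),
      "crude MC:", g(cand).mean())
```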

11.
An uncertainty-based sensitivity index represents the contribution that uncertainty in model input Xi makes to the uncertainty in model output Y. This paper addresses the situation where the uncertainties in the model inputs are expressed as closed convex sets of probability measures, a situation that exists when inputs are expressed as intervals or sets of intervals with no particular distribution specified over the intervals, or as probability distributions with interval-valued parameters. Three different approaches to measuring uncertainty, and hence uncertainty-based sensitivity, are explored. Variance-based sensitivity analysis (VBSA) estimates the contribution that each uncertain input, acting individually or in combination, makes to variance in the model output. The partial expected value of perfect information (partial EVPI), quantifies the (financial) value of learning the true numeric value of an input. For both of these sensitivity indices the generalization to closed convex sets of probability measures yields lower and upper sensitivity indices. Finally, the use of relative entropy as an uncertainty-based sensitivity index is introduced and extended to the imprecise setting, drawing upon recent work on entropy measures for imprecise information.
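
A minimal sketch of a partial EVPI computation in the precise (single-distribution) case, using a hypothetical two-action decision problem; the paper's imprecise extension would additionally optimize this index over the closed convex set of measures to obtain lower and upper bounds.

```python
import numpy as np

rng = np.random.default_rng(7)
n_outer, n_inner = 500, 2000

def utility(d, x1, x2):                               # hypothetical 2-action decision
    return np.where(d == 0, 0.0, x1 + x2 - 1.0)       # d=0: do nothing, d=1: act

# baseline: best decision under full uncertainty about (x1, x2)
x1, x2 = rng.normal(0.6, 1.0, n_inner), rng.normal(0.3, 1.0, n_inner)
base = max(utility(0, x1, x2).mean(), utility(1, x1, x2).mean())

gain = 0.0
for _ in range(n_outer):                              # outer loop: learn x1 exactly
    x1_known = rng.normal(0.6, 1.0)
    x2_draw = rng.normal(0.3, 1.0, n_inner)
    gain += max(utility(0, x1_known, x2_draw).mean(),
                utility(1, x1_known, x2_draw).mean())
print("partial EVPI for x1:", gain / n_outer - base)
```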

12.
A novel procedure for estimating the relative importance of the uncertain parameters of a complex FE model is presented. The method is specifically directed toward problems involving high-dimensional input parameter spaces, as encountered during uncertainty analysis of large-scale, refined FE models. In these cases one is commonly faced with thousands of uncertain parameters, and traditional techniques, e.g. finite-difference or direct-differentiation methods, become expensive. In contrast, the presented method quickly filters out the most influential variables. Hence, the main objective is not to compute the sensitivity but to identify those parameters whose random variations have the biggest influence on the response. This is achieved by generating a set of samples with direct Monte Carlo simulation, closely scattered around the point at which the relative importance measures are sought. From these samples, estimators of the relative importance are synthesized and the most important ones are refined with a method of choice. In this paper, both the underlying theory and the resulting algorithm are presented.
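
A minimal sketch of the filtering idea with an assumed linear toy response: scatter a modest number of Monte Carlo samples closely around the nominal parameter vector and rank the thousands of parameters by a simple correlation estimate of local influence, instead of differentiating parameter by parameter.

```python
import numpy as np

rng = np.random.default_rng(8)
n_params, n_samples = 1000, 400                       # many parameters, few model runs
weights = np.zeros(n_params)
weights[[3, 42, 777]] = [5.0, -3.0, 1.5]              # only a few really matter

def fe_response(theta):                               # stands in for the FE model
    return theta @ weights + 0.01 * rng.normal()      # small measurement-like noise

nominal = np.ones(n_params)
# samples closely scattered around the nominal point (1% coefficient of variation)
thetas = nominal * (1 + 0.01 * rng.normal(size=(n_samples, n_params)))
responses = np.array([fe_response(t) for t in thetas])

# correlation of each parameter with the response as a relative importance
scores = np.abs([np.corrcoef(thetas[:, j], responses)[0, 1]
                 for j in range(n_params)])
print("top parameters:", np.argsort(scores)[::-1][:5])
```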

13.
For the interpretation of the results of probabilistic risk assessments it is important to have measures which identify the basic events that contribute most to the frequency of the top event, but also to identify the basic events that are the main contributors to the uncertainty in this frequency. Both types of measures, often called Importance Measure and Measure of Uncertainty Importance, respectively, have been of interest to many researchers in the reliability field. The most frequent mode of uncertainty analysis in connection with probabilistic risk assessment has been to propagate the uncertainty of all model parameters up to an uncertainty distribution for the top event frequency. Various uncertainty importance measures have been proposed in order to point out the parameters that in some sense are the main contributors to the top event distribution. The new measure of uncertainty importance suggested here goes a step further in that it has been developed within a decision theory framework, thereby providing an indication of the basic event for which it would be most valuable, from a decision-making point of view, to procure more information.

14.
In the current quantification of fire probabilistic risk assessment (PRA), when components are damaged by a fire, the basic event values of those components are set to ‘true’ or one (1), which removes the corresponding basic events from the minimal cut sets and makes it difficult to calculate accurate component importance measures. A new method to accurately calculate an importance measure such as Fussell-Vesely in fire PRA is introduced in this paper. A new quantification algorithm for the fire PRA model is also proposed to support the new calculation method. The effectiveness of the new method in finding the importance measures is illustrated with an example evaluating the importance of cables.
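
A minimal sketch of the point being made, with hypothetical cut sets and probabilities: if a fire-damaged basic event's probability is set to 1.0 rather than the event being replaced by ‘true’, the event stays in the minimal cut sets and its Fussell-Vesely importance remains computable under the rare-event approximation.

```python
from math import prod

def top_freq(cutsets, p):
    # rare-event approximation: sum of minimal cut set probabilities
    return sum(prod(p[e] for e in cs) for cs in cutsets)

def fussell_vesely(event, cutsets, p):
    with_e = [cs for cs in cutsets if event in cs]    # cut sets containing the event
    return top_freq(with_e, p) / top_freq(cutsets, p)

# hypothetical minimal cut sets and basic event probabilities
cutsets = [{"PUMP_A", "CABLE_1"}, {"PUMP_B"}, {"CABLE_1", "VALVE"}]
p = {"PUMP_A": 1e-3, "PUMP_B": 1e-4, "CABLE_1": 1e-2, "VALVE": 5e-3}

p_fire = dict(p, CABLE_1=1.0)                         # cable damaged by the fire
for e in p:
    print(e, round(fussell_vesely(e, cutsets, p_fire), 4))
```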

15.
Many dynamic models are used for risk assessment and decision support in ecology and crop science. Such models generate time-dependent predictions, with time either discretised or continuous. Global sensitivity analysis is usually applied separately to each time output, but Campbell et al. (2006) advocated global sensitivity analyses on the expansion of the dynamics in a well-chosen functional basis. This paper focuses on the particular case where principal components analysis is combined with analysis of variance. In addition to the indices associated with the principal components, generalised sensitivity indices are proposed to synthesise the influence of each parameter on the whole time-series output. Index definitions are given for the cases where the uncertainty on the input factors is either discrete or continuous and where the dynamic model is either discrete or functional. A general estimation algorithm is proposed, based on classical methods of global sensitivity analysis. The method is applied to a dynamic wheat crop model with 13 uncertain parameters. Three methods of global sensitivity analysis are compared: the Sobol'-Saltelli method, the extended FAST method, and the fractional factorial design of resolution 6.
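
A minimal sketch of the PCA-plus-sensitivity construction on an assumed toy growth model with two parameters (the paper's wheat model has 13): expand the simulated time series on principal components, compute a first-order index per parameter and component, and aggregate with explained-variance weights into a generalised index. The binning estimator stands in for the paper's ANOVA-based methods.

```python
import numpy as np

rng = np.random.default_rng(9)
n, t = 5000, np.linspace(0, 10, 50)
theta = rng.uniform([0.5, 1.0], [1.5, 3.0], size=(n, 2))   # (rate, asymptote)
Y = theta[:, 1:2] * (1 - np.exp(-theta[:, 0:1] * t))       # dynamic toy model

Yc = Y - Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Yc, full_matrices=False)
scores, expl = U * s, s**2 / np.sum(s**2)                  # PC scores and weights

def first_order(x, y, n_bins=30):                          # Var(E[y|x]) / Var(y)
    bins = np.digitize(x, np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1]))
    means = np.array([y[bins == b].mean() for b in range(n_bins)])
    return np.var(means) / np.var(y)

for j in range(2):
    # generalised index: explained-variance-weighted sum over leading PCs
    gsi = sum(expl[k] * first_order(theta[:, j], scores[:, k]) for k in range(3))
    print(f"generalised index of theta_{j}: {gsi:.3f}")
```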

16.
Numerical simulators are widely used to model physical phenomena, and global sensitivity analysis (GSA) aims at studying the global impact of the input uncertainties on the simulator output. To perform GSA, statistical tools based on input/output dependence measures are commonly used. We focus here on the Hilbert–Schmidt independence criterion (HSIC). Sometimes, the probability distributions modeling the uncertainty of the inputs may themselves be uncertain, and it is important to quantify their impact on GSA results. We call this the second-level global sensitivity analysis (GSA2). However, GSA2, when performed with a Monte Carlo double loop, requires a large number of model evaluations, which is intractable for CPU-time-expensive simulators. To cope with this limitation, we propose a new statistical methodology based on a Monte Carlo single loop with a limited calculation budget. First, we build a unique sample of inputs and simulator outputs from a well-chosen probability distribution of the inputs. From this sample, we perform GSA for various assumed probability distributions of the inputs by using weighted HSIC measure estimators. Statistical properties of these weighted estimators are demonstrated. Subsequently, we define 2nd-level HSIC-based measures between the distributions of the inputs and the GSA results, which constitute the GSA2 indices. The efficiency of our GSA2 methodology is illustrated on an analytical example, thereby comparing several technical options. Finally, an application to a test case simulating a severe accidental scenario on a nuclear reactor is provided.
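
A minimal sketch of the (biased, V-statistic) HSIC estimator with Gaussian kernels and the median bandwidth heuristic, applied to a toy model; the paper's weighted estimators for second-level GSA would reuse one such sample with importance weights for alternative input distributions.

```python
import numpy as np

def gaussian_gram(v):
    d2 = (v[:, None] - v[None, :]) ** 2
    bw = np.median(d2[d2 > 0])                        # median heuristic bandwidth
    return np.exp(-d2 / bw)

def hsic(x, y):
    n = len(x)
    K, L = gaussian_gram(x), gaussian_gram(y)
    H = np.eye(n) - np.ones((n, n)) / n               # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2     # biased V-statistic estimator

rng = np.random.default_rng(10)
x1, x2 = rng.uniform(-1, 1, (2, 300))
y = np.sin(3 * x1) + 0.1 * x2                         # hypothetical model
print("HSIC(x1, Y):", hsic(x1, y), "HSIC(x2, Y):", hsic(x2, y))
```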

17.
Moment-independent regional sensitivity analysis (RSA) is a very useful guide for assessing the effect of a specific range of an individual input on the uncertainty of the model output, but a large computational burden is involved in performing RSA, which limits its engineering application. The main tasks in performing RSA are to estimate the probability density function (PDF) of the model output and the joint PDF of the model output and the input variable by suitable techniques. Firstly, a method based on the concepts of maximum entropy, fractional moments and sparse grid integration is utilized to estimate the PDF of the model output. Secondly, the Nataf transformation is applied to obtain the joint PDF of the model output and the input variable. Finally, according to an integral transformation, the regional sensitivity indices can be easily computed by a Monte Carlo procedure without extra function evaluations. Because all the PDFs can be estimated with great efficiency, and only a small number of function evaluations are involved in the whole process, the proposed method can greatly decrease the computational burden. Several examples with explicit or implicit input–output relations are introduced to demonstrate the accuracy and efficiency of the proposed method.
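
For orientation, a minimal sketch of the brute-force Monte Carlo version of a moment-independent regional sensitivity, which is precisely the cost the paper's maximum-entropy / fractional-moment machinery is designed to avoid: restrict one input to a sub-range and measure the L1 shift of the output density. The model and the restriction range are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(11)
n = 20000
X = rng.normal(size=(n, 2))
Y = X[:, 0] ** 2 + X[:, 1]                            # hypothetical model

grid = np.linspace(Y.min(), Y.max(), 400)
f_y = gaussian_kde(Y)(grid)                           # unconditional output density
for i in range(2):
    inside = np.abs(X[:, i]) < 1.0                    # restrict X_i to [-1, 1]
    f_reg = gaussian_kde(Y[inside])(grid)             # output density on the region
    print(f"regional shift for x{i}:",
          0.5 * np.trapz(np.abs(f_y - f_reg), grid))
```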

18.
We present two methods for the estimation of main effects in global sensitivity analysis. The methods adopt Satterthwaite's application of random balance designs in regression problems, and extend it to sensitivity analysis of model output for non-linear, non-additive models. Finite as well as infinite ranges for model input factors are allowed. The methods are easier to implement than any other method available for global sensitivity analysis, and reduce significantly the computational cost of the analysis. We test their performance on different test cases, including an international benchmark on safety assessment for nuclear waste disposal originally carried out by OECD/NEA.
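
A minimal sketch of the random balance design idea: drive all factors with the same base frequency but independent random permutations, re-order the output by one factor at a time, and estimate that factor's main effect from the low-harmonic share of the re-ordered signal's spectrum. The toy model, harmonic count, and sample size are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(12)
n, d, n_harm = 2001, 3, 6
s = np.linspace(-np.pi, np.pi, n, endpoint=False)
perms = [rng.permutation(n) for _ in range(d)]
# each factor follows the same triangle wave, randomly permuted, on [0, 1]
X = np.column_stack([0.5 + np.arcsin(np.sin(s[p])) / np.pi for p in perms])

Y = np.sin(2 * np.pi * X[:, 0]) + 5 * X[:, 1] ** 2 + 0.2 * X[:, 2]  # toy model

for j in range(d):
    y_ord = Y[np.argsort(perms[j])]                   # put X_j back in periodic order
    power = np.abs(np.fft.rfft(y_ord - Y.mean())) ** 2
    # main effect: share of spectral power in the first few harmonics
    print(f"S_{j} ~ {power[1:n_harm + 1].sum() / power[1:].sum():.3f}")
```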

19.
A predictive model is constructed for a radiative shock experiment, using a combination of a physics code and experimental measurements. The CRASH code can model the radiation hydrodynamics of the radiative shock launched by the ablation of a Be drive disk and driven down a tube filled with Xe. The code is initialized by a preprocessor that uses data from the Hyades code to model the initial 1.3 ns of the system evolution, with this data fit over seven input parameters by a Gaussian process model. The CRASH code output for shock location from 320 simulations is modeled by another Gaussian process model that combines the simulation data with eight field measurements of a CRASH experiment, and uses this joint model to construct a posterior distribution for the physical parameters of the simulation (model calibration). This model can then be used to explore sensitivity of the system to the input parameters. Comparison of the predicted shock locations in a set of leave-one-out exercises shows that the calibrated model can predict the shock location within experimental uncertainty.

20.
Several importance measures are identified for possible use in the performance assessment of a high-level nuclear waste repository. These importance measures are based on concepts of importance used in system reliability analysis, but the concepts are modified and adapted to the special characteristics of the repository and similar passive systems. In particular, the importance measures proposed here are based on risk (in comparison to traditional importance measures which are based on frequency of failure) and are intended to be more suitable to systems comprised of components whose behavior is most easily and naturally represented as continuous, rather than binary. These importance measures appear to be able to evaluate systems comprised of both continuous-behavior and binary-behavior components. Three separate examples are provided to illustrate the concepts and behavior of these importance measures. The first example demonstrates various formulations for the importance measures and their implementation for a simple radiation safety system comprised of a radiation source and three shields. The second example demonstrates use of these importance measures for a system comprised of components modeled with binary behavior and components modeled with continuous behavior. The third example investigates the use of these importance measures for a proposed repository system, using a total system model and code currently under development. Currently, these concepts and formulations of importance are undergoing further evaluation for a repository system to determine to what degree they provide useful insights and to determine which formulations are most useful.
