Similar Literature
20 similar documents found.
1.
A new importance measure for risk-informed decision making
In this paper, we introduce a new importance measure, the differential importance measure (DIM), for probabilistic safety assessment (PSA). DIM responds to the need of the analyst/decision maker to get information about the importance of proposed changes that affect component properties and multiple basic events. DIM is directly applicable to both the basic events and the parameters of the PSA model. Unlike the Fussell–Vesely (FV), risk achievement worth (RAW), Birnbaum, and criticality importance measures, DIM is additive, i.e. the DIM of groups of basic events or parameters is the sum of the individual DIMs. We discuss the difference between DIM and other local sensitivity measures that are based on normalized partial derivatives. An example is used to demonstrate the evaluation of DIM at both the basic event and the parameter level. To compare the results obtained with DIM at the parameter level, an extension of the definitions of FV and RAW is necessary. We discuss possible extensions and compare the results of the three measures for a more realistic example.
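The additivity property claimed for DIM can be sketched numerically. The snippet below uses a hypothetical two-term risk model R = p1*p2 + p3 (not one from the paper) and DIM under the equal-parameter-change hypothesis H1, where DIM reduces to normalized partial derivatives:

```python
import numpy as np

def dim_h1(grad):
    """Differential importance measure under hypothesis H1 (all parameters
    changed by the same small amount): DIM_i = (dR/dx_i) / sum_j (dR/dx_j).
    Additive by construction: the DIM of a group is the sum of its members'."""
    grad = np.asarray(grad, dtype=float)
    return grad / grad.sum()

# Hypothetical risk model for illustration: R = p1*p2 + p3
p1, p2, p3 = 0.1, 0.2, 0.05
grad = np.array([p2, p1, 1.0])   # analytic partial derivatives of R
dims = dim_h1(grad)

group_12 = dims[0] + dims[1]     # DIM of the group {p1, p2}
```

Because the indices are normalized to sum to one, the index of the group {p1, p2} is simply the sum of the individual indices, which is the additivity that FV and RAW lack.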

2.
A new uncertainty importance measure
Uncertainty in parameters is present in many risk assessment problems and leads to uncertainty in model predictions. In this work, we introduce a global sensitivity indicator which looks at the influence of input uncertainty on the entire output distribution without reference to a specific moment of the output (moment independence) and which can also be defined in the presence of correlations among the parameters. We discuss its mathematical properties and highlight the differences between the present indicator, variance-based uncertainty importance measures and a moment-independent sensitivity indicator previously introduced in the literature. Numerical results are discussed with application to the probabilistic risk assessment model on which Iman [A matrix-based approach to uncertainty and sensitivity analysis for fault trees. Risk Anal 1987;7(1):22–33] first introduced uncertainty importance measures.
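The moment-independent idea can be roughly illustrated as follows (a sketch, not the estimator of the paper): the indicator compares the unconditional output density with the densities obtained by conditioning on equal-probability slices of one input, so an inert input leaves the output distribution unchanged. The toy model below is an assumption for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def delta_hist(x, y, n_cond=10, bins=30):
    """Crude histogram estimate of a moment-independent indicator:
    delta = 0.5 * E_X[ integral |f_Y(y) - f_{Y|X}(y)| dy ],
    with conditioning on X approximated by equal-probability slices."""
    edges = np.histogram_bin_edges(y, bins=bins)
    widths = np.diff(edges)
    f_y, _ = np.histogram(y, bins=edges, density=True)
    qs = np.quantile(x, np.linspace(0.0, 1.0, n_cond + 1))
    shifts = []
    for lo, hi in zip(qs[:-1], qs[1:]):
        mask = (x >= lo) & (x <= hi)
        f_c, _ = np.histogram(y[mask], bins=edges, density=True)
        shifts.append(0.5 * np.sum(np.abs(f_y - f_c) * widths))
    return float(np.mean(shifts))

n = 20000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)            # inert input: y does not depend on it
y = x1 + 0.1 * rng.normal(size=n)  # toy model dominated by x1

d_influential = delta_hist(x1, y)
d_inert = delta_hist(x2, y)
```

The influential input shifts the conditional densities far from the unconditional one, while the inert input yields an indicator near zero (up to histogram noise), regardless of which output moment one might have looked at.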

3.
4.
Numerical simulators are widely used to model physical phenomena, and global sensitivity analysis (GSA) aims at studying the global impact of the input uncertainties on the simulator output. To perform GSA, statistical tools based on input/output dependence measures are commonly used. We focus here on the Hilbert–Schmidt independence criterion (HSIC). Sometimes, the probability distributions modeling the uncertainty of the inputs may themselves be uncertain, and it is important to quantify their impact on GSA results. We call this second-level global sensitivity analysis (GSA2). However, GSA2, when performed with a Monte Carlo double loop, requires a large number of model evaluations, which is intractable with CPU-time-expensive simulators. To cope with this limitation, we propose a new statistical methodology based on a Monte Carlo single loop with a limited calculation budget. First, we build a unique sample of inputs and simulator outputs from a well-chosen probability distribution of the inputs. From this sample, we perform GSA for various assumed probability distributions of the inputs by using weighted HSIC measure estimators. Statistical properties of these weighted estimators are demonstrated. Subsequently, we define second-level HSIC-based measures between the distributions of the inputs and the GSA results, which constitute the GSA2 indices. The efficiency of our GSA2 methodology is illustrated on an analytical example, comparing several technical options. Finally, an application to a test case simulating a severe accident scenario in a nuclear reactor is provided.
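A minimal sketch of the HSIC dependence measure this methodology builds on (the standard biased V-statistic estimator with Gaussian kernels; the paper's weighting scheme is not reproduced, and the bandwidth choice below is a simple stand-in for the usual median heuristic):

```python
import numpy as np

def rbf_gram(v, sigma):
    """Gaussian (RBF) Gram matrix of a 1-D sample."""
    d2 = (v[:, None] - v[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(x, y):
    """Biased V-statistic estimator: HSIC = trace(K H L H) / n^2,
    with H the centring matrix."""
    n = len(x)
    K = rbf_gram(x, np.std(x))
    L = rbf_gram(y, np.std(y))
    H = np.eye(n) - np.ones((n, n)) / n
    return float(np.trace(K @ H @ L @ H)) / n ** 2

rng = np.random.default_rng(1)
x1 = rng.uniform(-1.0, 1.0, 300)
x2 = rng.uniform(-1.0, 1.0, 300)
y = x1 ** 2                        # purely nonlinear dependence on x1

h_dep = hsic(x1, y)
h_indep = hsic(x2, y)
```

Note that the dependence y = x1² is invisible to correlation-based measures but is picked up by HSIC, which is what makes it attractive as a GSA screening tool.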

5.
A novel procedure for estimating the relative importance of uncertain parameters of a complex FE model is presented. The method is specifically directed toward problems involving high-dimensional input parameter spaces, as encountered during uncertainty analysis of large-scale, refined FE models. In these cases one is commonly faced with thousands of uncertain parameters, and traditional techniques, e.g. finite-difference or direct-differentiation methods, become expensive. In contrast, the presented method quickly filters out the most influential variables. Hence, the main objective is not to compute the sensitivity but to identify those parameters whose random variations have the biggest influence on the response. This is achieved by generating a set of samples with direct Monte Carlo simulation, closely scattered around the point at which the relative importance measures are sought. From these samples, estimators of the relative importance are synthesized, and the most important parameters are refined with a method of choice. In this paper, the underlying theory as well as the resulting algorithm is presented.

6.
7.
Many dynamic models are used for risk assessment and decision support in ecology and crop science. Such models generate time-dependent predictions, with time either discretised or continuous. Global sensitivity analysis is usually applied separately to each time output, but Campbell et al. (2006 [1]) advocated global sensitivity analyses on the expansion of the dynamics in a well-chosen functional basis. This paper focuses on the particular case where principal components analysis is combined with analysis of variance. In addition to the indices associated with the principal components, generalised sensitivity indices are proposed to synthesize the influence of each parameter on the whole time-series output. Index definitions are given when the uncertainty on the input factors is either discrete or continuous and when the dynamic model is either discrete or functional. A general estimation algorithm is proposed, based on classical methods of global sensitivity analysis. The method is applied to a dynamic wheat crop model with 13 uncertain parameters. Three methods of global sensitivity analysis are compared: the Sobol'–Saltelli method, the extended FAST method, and the fractional factorial design of resolution 6.
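The combination of principal components with variance-based indices can be sketched as follows. The dynamic model, parameter ranges and pick-freeze estimator below are illustrative assumptions, not the wheat crop model or the estimation algorithm of the paper; the generalised index is the PC-variance-weighted average of the per-component Sobol' indices:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 25)

def model(theta):
    """Hypothetical dynamic model: a 25-step time series driven by two
    uncertain parameters."""
    a, b = theta[:, 0], theta[:, 1]
    return a[:, None] * np.sin(2 * np.pi * t)[None, :] + b[:, None] * t[None, :]

n = 20000
A = rng.uniform(0.0, 1.0, (n, 2))
B = rng.uniform(0.0, 1.0, (n, 2))
Y = model(A)
mu = Y.mean(axis=0)
_, s, Vt = np.linalg.svd(Y - mu, full_matrices=False)   # PCA of the output
lam = s ** 2 / n                                        # variance per component

def sobol_on_pc(i, k):
    """Pick-freeze estimate of the first-order Sobol' index of input i
    on the score of principal component k."""
    C = B.copy()
    C[:, i] = A[:, i]                 # keep input i, resample the others
    f = (Y - mu) @ Vt[k]
    g = (model(C) - mu) @ Vt[k]
    return (np.mean(f * g) - np.mean(f) * np.mean(g)) / np.var(f)

K = int((lam > 1e-8 * lam[0]).sum())  # drop numerically-null components
gsi = [sum(lam[k] * sobol_on_pc(i, k) for k in range(K)) / lam[:K].sum()
       for i in range(2)]
```

For this additive toy model the two generalised indices sum to roughly one, and the sine-driven parameter carries the larger share of the time-series variance.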

8.
Complex chemical mechanisms are increasingly used within models describing a range of important chemical processes. Parameters describing the rates of chemical steps and thermodynamics may be highly uncertain, influencing the uncertainty in final model predictions. Local sensitivity analysis is traditionally employed within commercial modelling packages but may not be appropriate for highly uncertain data within non-linear models. There is a need for global uncertainty techniques, such as Morris and Monte Carlo methods, that can be applied efficiently to computationally expensive models. This paper presents the development of such techniques, along with application to a kinetic mechanism describing the influence of fuel trace elements, such as sulphur-containing compounds, on the formation of nitrogen oxide in combustion devices. The analysis evaluates the parameters within the current sulphur scheme that drive uncertainties in predicted relative changes in nitrogen oxide concentrations when sulphur compounds are added to the fuel. The overall performance of the mechanism is evaluated in comparison with available experimental profiles, and the level of agreement between different methods for importance ranking of the rate parameters is highlighted. The use of fitted model representations is also discussed as an alternative method for determining importance ranking, and highlights non-linear interactions between parameters. Finally, possible improvements to the chemical scheme are tested within a Monte Carlo framework under lean flame conditions, where the current mechanism performs least well with respect to experimental results.
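A minimal sketch of the Morris screening idea mentioned above (plain one-at-a-time elementary effects on a toy model; the paper's efficient implementation for expensive kinetic mechanisms is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(3)

def morris_mu_star(model, k, r=50, delta=0.25):
    """One-at-a-time elementary effects: for r random base points in [0,1]^k,
    EE_i = (f(x + delta*e_i) - f(x)) / delta; mu* = mean(|EE_i|) ranks
    the overall influence of each input."""
    ee = np.zeros((r, k))
    for j in range(r):
        x = rng.uniform(0.0, 1.0 - delta, k)
        fx = model(x)
        for i in range(k):
            xp = x.copy()
            xp[i] += delta
            ee[j, i] = (model(xp) - fx) / delta
    return np.abs(ee).mean(axis=0)

# toy model: strong linear effect of x0, weak effect of x1, inert x2
f = lambda x: 5.0 * x[0] + np.sin(x[1])
mu_star = morris_mu_star(f, k=3)
```

Each of the r trajectories costs k + 1 model runs, which is why Morris screening remains affordable for expensive simulators where full variance-based analysis is not.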

9.
Uncertainty and sensitivity analysis for models with correlated parameters
When conducting sensitivity and uncertainty analysis, most global sensitivity techniques assume parameter independence. However, it is common that the parameters are correlated with each other. For models with correlated inputs, we propose that the contribution of uncertainty to model output by an individual parameter be divided into two parts: the correlated contribution (by the correlated variations, i.e. variations of a parameter which are correlated with other parameters) and the uncorrelated contribution (by the uncorrelated variations, i.e. the unique variations of a parameter which cannot be explained by any other parameters). So far, only a few studies have been conducted to obtain the sensitivity index for a model with correlated inputs, and these studies do not distinguish between the correlated and uncorrelated contributions of a parameter. In this study, we propose a regression-based method to quantitatively decompose the total uncertainty in model output into partial variances contributed by the correlated variations and partial variances contributed by the uncorrelated variations. The proposed regression-based method is then applied to three test cases. Results show that the regression-based method can successfully measure the uncertainty contribution in the case where the relationship between response and parameters is approximately linear.
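The split into correlated and uncorrelated contributions can be sketched with ordinary least squares on a toy linear model (an illustrative reading of the regression-based idea, not the paper's exact method): the uncorrelated part of an input is the residual after regressing it on the other inputs, and each part's contribution is measured by its R² with the output.

```python
import numpy as np

rng = np.random.default_rng(4)

def r2(y, x):
    """R^2 of a simple linear regression of y on a single regressor x."""
    c = np.corrcoef(x, y)[0, 1]
    return c ** 2

n = 50000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + 0.6 * rng.normal(size=n)     # x2 correlated with x1
y = 2.0 * x1 + 1.0 * x2 + 0.1 * rng.normal(size=n)

# total linear contribution of x2, and the part carried by its
# uncorrelated variation (the residual of x2 regressed on x1)
s2_total = r2(y, x2)
resid = x2 - np.polyval(np.polyfit(x1, x2, 1), x1)
s2_uncorr = r2(y, resid)
s2_corr = s2_total - s2_uncorr
```

Here most of x2's apparent influence is inherited through its correlation with x1: the uncorrelated contribution is small even though the total R² of x2 is large.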

10.
Local and global uncertainty analysis of complex chemical kinetic systems
Computer modelling plays a crucial part in the understanding of complex chemical reactions. Parameters of elementary chemical and physical processes are usually determined in independent experiments and are always associated with uncertainties. Two typical examples of complex chemical kinetic systems are the combustion of gases and the photochemical processes in the atmosphere. In this study, local uncertainty analysis, the Morris method, and Monte Carlo analysis with Latin hypercube sampling were applied to an atmospheric and to a combustion model. These models had 45 and 37 variables along with 141 and 212 uncertain parameters, respectively. The toolkit used here consists of complementary methods and is able to map both the sources and the magnitudes of uncertainties. In the case of the combustion model, the global uncertainties of the local sensitivity coefficients were also investigated, and the order of parameter importance based on local sensitivities was found to be almost independent of the parameter values within their range of uncertainty.
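Latin hypercube sampling, used above for the Monte Carlo analysis, can be sketched in a few lines (a generic construction on the unit hypercube, not the study's toolkit):

```python
import numpy as np

rng = np.random.default_rng(5)

def latin_hypercube(n, k):
    """Latin hypercube sample on [0,1)^k: each dimension is cut into n
    equal-probability strata and each stratum receives exactly one point,
    with the stratum order shuffled independently per dimension."""
    sample = np.empty((n, k))
    for j in range(k):
        perm = rng.permutation(n)
        sample[:, j] = (perm + rng.uniform(size=n)) / n
    return sample

X = latin_hypercube(100, 2)
```

The stratification guarantees that every marginal range of every parameter is covered, which is why far fewer runs are needed than with plain random sampling of a 141- or 212-parameter model.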

11.
For a risk assessment model, the uncertainty in input parameters is propagated through the model and leads to uncertainty in the model output. The study of how the uncertainty in the output of a model can be apportioned to the uncertainty in the model inputs is the job of sensitivity analysis. Saltelli [Sensitivity analysis for importance assessment. Risk Analysis 2002;22(3):579–90] pointed out that a good sensitivity indicator should be global, quantitative and model free. Borgonovo [A new uncertainty importance measure. Reliability Engineering and System Safety 2007;92(6):771–84] extended these three requirements by adding a fourth feature, moment independence, and proposed a new sensitivity measure, δi, which evaluates the influence of the input uncertainty on the entire output distribution without reference to any specific moment of the model output. In this paper, a new computational method for δi is proposed. It is conceptually simple and easy to implement, and its feasibility is demonstrated by applying it to two examples.

12.

Global sensitivity analysis aims to quantify the importance of each input random variable of a structural system with respect to the uncertainty and risk level of the output response. It provides an important reference for subsequent reliability assessment, fault diagnosis, system design, prediction and optimization. Although various global sensitivity analysis methods continue to emerge, sensitivity analysis of high-dimensional complex structures (such as wind turbine blades) remains difficult. This paper investigates three possible computational forms of the space-partition global sensitivity analysis method and the optimal partition scheme. Through benchmark examples and error-theory derivation, a solution form and partition scheme are proposed that make full use of the sample information and effectively reduce the computational burden. The approach is applied to global sensitivity analysis of wind turbine blades under extreme load conditions, providing a reference for the future design of more efficient, economical and reliable wind turbine structures.

13.
One of the main challenges in the development of mathematical and computational models of biological systems is the precise estimation of parameter values. Understanding the effects of uncertainties in parameter values on model behaviour is crucial to the successful use of these models. Global sensitivity analysis (SA) can be used to quantify the variability in model predictions resulting from the uncertainty in multiple parameters and to shed light on the biological mechanisms driving system behaviour. We present a new methodology for global SA in systems biology which is computationally efficient and can be used to identify the key parameters and their interactions which drive the dynamic behaviour of a complex biological model. The approach combines functional principal component analysis with established global SA techniques. The methodology is applied to a model of the insulin signalling pathway, defects of which are a major cause of type 2 diabetes, and a number of key features of the system are identified.

14.
15.
Large and complex macro-micro coupled constitutive models, which describe metal flow and microstructure evolution during metal forming, are sometimes overparameterized with respect to given sets of experimental data. This results in poorly identifiable or non-identifiable model parameters. In this paper, a systematic parameter identification method for large macro-micro coupled constitutive models is proposed. The method is based on global and local identifiability analysis, in which two identifiability measures are adopted. The first measure accounts for the sensitivity of model results with respect to single parameters, and the second for the degree of near-linear dependence among the sensitivity functions of parameter subsets. The global identifiability analysis adopts a sampling strategy with only a limited number of model evaluations, combining Latin hypercube sampling, one-factor-at-a-time sampling and an elitism preservation strategy; the global identifiability index is the integral of the corresponding local index. A hybrid global optimization method is designed to identify the parameters: a genetic algorithm first produces a coarse estimate, which is then refined with an improved Levenberg-Marquardt algorithm. A niching method maintains population diversity and supplies the initial value for the Levenberg-Marquardt step, and a transition criterion between the two algorithms is proposed, based on the improvement in the average objective function value of the chromosomes and in the objective function value of the best chromosome.
During optimization with the Levenberg-Marquardt algorithm, local identifiability analysis is performed at the beginning of each iteration, and variables with poor identifiability are held fixed for that iteration; constraint violations are handled by adjusting the search step length. Finally, taking Ti-6Al-4V as an example, a set of satisfactory material parameters is obtained, and the calculated results agree well with the experimental results. The identified results show that some parameters of the model are poorly identifiable; at the same time, the identifiability analysis method can provide guidance for experiment design.

16.
For the interpretation of the results of probabilistic risk assessments it is important to have measures which identify the basic events that contribute most to the frequency of the top event, but also to identify basic events that are the main contributors to the uncertainty in this frequency. Both types of measures, often called Importance Measure and Measure of Uncertainty Importance, respectively, have been the subject of interest for many researchers in the reliability field. The most frequent mode of uncertainty analysis in connection with probabilistic risk assessment has been to propagate the uncertainty of all model parameters up to an uncertainty distribution for the top event frequency. Various uncertainty importance measures have been proposed in order to point out the parameters that in some sense are the main contributors to the top event distribution. The new measure of uncertainty importance suggested here goes a step further in that it has been developed within a decision theory framework, thereby indicating the basic event on which it would be most valuable, from the decision-making point of view, to procure more information.

17.
This paper focuses on sensitivity analysis of results from computer models in which both epistemic and aleatory uncertainties are present. Sensitivity is defined in the sense of "uncertainty importance" in order to identify and to rank the principal sources of epistemic uncertainty. A natural and consistent way to arrive at sensitivity results in such cases would be a two-dimensional or double-loop nested Monte Carlo sampling strategy in which the epistemic parameters are sampled in the outer loop and the aleatory variables are sampled in the nested inner loop. However, the computational effort of this procedure may be prohibitive for complex and time-demanding codes. This paper therefore suggests an approximate method for sensitivity analysis based on particular one-dimensional or single-loop sampling procedures, which require substantially less computational effort. From the results of such sampling one can obtain approximate estimates of several standard uncertainty importance measures for the aleatory probability distributions and related probabilistic quantities of the model outcomes of interest. The reliability of the approximate sensitivity results depends on the effect of all epistemic uncertainties on the total joint epistemic and aleatory uncertainty of the outcome. The magnitude of this effect can be expressed quantitatively and estimated from the same single-loop samples; the higher it is, the more accurate the approximate sensitivity results will be. A case study, which shows that the results from the proposed approximate method are comparable to those obtained with the full two-dimensional approach, is provided.

18.
The first motivation of this work is to take model uncertainty into account in sensitivity analysis (SA). We present, with some examples, a methodology to treat uncertainty due to a mutation of the studied model. Development of this methodology has highlighted an important problem, frequently encountered in SA: how should sensitivity indices be interpreted when the random inputs are non-independent? This paper suggests a strategy for SA of models with non-independent random inputs. We propose a new application of the multidimensional generalization of classical sensitivity indices, resulting from group sensitivities (the sensitivity of the model output to a group of inputs), and describe an estimation method based on Monte Carlo simulations. Practical and theoretical applications illustrate the value of this method.

19.
Directed site exploration for permeable reactive barrier design
Permeable reactive barriers (PRBs) are being employed for in situ remediation of groundwater that is typically flowing under natural gradients. Site characterization is of critical importance to the success of a PRB. A design-specific site exploration approach called quantitatively directed exploration (QDE) is presented. The QDE approach employs three spatially related matrices: (1) the covariance of input parameters, (2) the sensitivity of model outputs, and (3) the covariance of model outputs, to identify the most important location to explore for a specific design. Sampling at the location that most reduces overall site uncertainty produces a higher probability of success of a particular design. The QDE approach is demonstrated on the Kansas City Plant, Kansas City, MO, a case study where a PRB was installed and failed. It is shown that additional quantitatively directed site exploration during the design phase could have prevented the remedial failure, which was caused by missing a geologic body of high hydraulic conductivity at the south end of the barrier. The most-contributing-input-parameter approach, using head uncertainty, clearly indicated that the next sample should be taken toward the high-hydraulic-conductivity zone. This case study demonstrates the need to consider the specific design as well as site characterization uncertainty when choosing sampling locations.

20.
An integrated code system, SECOM-2, developed at the Japan Atomic Energy Research Institute (JAERI), has the following functions for systems reliability analysis in seismic probabilistic safety assessments (PSAs): (1) calculation of component failure probability; (2) extraction of minimal cut sets (MCSs) from a given fault tree (FT); (3) calculation of frequencies of accident sequences and core damage; (4) importance analysis with several measures, taking into account parameters unique to seismic PSAs; (5) sensitivity analysis; and (6) uncertainty analysis. This paper summarizes the special features of SECOM-2 for performing these analyses. At JAERI, using an integrated FT that represents seismically induced core damage due to all initiating events as a system model for calculating the core damage frequency of a nuclear power plant, SECOM-2 can calculate conditional point-estimate probabilities of system failures, losses of safety functions, and core damage as a function of earthquake motion. The point estimate is computed by a method which gives an exact numerical solution using the Boolean arithmetic model method. Regarding correlation of component failures, which has been an important issue in seismic PSAs, a new technique based on direct FT quantification by Monte Carlo simulation is being added to SECOM-2. With this technique, the core damage frequency can be calculated not only with the upper-bound approximation based on MCSs but also with a near-exact solution taking the correlation among all components into account. This paper also presents preliminary results of a seismic PSA of a generic BWR plant in Japan, performed at JAERI to demonstrate the functions of the SECOM-2 code.
