Similar Literature
 20 similar documents found (search time: 10 ms)
1.
2.
Global sensitivity analysis of complex numerical models can be performed by calculating variance-based importance measures of the input variables, such as the Sobol indices. However, these techniques require a large number of model evaluations and are often impractical for computationally expensive codes. A well-known and widely used solution consists in replacing the computer code with a metamodel that predicts the model responses with negligible computation time and renders the estimation of Sobol indices straightforward. In this paper, we discuss the Gaussian process model, which gives analytical expressions of the Sobol indices. Two approaches to computing the Sobol indices are studied: the first based on the predictor of the Gaussian process model and the second based on the global stochastic process model. Comparisons between the two estimates, made on analytical examples, show the superiority of the second approach in terms of convergence and robustness. Moreover, the second approach allows the modeling error of the Gaussian process model to be taken into account by directly providing confidence intervals on the Sobol indices. These techniques are finally applied to a real case of hydrogeological modeling.
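As a rough illustration of the first (predictor-based) route, the sketch below fits a Gaussian-process metamodel on a small design and then estimates first-order Sobol indices by Monte Carlo pick-freeze sampling on the cheap predictor. The scikit-learn GP, the Ishigami-type test function and all sample sizes are my own illustrative assumptions; the paper's analytical expressions and the stochastic-process-based estimate are not reproduced here.

```python
# Hedged sketch: first-order Sobol indices from a GP predictor (assumed setup).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def expensive_code(x):
    # Hypothetical stand-in for the time-expensive computer code (Ishigami-like).
    return np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1]) ** 2 + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0])

d, n_train, n_mc = 3, 80, 20_000
X_train = rng.uniform(-np.pi, np.pi, size=(n_train, d))
gp = GaussianProcessRegressor(kernel=RBF(length_scale=np.ones(d)), normalize_y=True)
gp.fit(X_train, expensive_code(X_train))             # metamodel replaces the code

A = rng.uniform(-np.pi, np.pi, size=(n_mc, d))       # pick-freeze sample matrices
B = rng.uniform(-np.pi, np.pi, size=(n_mc, d))
yA, yB = gp.predict(A), gp.predict(B)
var_y = np.var(np.concatenate([yA, yB]))
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                               # replace column i of A by that of B
    S_i = np.mean(yB * (gp.predict(ABi) - yA)) / var_y
    print(f"first-order Sobol index S_{i + 1} ~ {S_i:.2f}")
```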

3.
In this paper we propose and test a generalisation of the method originally proposed by Sobol', and recently extended by Saltelli, to estimate the first-order and total-effect sensitivity indices. By exploiting the symmetries and dualities of the formulas, we obtain additional estimates of the first-order and total indices at no extra computational cost. We test the technique on a case study involving the construction of a composite indicator of e-business readiness, which is part of the initiative “e-Readiness of European enterprises” of the European Commission “e-Europe 2005” action plan. The method is used to assess the contribution of uncertainties in (a) the weights of the component indicators and (b) the imputation of missing data to the composite indicator values for several European countries.
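For orientation, here is a minimal sketch of a Sobol'/Saltelli-style sampling design that yields both first-order and total-effect indices, with a second ("dual") estimate of each index obtained by swapping the roles of the two base matrices. The test function, sample sizes and the particular estimators are my own assumptions and only hint at the kind of symmetry the paper exploits; the paper's exact formulas and zero-extra-cost reuse are not reproduced.

```python
# Hedged sketch of a symmetric Saltelli-type design (generic estimators, assumed setup).
import numpy as np

rng = np.random.default_rng(1)
d, n = 3, 50_000

def model(x):
    # Hypothetical test function, not the e-business readiness indicator.
    return x[:, 0] + 2.0 * x[:, 1] * x[:, 2]

A = rng.uniform(size=(n, d))
B = rng.uniform(size=(n, d))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

for i in range(d):
    ABi, BAi = A.copy(), B.copy()
    ABi[:, i], BAi[:, i] = B[:, i], A[:, i]           # swap column i between the base matrices
    yABi, yBAi = model(ABi), model(BAi)
    # Two estimates of each index follow from the A/B symmetry of the design.
    S_i  = 0.5 * (np.mean(yB * (yABi - yA)) + np.mean(yA * (yBAi - yB))) / var_y
    ST_i = 0.5 * (np.mean((yA - yABi) ** 2) + np.mean((yB - yBAi) ** 2)) / (2.0 * var_y)
    print(f"S_{i + 1} ~ {S_i:.2f}, ST_{i + 1} ~ {ST_i:.2f}")
```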

4.
For a risk assessment model, the uncertainty in the input parameters is propagated through the model and leads to uncertainty in the model output. Studying how the uncertainty in the output of a model can be apportioned to the uncertainty in the model inputs is the task of sensitivity analysis. Saltelli [Sensitivity analysis for importance assessment. Risk Analysis 2002;22(3):579-90] pointed out that a good sensitivity indicator should be global, quantitative and model free. Borgonovo [A new uncertainty importance measure. Reliability Engineering and System Safety 2007;92(6):771-84] extended these three requirements by adding a fourth feature, moment independence, and proposed a new sensitivity measure, δi. It evaluates the influence of the input uncertainty on the entire output distribution without reference to any specific moment of the model output. In this paper, a new computational method for δi is proposed. It is conceptually simple and easy to implement. The feasibility of the new method is demonstrated by applying it to two examples.
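The definition δi = (1/2) E_Xi[ ∫ |f_Y(y) − f_Y|Xi(y)| dy ] can be estimated from a single Monte Carlo sample by conditioning on bins of Xi, as in the sketch below. This is a generic "given-data" style estimator written for illustration only, under my own assumptions; it is not claimed to be the computational method proposed in the paper.

```python
# Hedged sketch: a generic given-data estimator of Borgonovo's delta (assumed scheme).
import numpy as np
from scipy.stats import gaussian_kde

def delta_index(x_i, y, n_bins=20, n_grid=200):
    """Estimate delta_i = 0.5 * E_Xi[ integral |f_Y - f_{Y|Xi}| dy ] from one sample."""
    grid = np.linspace(y.min(), y.max(), n_grid)
    dy = grid[1] - grid[0]
    f_y = gaussian_kde(y)(grid)                        # unconditional output density
    order = np.argsort(x_i)
    delta = 0.0
    for chunk in np.array_split(order, n_bins):        # equal-probability slices of X_i
        f_cond = gaussian_kde(y[chunk])(grid)          # output density conditional on the slice
        l1 = np.sum(np.abs(f_y - f_cond)) * dy         # L1 distance between the two densities
        delta += (len(chunk) / len(y)) * 0.5 * l1
    return delta

# Hypothetical usage on a toy model (not one of the paper's two examples):
rng = np.random.default_rng(2)
X = rng.normal(size=(5000, 3))
y = X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=5000)
print([round(delta_index(X[:, i], y), 2) for i in range(3)])
```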

5.
Simplex-based screening designs for estimating metamodels (total citations: 1, self-citations: 0, citations by others: 1)
The screening method proposed by Morris in 1991 allows the important factors of a model to be identified, including those involved in interactions. This method, known as the elementary effects method, relies on a “one-factor-at-a-time” (OAT) design of experiments, i.e. two successive points differ in only one factor. In this article, we introduce a non-OAT simplex-based design for the elementary effects method. Its main advantage, compared to Morris's OAT design, is that the sample size does not collapse when the design is projected onto sub-spaces spanned by groups of factors. The use of this design to estimate a metamodel depending only on the (screened) important factors is discussed.
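For context, the sketch below computes elementary effects with a simplified radial OAT scheme and summarises them by the usual μ* and σ statistics. It illustrates the baseline elementary effects idea rather than Morris's exact trajectory construction or the non-OAT simplex design introduced in this article; the cost of r·(d+1) runs, the step size and the toy function are assumptions.

```python
# Hedged sketch: elementary effects via a simplified radial OAT design (assumed variant).
import numpy as np

def elementary_effects(model, d, r=30, delta=0.1, seed=0):
    """Return (mu_star, sigma) of the elementary effects on the unit hypercube."""
    rng = np.random.default_rng(seed)
    ee = np.empty((r, d))
    for k in range(r):
        base = rng.uniform(0.0, 1.0 - delta, size=d)   # leave room for a +delta step
        y0 = model(base)
        for i in range(d):
            x = base.copy()
            x[i] += delta                               # move one factor at a time
            ee[k, i] = (model(x) - y0) / delta
    return np.abs(ee).mean(axis=0), ee.std(axis=0)      # mu*: importance, sigma: nonlinearity/interaction

# Hypothetical usage on a toy function:
mu_star, sigma = elementary_effects(lambda x: x[0] + 10 * x[1] * x[2] + 0.1 * x[3], d=4)
print(np.round(mu_star, 2), np.round(sigma, 2))
```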

6.
A parametric sensitivity analysis is carried out on GASCON, a radiological impact software tool describing the transfer of radionuclides to humans following a chronic gaseous release from a nuclear facility. An effective dose received by each age group can thus be calculated for a specific radionuclide and release duration. In this study, we consider 18 output variables, each depending on approximately 50 uncertain input parameters. First, 1000 Monte Carlo simulations are generated to calculate correlation coefficients between input parameters and output variables, which give a first overview of the important factors. Response surfaces are then constructed in polynomial form and used to predict system responses at a much lower computational cost; these response surfaces are very useful for global sensitivity analysis, where thousands of runs are required. Using the response surfaces, we calculate the total Sobol sensitivity indices by the Monte Carlo method. We demonstrate the application of this method to one study site and one reference group near the Cadarache nuclear research centre (France), for two radionuclides: iodine-129 and uranium-238. It is shown that the most influential parameters are all related to the goat's-milk food chain; in decreasing order of importance: the “effective ingestion” dose coefficient, the goat's-milk ration of the individuals in the reference group, the grass ration of the goat, the dry deposition velocity, and the transfer factor to goat's milk.
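The first screening step described above (a Monte Carlo sample followed by input/output correlation coefficients) can be pictured with the sketch below. The model is a hypothetical stand-in for GASCON, and the Spearman rank correlation is one reasonable choice among several, not necessarily the coefficient used in the study.

```python
# Hedged sketch: first screening by rank correlation on a Monte Carlo sample (assumed setup).
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
n, d = 1000, 5                                         # 1000 runs, 5 illustrative inputs
X = rng.uniform(size=(n, d))
y = 3.0 * X[:, 0] + X[:, 1] ** 2 + 0.1 * rng.normal(size=n)   # stand-in for one output variable

rho = np.array([spearmanr(X[:, i], y)[0] for i in range(d)])  # rank correlation per input
for i in np.argsort(-np.abs(rho)):                     # rank the inputs by |correlation|
    print(f"input {i + 1}: Spearman rho = {rho[i]:+.2f}")
```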

7.
This article describes a finite element-based formulation for the statistical analysis of the response of stochastic structural composite systems whose material properties are described by random fields. A first-order technique is used to obtain the second-order statistics of the structural response, considering means and variances of the displacement and stress fields of plate or shell composite structures. The propagation of uncertainties depends on sensitivities, taken as measures of variation effects. The adjoint variable method is used to obtain the sensitivity matrix; this method is appropriate for composite structures because of the large number of random input parameters. Dominant effects on the stochastic characteristics are studied by analyzing the influence of the different random parameters. In particular, a study of the influence of anisotropy on the propagation of uncertainties in angle-ply composites is carried out based on the proposed approach.

8.
This paper, which extends Suphaphat Kwonpongsagoon's PhD thesis (2006), investigates a stationary model designed to evaluate substance flows in a case study of cadmium (Cd) in Australia. It covers the mining industry, the production and use of goods for agriculture, construction and households, as well as the environmental sectors of agriculture, surface water and landfills. The model is calibrated with Cd flow data obtained in a previous study, and the results of the calibrated model are consistent with those of studies from other countries. Possible measures and options to reduce Cd flows to the various environmental sectors are discussed by applying sensitivity analysis and parameter variations to the calibrated model. Using “agriculture”, one of the most important processes discussed in this paper, as an illustration, the results show that the most effective measures are the reduction of the Cd content of fertilizers and of atmospheric Cd deposition. It is concluded that a mathematical model is very useful for understanding a system that is crucial for environmental management.

9.
Variable screening and ranking using sampling-based sensitivity measures (total citations: 12, self-citations: 0, citations by others: 12)
This paper presents a methodology for screening out insignificant random variables and ranking the significant ones using sensitivity measures, including two cumulative distribution function (CDF)-based and two mean-response-based measures. The methodology features (1) the use of random samples to compute sensitivities and (2) the use of acceptance limits, derived from a test of hypothesis, to classify random variables as significant or insignificant. Because no approximation is needed in either the form of the performance functions or the type of continuous distribution functions representing the input variables, the sampling-based approach can handle highly nonlinear functions with non-normal variables. The main characteristics and effectiveness of the sampling-based sensitivity measures are investigated using both simple and complex examples. Because the number of samples needed does not depend on the number of variables, the methodology appears to be particularly suitable for problems with large, complex models that have many random variables but relatively few significant ones.
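One way to picture a sampling-based, CDF-type screening measure combined with a test-of-hypothesis acceptance limit is sketched below: the distribution of each input inside a critical response region is compared with its distribution outside that region using a two-sample Kolmogorov-Smirnov test. This is an illustrative construction under my own assumptions, not the specific measures or acceptance limits defined in the paper.

```python
# Hedged sketch: CDF-based screening of random variables with a KS hypothesis test (assumed).
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(4)
n, d = 5000, 8
X = rng.normal(size=(n, d))
g = X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.01 * X[:, 2]      # hypothetical performance function

crit = g > np.quantile(g, 0.9)                          # samples falling in the critical response region
for i in range(d):
    res = ks_2samp(X[crit, i], X[~crit, i])             # does this input's CDF shift in that region?
    label = "significant" if res.pvalue < 0.05 else "insignificant"
    print(f"x{i + 1}: KS = {res.statistic:.3f}, p = {res.pvalue:.2g} -> {label}")
```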

10.
The effect of the distribution type of uncertain inputs on the probabilistic assessment result of a system is illustrated. The tested systems include a linear function, a positive exponential function, a negative exponential function, a reciprocal function, and a proposed corrosion mechanism for a radwaste package. The four types of distributions analyzed are the uniform (U), log-uniform (LU), normal (N), and log-normal (LN) distributions. Latin hypercube sampling (LHS) was applied to draw samples of the uncertain inputs, and the resulting data sets were arranged to be uncorrelated before computation. The Fourier amplitude sensitivity test (FAST) was also applied to calculate the sensitivity index for each of the four distributions. From the safety assessment point of view, the results of this paper provide a rationale for choosing between the U and LU distribution types when the available data points are scarce. The FAST results indicate that the sensitivities of the four distributions are ordered S_U > S_LU > S_N > S_LN. This suggests a need to carefully identify whether the uncertain inputs follow a U distribution for the purpose of sensitivity analysis.

11.
This second part describes the application of the methodology for assessing the relative importance of uncertain structural parameters. The emphasis is on demonstrating that the proposed method can indeed handle large-scale problems relevant to industrial users. Four examples are included, of increasing complexity and degree of difficulty. While the first two are quite simple, tutorial-type problems, the remaining two deal with a large-scale application from aerospace engineering. The results demonstrate the remarkable efficiency of the method, even for problems with extremely large numbers of uncertain parameters.

12.
Sensitivity analysis screening methods aim to isolate the most important factors in experiments involving a large number of significant factors and interactions. This paper extends the one-factor-at-a-time screening method proposed by Morris. In addition to the ‘overall’ sensitivity measures already provided by the traditional Morris method, the new method offers estimates of the two-factor interaction effects. The number of model evaluations required is O(k²), where k is the number of model input factors. The efficient sampling strategy in the parameter space is based on concepts from graph theory and on the solution of the ‘handcuffed prisoner’ problem.
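A two-factor interaction effect can be pictured as a mixed finite difference, as in the hedged sketch below. This single-base-point illustration only conveys what such an interaction estimate measures and why the cost grows as O(k²); it does not reproduce the graph-theory-based sampling strategy of the paper, and the step size and toy function are assumptions.

```python
# Hedged sketch: two-factor interaction effects as mixed differences at one base point (assumed).
import numpy as np

def interaction_effects(model, d, delta=0.1, seed=0):
    """Return a (d, d) upper-triangular matrix of mixed-difference interaction estimates."""
    rng = np.random.default_rng(seed)
    x0 = rng.uniform(0.0, 1.0 - delta, size=d)
    f0 = model(x0)
    f_i = [model(x0 + delta * np.eye(d)[i]) for i in range(d)]    # one step per factor
    inter = np.zeros((d, d))
    for i in range(d):
        for j in range(i + 1, d):
            x_ij = x0 + delta * (np.eye(d)[i] + np.eye(d)[j])     # step both factors together
            inter[i, j] = (model(x_ij) - f_i[i] - f_i[j] + f0) / delta ** 2
    return inter                                                   # needs O(d^2) extra model runs

print(np.round(interaction_effects(lambda x: x[0] + 5 * x[1] * x[2], d=3), 2))
```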

13.
A novel procedure for estimating the relative importance of the uncertain parameters of a complex FE model is presented. The method is specifically directed toward problems involving high-dimensional input parameter spaces, as encountered in the uncertainty analysis of large-scale, refined FE models. In these cases one is commonly faced with thousands of uncertain parameters, and traditional techniques, e.g. finite difference or direct differentiation methods, become expensive. In contrast, the presented method quickly filters out the most influential variables. Hence, the main objective is not to compute the sensitivity itself but to identify those parameters whose random variations have the greatest influence on the response. This is achieved by generating, with direct Monte Carlo simulation, a set of samples closely scattered around the point at which the relative importance measures are sought. From these samples, estimators of the relative importance are synthesized, and the most important parameters are then refined with a method of choice. In this paper, the underlying theory as well as the resulting algorithm are presented.

14.
A cumulative distribution function (CDF)-based method has been used to perform sensitivity analysis on a computer model that conducts total system performance assessment of the proposed high-level nuclear waste repository at Yucca Mountain, and to identify the input parameters that most influence the output of the model. The performance assessment computer model, referred to as the TPA code, was recently developed by the US Nuclear Regulatory Commission (NRC) and the Center for Nuclear Waste Regulatory Analyses (CNWRA) to evaluate the performance assessments conducted by the US Department of Energy (DOE) in support of their license application. The model uses a probabilistic framework, implemented through Monte Carlo or Latin hypercube sampling (LHS), to propagate the uncertainties associated with model parameters, conceptual models, and future system states. The problem involves more than 246 uncertain parameters (also referred to as random variables), of which those that significantly influence the response, or the uncertainty of the response, must be identified and ranked. The CDF-based approach identifies and ranks important parameters based on the sensitivity of the response CDF to the input parameter distributions. Based on a reliability sensitivity concept [AIAA Journal 32 (1994) 1717], the response CDF is defined as the integral of the joint probability density function of the input parameters over a domain of integration defined by a subset of the samples. The sensitivity analysis does not require explicit knowledge of any specific relationship between the response and the input parameters, and the sensitivity depends on the magnitude of the response. The method allows sensitivity to be calculated over a wide range of the response and is not limited to the mean value.
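As a loose illustration of what "sensitivity of the response CDF to the input parameter distributions" can mean in a sampling framework, the sketch below uses a score-function (likelihood-ratio) estimator of the derivative of P(g ≤ y) with respect to the mean of each normally distributed input, evaluated at several response levels. This is a generic construction under my own assumptions and is not the specific reliability-sensitivity formulation used with the TPA code.

```python
# Hedged sketch: score-function estimate of dF_g(y)/d(mu_i) from one Monte Carlo sample (assumed).
import numpy as np

rng = np.random.default_rng(5)
n, d = 20_000, 4
mu, sigma = np.zeros(d), np.ones(d)                    # illustrative normal inputs
X = rng.normal(mu, sigma, size=(n, d))
g = X @ np.array([1.0, 0.8, 0.3, 0.05])                # hypothetical response, not the TPA model

for q in (0.1, 0.5, 0.9):                              # several levels across the response CDF
    y = np.quantile(g, q)
    ind = (g <= y).astype(float)                       # indicator defining F_g(y)
    score = (X - mu) / sigma ** 2                      # d ln f(x; mu) / d mu for normal inputs
    sens = (ind[:, None] * score).mean(axis=0)         # likelihood-ratio estimate of dF/d mu_i
    print(f"y = {y:+.2f}: dF/dmu = {np.round(sens, 3)}")
```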

15.
There are difficulties with probability as a representation of uncertainty. However, we argue that there is an important distinction between principle and practice. In principle, probability is uniquely appropriate for the representation and quantification of all forms of uncertainty; it is in this sense that we claim that ‘probability is perfect’. In practice, people find it difficult to express their knowledge and beliefs in probabilistic form, so that elicitation of probability distributions is a far from perfect process. We therefore argue that there is no need for alternative theories, but that any practical elicitation of expert knowledge must fully acknowledge the imprecision in the resulting distribution. We outline a recently developed Bayesian technique that allows the imprecision in elicitation to be formulated explicitly, and apply it to some of the challenge problems.

16.
The New Morris Method was proposed by Campolongo and Braddock [Reliab. Engng Syst. Saf. 64 (1999) 1] as an extension of the Morris Method [Technometrics 33 (1991) 161] to include estimation of two-factor interaction effects. An undetected programming error prevented Campolongo and Braddock from appreciating the efficacy of the method. Testing on an analytic function reveals that the method is more powerful and efficient than previously thought.

17.
An overview is first given of current biomechanical problems resulting from inadequate models of the human skeletal, muscular, and neural subsystems. The application areas most affected by these deficiencies are clinical gait and general motion analysis, orthopaedics, skeletal muscle research, and all disciplines requiring computer-simulated responses of human body models under various conditions. The fundamentals of neuromyoskeletal system modeling techniques are then discussed, and a scheme for successfully creating and validating a complex model is suggested. Finally, a short selection of currently used large-scale neuromyoskeletal models and their features is presented, together with anticipated future directions in large-scale neuromyoskeletal modeling.

18.
This paper focuses on sensitivity analysis of results from computer models in which both epistemic and aleatory uncertainties are present. Sensitivity is defined in the sense of “uncertainty importance” in order to identify and rank the principal sources of epistemic uncertainty. A natural and consistent way to arrive at sensitivity results in such cases would be a two-dimensional or double-loop nested Monte Carlo sampling strategy in which the epistemic parameters are sampled in the outer loop and the aleatory variables are sampled in the nested inner loop. However, the computational effort of this procedure may be prohibitive for complex and time-demanding codes. This paper therefore suggests an approximate method for sensitivity analysis based on particular one-dimensional or single-loop sampling procedures, which require substantially less computational effort. From the results of such sampling one can obtain approximate estimates of several standard uncertainty importance measures for the aleatory probability distributions and related probabilistic quantities of the model outcomes of interest. The reliability of the approximate sensitivity results depends on the effect of all epistemic uncertainties on the total joint epistemic and aleatory uncertainty of the outcome. The magnitude of this effect can be expressed quantitatively and estimated from the same single-loop samples; the higher it is, the more accurate the approximate sensitivity results will be. A case study is provided which shows that the results of the proposed approximate method are comparable to those obtained with the full two-dimensional approach.
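The reference two-dimensional (double-loop) scheme that the paper seeks to avoid can be sketched as follows. The toy model, parameter ranges and sample sizes are assumptions chosen only to show the n_epistemic × n_aleatory cost structure; the paper's single-loop approximation itself is not reproduced here.

```python
# Hedged sketch: double-loop nested Monte Carlo for mixed epistemic/aleatory uncertainty (assumed).
import numpy as np

rng = np.random.default_rng(6)

def code(theta, x):
    # Hypothetical outcome: theta are epistemic parameters, x are aleatory variables.
    return theta[0] * x[0] + theta[1] * x[1] ** 2

n_epi, n_ale = 100, 1000
means = []
for _ in range(n_epi):                                 # outer loop: sample epistemic parameters
    theta = rng.uniform([0.5, 0.0], [1.5, 2.0])
    inner = [code(theta, rng.normal(size=2)) for _ in range(n_ale)]   # inner loop: aleatory variables
    means.append(np.mean(inner))                       # e.g. the aleatory expectation of the outcome

# The spread of `means` expresses the epistemic uncertainty about that expectation;
# the total cost is n_epi * n_ale code runs, which motivates the single-loop alternative.
print(np.percentile(means, [5, 50, 95]))
```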

19.
Planning is important in the process of engineering reliability growth. Planning models and the related curves are always influenced by their intrinsic parameters; however, there has been little research on the relationship between planning models and their inherent parameters. This study examines the relationships between the planning model based on projection methodology (PM2 model) and its four inherent parameters. The results show that all four parameters have a significant influence on the reliability growth planning curve, and they reveal that some parameters have minimum limits. By prejudging the parameters of the model, the risk of not achieving the test target can be reduced. This study provides guidance for future engineering planning and management.

20.
An important problem in the analysis of computer experiments is the specification of the uncertainty of predictions made with a meta-model. The Bayesian approach, developed for the uncertainty analysis of deterministic computer models, expresses uncertainty through the use of a Gaussian process. There are several versions of the Bayesian approach, which differ in many regards, but all of them lead to time-consuming computations for large data sets. In the present paper we introduce a new approach in which the distribution of uncertainty is obtained in a general nonparametric form. The proposed approach, called non-parametric uncertainty analysis (NPUA), is computationally simple since it combines generic sampling and regression techniques. We compare NPUA with the Bayesian and Kriging approaches and show the advantages of NPUA for selecting the points for subsequent runs by reanalyzing the ASET model.
