Similar Documents (20 results)
1.
Global sensitivity analysis of complex numerical models can be performed by calculating variance-based importance measures of the input variables, such as the Sobol indices. However, these techniques require a large number of model evaluations and are often impractical for computationally expensive codes. A well-known and widely used remedy is to replace the computer code with a metamodel that predicts the model responses at negligible computational cost, making the estimation of Sobol indices straightforward. In this paper, we discuss the Gaussian process model, which yields analytical expressions for the Sobol indices. Two approaches to computing the Sobol indices are studied: the first is based on the predictor of the Gaussian process model, the second on the global stochastic process model. Comparisons between the two estimates on analytical examples show the superiority of the second approach in terms of convergence and robustness. Moreover, the second approach accounts for the modeling error of the Gaussian process model by directly providing confidence intervals on the Sobol indices. These techniques are finally applied to a real case of hydrogeological modeling.
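For orientation, here is a minimal sketch of first-order Sobol index estimation with the standard Monte Carlo pick-freeze estimator, applied to the Ishigami benchmark. This is the plain sampling route that the Gaussian process metamodel in the abstract is meant to accelerate, not the paper's analytic GP expressions; the test function and sample size are illustrative assumptions.

```python
import numpy as np

def ishigami(x):  # common analytic benchmark for Sobol indices
    return (np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1])**2
            + 0.1 * x[:, 2]**4 * np.sin(x[:, 0]))

rng = np.random.default_rng(0)
n, d = 100_000, 3
A = rng.uniform(-np.pi, np.pi, (n, d))   # two independent sample matrices
B = rng.uniform(-np.pi, np.pi, (n, d))
fA, fB = ishigami(A), ishigami(B)
var = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                  # "pick" column i, "freeze" the rest
    fABi = ishigami(ABi)
    S_i = np.mean(fB * (fABi - fA)) / var   # first-order Sobol index
    print(f"S_{i+1} ~= {S_i:.3f}")
```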

2.
3.
For a risk assessment model, the uncertainty in the input parameters is propagated through the model and leads to uncertainty in the model output. Studying how the uncertainty in the output of a model can be apportioned to the uncertainty in the model inputs is the task of sensitivity analysis. Saltelli [Sensitivity analysis for importance assessment. Risk Analysis 2002;22(3):579-90] pointed out that a good sensitivity indicator should be global, quantitative and model free. Borgonovo [A new uncertainty importance measure. Reliability Engineering and System Safety 2007;92(6):771-84] extended these three requirements with a fourth feature, moment independence, and proposed a new sensitivity measure, δi. It evaluates the influence of the input uncertainty on the entire output distribution without reference to any specific moment of the model output. In this paper, a new computational method for δi is proposed. It is conceptually simple and easier to implement. The feasibility of the new method is demonstrated by applying it to two examples.
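A minimal sketch of the moment-independent measure δi follows: the output density conditional on Xi is compared with the unconditional density, and the average L1 shift is halved. The histogram-based density estimates and bin counts here are illustrative assumptions, not the computational method proposed in the paper.

```python
import numpy as np

def delta_index(x_i, y, n_cond=20, n_bins=50):
    edges = np.quantile(x_i, np.linspace(0, 1, n_cond + 1))
    y_bins = np.linspace(y.min(), y.max(), n_bins + 1)
    f_y, _ = np.histogram(y, bins=y_bins, density=True)  # unconditional density
    width = np.diff(y_bins)
    shift = 0.0
    for k in range(n_cond):
        mask = (x_i >= edges[k]) & (x_i <= edges[k + 1])
        f_cond, _ = np.histogram(y[mask], bins=y_bins, density=True)
        # probability-weighted L1 distance between conditional and unconditional densities
        shift += mask.mean() * np.sum(np.abs(f_cond - f_y) * width)
    return 0.5 * shift  # delta_i lies in [0, 1]

rng = np.random.default_rng(1)
X = rng.uniform(-np.pi, np.pi, (50_000, 3))
Y = np.sin(X[:, 0]) + 7 * np.sin(X[:, 1])**2 + 0.1 * X[:, 2]**4 * np.sin(X[:, 0])
for i in range(3):
    print(f"delta_{i+1} ~= {delta_index(X[:, i], Y):.3f}")
```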

4.
Simplex-based screening designs for estimating metamodels
The screening method proposed by Morris in 1991 identifies the important factors of a model, including those involved in interactions. This method, known as the elementary effects method, relies on a "one-factor-at-a-time" (OAT) design of experiments, i.e. two successive points differ in only one factor. In this article, we introduce a non-OAT simplex-based design for the elementary effects method. Its main advantage over Morris's OAT design is that the sample size does not collapse when the design is projected onto sub-spaces spanned by groups of factors. The use of this design to estimate a metamodel depending only on the (screened) important factors is discussed.
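For reference, a minimal sketch of the classical OAT elementary effects method that the simplex design replaces: each trajectory changes one factor at a time by a step delta, and factors are ranked by the mean absolute effect mu* and its spread sigma. The grid level p, trajectory count r, and the upward-only steps are simplifying assumptions.

```python
import numpy as np

def morris_trajectory(k, p, rng):
    delta = p / (2 * (p - 1))
    # start on the lower half of the grid so base + delta stays in [0, 1]
    base = rng.integers(0, p // 2, k) / (p - 1)
    traj = [base.copy()]
    for i in rng.permutation(k):       # change one factor at a time
        base = base.copy()
        base[i] += delta
        traj.append(base)
    return np.array(traj), delta

def elementary_effects(f, k=3, p=4, r=20, seed=0):
    rng = np.random.default_rng(seed)
    effects = {i: [] for i in range(k)}
    for _ in range(r):
        traj, delta = morris_trajectory(k, p, rng)
        y = f(traj)
        for step in range(k):
            changed = int(np.nonzero(traj[step + 1] - traj[step])[0][0])
            effects[changed].append((y[step + 1] - y[step]) / delta)
    return {i: (np.mean(np.abs(e)), np.std(e)) for i, e in effects.items()}

f = lambda x: x[:, 0] + 2 * x[:, 1]**2 + 0.1 * x[:, 2]
print(elementary_effects(f))   # (mu*, sigma) per factor
```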

5.
This article describes a finite element-based formulation for the statistical analysis of the response of stochastic structural composite systems whose material properties are described by random fields. A first-order technique is used to obtain the second-order statistics of the structural response, considering means and variances of the displacement and stress fields of plate or shell composite structures. The propagation of uncertainties depends on sensitivities, taken as measures of variation effects. The adjoint variable method is used to obtain the sensitivity matrix; this method is appropriate for composite structures because of the large number of random input parameters. Dominant effects on the stochastic characteristics are studied by analyzing the influence of different random parameters. In particular, a study of the influence of anisotropy on the propagation of uncertainties in angle-ply composites is carried out based on the proposed approach.
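The first-order idea can be condensed into a short sketch: propagate means and covariances of the random inputs through a response via its sensitivity (gradient) vector. The toy plate-deflection response, the input statistics, and the finite-difference sensitivities below are illustrative assumptions; the article uses an adjoint formulation within a full FE model.

```python
import numpy as np

# toy response g(E, t): deflection of a plate scales like q / (E * t^3)
def g(x):
    E, t = x
    return 1.0e3 / (E * t**3)

mu = np.array([70e9, 0.01])              # mean Young's modulus [Pa], thickness [m]
C = np.diag([(7e9)**2, (5e-4)**2])       # input covariance (10% and 5% std. dev.)

# finite-difference sensitivities (stand-in for the adjoint variable method)
h = mu * 1e-6
grad = np.array([(g(mu + h * e) - g(mu - h * e)) / (2 * h @ e)
                 for e in np.eye(2)])

mean_y = g(mu)               # first-order estimate of the response mean
var_y = grad @ C @ grad      # first-order estimate of the response variance
print(mean_y, np.sqrt(var_y))
```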

6.
This paper, which extends Suphaphat Kwonpongsagoon's PhD thesis (2006), investigates a stationary model designed to evaluate substance flows for a case study of cadmium (Cd) in Australia. It covers the mining industry, the production and use of goods for agriculture, construction and households, as well as the environmental sectors of agriculture, surface water and landfills. The model is calibrated with Cd flow data obtained in a previous study, and the results of the calibrated model are consistent with those of studies from other countries. Possible measures and options to reduce the Cd flows to the various environmental sectors are discussed by applying sensitivity analysis and parameter variations to the calibrated model. With "agriculture" used to illustrate one of the most important processes discussed in this paper, the results show that the most effective measures are the reduction of the Cd content in fertilizers and of atmospheric Cd deposition. It is concluded that a mathematical model is very useful for understanding a system that is crucial for environmental management.

7.
Variable screening and ranking using sampling-based sensitivity measures
This paper presents a methodology for screening out insignificant random variables and ranking the significant ones using sensitivity measures, including two cumulative distribution function (CDF)-based and two mean-response-based measures. The methodology features (1) the use of random samples to compute sensitivities and (2) the use of acceptance limits, derived from hypothesis testing, to classify random variables as significant or insignificant. Because no approximation is needed in either the form of the performance functions or the type of continuous distribution functions representing the input variables, the sampling-based approach can handle highly nonlinear functions with non-normal variables. The main characteristics and effectiveness of the sampling-based sensitivity measures are investigated using both simple and complex examples. Because the number of samples needed does not depend on the number of variables, the methodology is particularly suitable for problems with large, complex models that have many random variables but relatively few significant ones.
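A minimal sketch of the screening pattern follows: compute a sampling-based sensitivity measure per input from one set of random samples, then classify each input with a hypothesis-test acceptance limit. The Spearman rank correlation used here is a stand-in measure; the paper's CDF-based and mean-response-based measures are defined differently.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, k = 500, 8
X = rng.normal(size=(n, k))
y = 3 * X[:, 0] + X[:, 1]**3 + 0.1 * rng.normal(size=n)  # only x1, x2 matter

alpha = 0.05   # acceptance limit from the test of hypothesis
for i in range(k):
    rho, p_value = stats.spearmanr(X[:, i], y)
    verdict = "significant" if p_value < alpha else "insignificant"
    print(f"x{i+1}: rho={rho:+.3f}, p={p_value:.3g} -> {verdict}")
```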

8.
The effect of the distribution type of uncertain inputs on the probabilistic assessment result of a system is illustrated. The tested systems include a linear function, a positive exponential function, a negative exponential function, a reciprocal function, and a proposed corrosion mechanism for a radwaste package. The four types of distributions analyzed are the uniform (U), log-uniform (LU), normal (N), and log-normal (LN) distributions. Latin hypercube sampling (LHS) was applied to draw samples of the uncertain inputs, and the resulting data sets were arranged to be uncorrelated before computation. The Fourier amplitude sensitivity test (FAST) was also applied to calculate the sensitivity index for the four distributions. From the safety assessment point of view, the results of this paper provide a rationale for choosing between the U and LU distributions when the available data points are scarce. The FAST results indicate that the sensitivities of the four distributions are ordered S_U > S_LU > S_N > S_LN. This suggests a need to carefully identify whether the uncertain inputs follow a U distribution for the purpose of sensitivity analysis.
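A minimal sketch of LHS draws from the four distribution types compared above, with each column shuffled independently so the inputs remain uncorrelated. The distribution parameters are illustrative assumptions, not those of the paper.

```python
import numpy as np
from scipy import stats

def lhs_uniform01(n, k, rng):
    # one stratified draw per equal-probability slice, shuffled per column
    strata = rng.permuted(np.tile(np.arange(n), (k, 1)), axis=1).T
    return (strata + rng.random((n, k))) / n

rng = np.random.default_rng(3)
u = lhs_uniform01(1000, 4, rng)

samples = {
    "U":  stats.uniform(1.0, 9.0).ppf(u[:, 0]),        # U(1, 10)
    "LU": 10 ** stats.uniform(0.0, 1.0).ppf(u[:, 1]),  # log-uniform on [1, 10]
    "N":  stats.norm(5.5, 1.5).ppf(u[:, 2]),
    "LN": stats.lognorm(s=0.5, scale=np.exp(1.5)).ppf(u[:, 3]),
}
for name, x in samples.items():
    print(f"{name}: mean={x.mean():.2f}, std={x.std():.2f}")
```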

9.
This second part describes the application of the methodology for assessing the relative importance of uncertain structural parameters. The emphasis is on demonstrating that the proposed method can indeed handle large-scale problems relevant to industrial users. Four examples are included, of increasing complexity and difficulty. While the first two are simple tutorial-type problems, the remaining two deal with a large-scale application from aerospace engineering. The results demonstrate the remarkable efficiency of the method, even for problems with extremely high numbers of uncertain parameters.

10.
Sensitivity analysis screening methods aim to isolate the most important factors in experiments involving a large number of significant factors and interactions. This paper extends the one-factor-at-a-time screening method proposed by Morris. In addition to the 'overall' sensitivity measures already provided by the traditional Morris method, the new method offers estimates of the two-factor interaction effects. The number of model evaluations required is O(k²), where k is the number of model input factors. The efficient sampling strategy in the parameter space is based on concepts from graph theory and on the solution of the 'handcuffed prisoner problem'.

11.
A novel procedure for estimating the relative importance of the uncertain parameters of a complex FE model is presented. The method is specifically directed toward problems involving high-dimensional input parameter spaces, as encountered in the uncertainty analysis of large-scale, refined FE models. In these cases one is commonly faced with thousands of uncertain parameters, and traditional techniques, e.g. finite difference or direct differentiation methods, become expensive. In contrast, the presented method quickly filters out the most influential variables. Hence, the main objective is not to compute the sensitivity but to identify those parameters whose random variations have the biggest influence on the response. This is achieved by generating a set of samples with direct Monte Carlo simulation, closely scattered around the point at which the relative importance measures are sought. From these samples, estimators of the relative importance are synthesized, and the most important parameters are refined with a method of choice. In this paper, the underlying theory as well as the resulting algorithm is presented.
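The filtering idea can be illustrated with a short sketch: scatter Monte Carlo samples closely around a nominal point, regress the response on the standardized inputs, and rank parameters by the magnitude of the regression coefficients. Both the regression estimator and the toy high-dimensional response are stand-ins for the estimators developed in the paper.

```python
import numpy as np

def rank_importance(f, x0, sigma, n=2000, seed=4):
    rng = np.random.default_rng(seed)
    X = x0 + sigma * rng.normal(size=(n, len(x0)))   # samples scattered around x0
    y = f(X)
    Xc = (X - X.mean(0)) / X.std(0)                  # standardize the inputs
    beta, *_ = np.linalg.lstsq(Xc, y - y.mean(), rcond=None)
    return np.argsort(-np.abs(beta)), beta           # most influential first

# toy response: only a handful of the 1000 parameters matter
k = 1000
f = lambda X: 5 * X[:, 0] - 3 * X[:, 10] + 0.5 * X[:, 100] + X.sum(1) * 1e-3
order, beta = rank_importance(f, x0=np.zeros(k), sigma=0.01 * np.ones(k))
print(order[:5])   # indices 0, 10, 100 should lead the ranking
```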

12.
A cumulative distribution function (CDF)-based method has been used to perform sensitivity analysis on a computer model that conducts total system performance assessment of the proposed high-level nuclear waste repository at Yucca Mountain, and to identify the input parameters with the most influence on the output of the model. The performance assessment computer model, referred to as the TPA code, was recently developed by the US Nuclear Regulatory Commission (NRC) and the Center for Nuclear Waste Regulatory Analyses (CNWRA) to evaluate the performance assessments conducted by the US Department of Energy (DOE) in support of their license application. The model uses a probabilistic framework implemented through Monte Carlo or Latin hypercube sampling (LHS) to permit the propagation of uncertainties associated with model parameters, conceptual models, and future system states. The problem involves more than 246 uncertain parameters (also referred to as random variables), of which those that have a significant influence on the response, or on the uncertainty of the response, must be identified and ranked. The CDF-based approach identifies and ranks important parameters based on the sensitivity of the response CDF to the input parameter distributions. Based on a reliability sensitivity concept [AIAA Journal 32 (1994) 1717], the response CDF is defined as the integral of the joint probability density function of the input parameters, with a domain of integration defined by a subset of the samples. The sensitivity analysis does not require explicit knowledge of any specific relationship between the response and the input parameters, and the sensitivity depends on the magnitude of the response. The method allows sensitivity to be calculated over a wide range of the response and is not limited to the mean value.

13.
There are difficulties with probability as a representation of uncertainty. However, we argue that there is an important distinction between principle and practice. In principle, probability is uniquely appropriate for the representation and quantification of all forms of uncertainty; it is in this sense that we claim that 'probability is perfect'. In practice, people find it difficult to express their knowledge and beliefs in probabilistic form, so the elicitation of probability distributions is a far from perfect process. We therefore argue that there is no need for alternative theories, but that any practical elicitation of expert knowledge must fully acknowledge the imprecision in the resulting distribution. We outline a recently developed Bayesian technique that allows the imprecision in elicitation to be formulated explicitly, and apply it to some of the challenge problems.

14.
The New Morris Method was proposed by Campolongo and Braddock [Reliab. Engng Syst. Saf. 64 (1999) 1] as an extension of the Morris Method [Technometrics 33 (1991) 161] that includes estimation of two-factor interaction effects. An undetected programming error prevented Campolongo and Braddock from appreciating the efficacy of the method. Testing on an analytic function reveals that the method is more powerful and efficient than previously thought.

15.
An overview is first given of current biomechanical problems resulting from inadequate models of the human skeletal, muscular, and neural subsystems. The application areas most affected by these deficiencies are clinical gait and general motion analysis, orthopaedics, skeletal muscle research, and all disciplines requiring computer-simulated responses of human body models under various conditions. The fundamentals of neuromyoskeletal systems modeling techniques are then discussed, and a scheme for successfully creating and validating a complex model is suggested. Finally, a short selection of currently used large-scale neuromyoskeletal models and their features is presented, together with anticipated future directions in large-scale neuromyoskeletal modeling.

16.
This paper focuses on sensitivity analysis of results from computer models in which both epistemic and aleatory uncertainties are present. Sensitivity is defined in the sense of "uncertainty importance" in order to identify and rank the principal sources of epistemic uncertainty. A natural and consistent way to arrive at sensitivity results in such cases would be a two-dimensional or double-loop nested Monte Carlo sampling strategy, in which the epistemic parameters are sampled in the outer loop and the aleatory variables are sampled in the nested inner loop. However, the computational effort of this procedure may be prohibitive for complex and time-demanding codes. This paper therefore suggests an approximate method for sensitivity analysis based on particular one-dimensional or single-loop sampling procedures, which require substantially less computational effort. From the results of such sampling one can obtain approximate estimates of several standard uncertainty importance measures for the aleatory probability distributions and related probabilistic quantities of the model outcomes of interest. The reliability of the approximate sensitivity results depends on the effect of all epistemic uncertainties on the total joint epistemic and aleatory uncertainty of the outcome. The magnitude of this effect can be expressed quantitatively and estimated from the same single-loop samples; the higher it is, the more accurate the approximate sensitivity results will be. A case study, which shows that the results from the proposed approximate method are comparable to those obtained with the full two-dimensional approach, is provided.
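For concreteness, here is a minimal sketch of the full double-loop scheme that the paper's single-loop procedure approximates: epistemic parameters are drawn in the outer loop, aleatory variables in the nested inner loop, and the spread of the inner-loop means summarizes the epistemic contribution. The toy model, distributions, and loop sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
n_epi, n_ale = 200, 1000

model = lambda theta, x: theta[0] * x**2 + theta[1]

inner_means = []
for _ in range(n_epi):                              # outer loop: epistemic draws
    theta = rng.uniform([0.5, -1.0], [1.5, 1.0])    # uncertain model parameters
    x = rng.normal(0.0, 1.0, n_ale)                 # inner loop: aleatory variable
    inner_means.append(model(theta, x).mean())      # aleatory expectation per theta

inner_means = np.array(inner_means)
# spread of E[y | theta] across epistemic draws = epistemic contribution
print("epistemic std of E[y|theta]:", inner_means.std())
```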

17.
An important problem in the analysis of computer experiments is the specification of the uncertainty of a prediction made with a metamodel. The Bayesian approach, developed for the uncertainty analysis of deterministic computer models, expresses uncertainty through a Gaussian process. There are several versions of the Bayesian approach, which differ in many regards, but all of them lead to time-consuming computations for large data sets. In the present paper we introduce a new approach in which the distribution of uncertainty is obtained in a general nonparametric form. The proposed approach, called non-parametric uncertainty analysis (NPUA), is computationally simple since it combines generic sampling and regression techniques. We compare NPUA with the Bayesian and Kriging approaches and show the advantages of NPUA for finding points for the next runs by reanalyzing the ASET model.

18.
A simple measure of uncertainty importance based on the entire change of cumulative distribution functions (CDFs) has been developed for use in probabilistic safety assessments (PSAs). The entire change of the CDFs is quantified in terms of the metric distance between two CDFs. The metric distance measure developed in this study reflects the relative impact of distributional changes of the inputs on the change of the output distribution, whereas most existing uncertainty importance measures reflect the magnitude of the relative contribution of input uncertainties to the output uncertainty. The present measure has been evaluated analytically for various analytical distributions to examine its characteristics. To illustrate the applicability and strength of the present measure, two examples are provided: the first applies the measure to a typical system fault tree analysis, and the second concerns a hypothetical non-linear model. Comparisons of the present results with those obtained by existing uncertainty importance measures show that the metric distance measure is a useful tool for expressing uncertainty importance in terms of the relative impact of distributional changes of the inputs on the change of the output distribution.
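A minimal sketch of the underlying idea follows: compare the output CDF before and after a change in one input's distribution and report the distance between the two empirical CDFs. The L1 (area) metric and the toy model below are illustrative assumptions about the specific metric used in the paper.

```python
import numpy as np

def cdf_distance(a, b, grid):
    Fa = np.searchsorted(np.sort(a), grid, side="right") / len(a)  # empirical CDFs
    Fb = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return np.sum(np.abs(Fa - Fb)) * (grid[1] - grid[0])  # area between the CDFs

rng = np.random.default_rng(6)
n = 100_000
model = lambda x1, x2: x1 + 0.2 * x2**2

# base case vs. a case where the distribution of x1 is changed
y_base = model(rng.normal(0, 1.0, n), rng.normal(0, 1, n))
y_pert = model(rng.normal(0, 1.5, n), rng.normal(0, 1, n))

grid = np.linspace(-10, 10, 2001)
print("metric distance:", cdf_distance(y_base, y_pert, grid))
```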

19.
In this article, we apply supersaturated strategies to identify active factors in the field of numerical simulation. As the number of simulations is smaller than the number of studied factors, new resolution techniques are proposed and provide excellent results; indeed, classical supersaturated treatments cannot be applied to the high-dimensional problem. We show that these designs are well adapted to the high-dimensional problem while keeping the number of simulations at reasonable levels. Very small supersaturated designs, however, show low reliability.

20.
Software reliability assessment models in use today treat software as a monolithic block. An aversion towards 'atomic' models seems to exist. These models appear to add complexity to the modeling and to the data collection, and seem intrinsically difficult to generalize. In 1997, we introduced an architecturally based software reliability model called FASRE. The model is based on an architecture derived from the requirements, which captures both functional and nonfunctional requirements, and on a generic classification of functions, attributes and failure modes. The model focuses on the evaluation of failure mode probabilities and uses a Bayesian quantification framework. Failure mode probabilities of functions and attributes are propagated to the system level using fault trees. It can incorporate any type of prior information, such as results of developers' testing or historical information on a specific functionality and its attributes, and is ideally suited for reusable software. By building an architecture and deriving its potential failure modes, the model forces early appraisal and understanding of the weaknesses of the software, allows reliability analysis of the structure of the system, and provides assessments at the functional as well as the system level. In order to quantify the probability of failure (or the probability of success) of a specific element of our architecture, data are needed; the term 'element of the architecture' is used here in its broadest sense to mean a single failure mode or a higher level of abstraction such as a function. The paper surveys the potential sources of software reliability data available during software development. The mechanisms for incorporating these sources of relevant data into the FASRE model are then identified.
