Similar Documents (20 results found)
1.
For a risk assessment model, the uncertainty in the input parameters is propagated through the model and leads to uncertainty in the model output. Studying how the uncertainty in the output of a model can be apportioned to the uncertainty in the model inputs is the task of sensitivity analysis. Saltelli [Sensitivity analysis for importance assessment. Risk Analysis 2002;22(3):579-90] pointed out that a good sensitivity indicator should be global, quantitative and model free. Borgonovo [A new uncertainty importance measure. Reliability Engineering and System Safety 2007;92(6):771-84] extended these three requirements by adding a fourth feature, moment independence, and proposed a new sensitivity measure, δi. It evaluates the influence of the input uncertainty on the entire output distribution without reference to any specific moment of the model output. In this paper, a new computational method for δi is proposed. It is conceptually simple and easy to implement. The feasibility of the new method is demonstrated by applying it to two examples.
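The δi measure itself is well defined in the literature; as a purely illustrative aid, the sketch below estimates it by brute-force double-loop Monte Carlo with kernel density estimates. This is not the computational method proposed in the paper, and the toy model, input distributions and sample sizes are arbitrary assumptions.

```python
# Brute-force Monte Carlo sketch of the moment-independent measure
# delta_i = 0.5 * E_{X_i}[ integral |f_Y(y) - f_{Y|X_i}(y)| dy ].
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

def model(x):                          # hypothetical model, not from the paper
    return x[:, 0] + 2.0 * x[:, 1]

n, n_outer = 5000, 50
y = model(rng.normal(size=(n, 2)))
f_y = gaussian_kde(y)                  # unconditional output density
grid = np.linspace(y.min() - 1, y.max() + 1, 400)
dy = grid[1] - grid[0]

def delta(i):
    shifts = []
    for xi in rng.normal(size=n_outer):          # outer loop over values of X_i
        x_cond = rng.normal(size=(n, 2))
        x_cond[:, i] = xi                        # condition on X_i = xi
        f_cond = gaussian_kde(model(x_cond))     # conditional output density
        shifts.append(np.sum(np.abs(f_y(grid) - f_cond(grid))) * dy)
    return 0.5 * np.mean(shifts)

print("delta_1 ~", round(delta(0), 3), "  delta_2 ~", round(delta(1), 3))
```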

2.
A new uncertainty importance measure
Uncertainty in parameters is present in many risk assessment problems and leads to uncertainty in model predictions. In this work, we introduce a global sensitivity indicator which looks at the influence of input uncertainty on the entire output distribution without reference to a specific moment of the output (moment independence) and which can be defined also in the presence of correlations among the parameters. We discuss its mathematical properties and highlight the differences between the present indicator, variance-based uncertainty importance measures and a moment independent sensitivity indicator previously introduced in the literature. Numerical results are discussed with application to the probabilistic risk assessment model on which Iman [A matrix-based approach to uncertainty and sensitivity analysis for fault trees. Risk Anal 1987;7(1):22–33] first introduced uncertainty importance measures.

3.
Distribution Envelope Determination (DEnv) is a method for computing the CDFs of random variables whose samples are a function of samples of other random variable(s), termed inputs. DEnv computes envelopes around these CDFs when there is uncertainty about the precise form of the probability distribution describing any input. For example, inputs whose distribution functions have means and variances known only to within intervals can be handled. More generally, inputs can be handled if the set of all plausible cumulative distributions describing each input can be enclosed between left and right envelopes. Results will typically be in the form of envelopes when inputs are envelopes, when the dependency relationship of the inputs is unspecified, or both. For example, in the case of specific input distribution functions with unspecified dependency relationships, each of the infinite number of possible dependency relationships would imply some specific output distribution, and the set of all such output distributions can be bounded with envelopes. The DEnv algorithm is a way to obtain the bounding envelopes. DEnv is implemented in a tool which is used to solve problems from a benchmark set.

4.
Analysis of truncation limit in probabilistic safety assessment
A truncation limit defines the boundary between what is considered in a probabilistic safety assessment and what is neglected. The truncation limit of interest here is the cut-off on the size of the minimal cut set contribution. A new method was developed that defines the truncation limit for a probabilistic safety assessment. The method specifies truncation limits more stringently than existing documents dealing with truncation criteria in probabilistic safety assessment do. The results of this paper indicate that, if more accuracy is desired, the truncation limits for more complex probabilistic safety assessments, which consist of a larger number of basic events, should be more severe than presently recommended in existing documents. The truncation limits defined by the new method reduce the relative errors of importance measures and produce more accurate results for probabilistic safety assessment applications. The reduced relative errors of importance measures can prevent situations where the acceptability of a change to the equipment under investigation according to RG 1.174 would shift from the region where changes can be accepted to the region where they cannot, if the results were calculated with a smaller truncation limit.
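As a toy illustration of the trade-off the paper studies, not the proposed method for setting the limit, the sketch below sums hypothetical minimal cut set probabilities with and without a truncation limit and reports the resulting relative error of the top event frequency.

```python
# Illustrative sketch: effect of a truncation limit on the rare-event
# (sum of minimal cut sets) approximation of the top event frequency.
import numpy as np

rng = np.random.default_rng(1)
# hypothetical minimal cut set probabilities spanning many orders of magnitude
mcs = 10.0 ** rng.uniform(-12, -3, size=20000)

top_exact = mcs.sum()                       # no truncation
for limit in (1e-6, 1e-8, 1e-10, 1e-12):
    kept = mcs[mcs >= limit]                # cut sets surviving the truncation limit
    rel_err = (top_exact - kept.sum()) / top_exact
    print(f"limit={limit:.0e}  kept={kept.size:5d}  relative error={rel_err:.2e}")
```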

5.
In this paper we present a number of recent applications in which an emulator of a computer code is created using a Gaussian process model. Tools are then applied to the emulator to perform sensitivity analysis and uncertainty analysis. Sensitivity analysis is used both as an aid to model improvement and as a guide to how much the output uncertainty might be reduced by learning about specific inputs. Uncertainty analysis allows us to reflect output uncertainty due to unknown input parameters when the finished code is used for prediction. The computer codes themselves are currently being developed within the UK Centre for Terrestrial Carbon Dynamics.
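A minimal sketch of the emulator workflow is given below, assuming a cheap stand-in function in place of the carbon-dynamics codes, scikit-learn's GaussianProcessRegressor in place of the authors' emulator, and a crude double-loop estimate of first-order variance contributions evaluated on the emulator.

```python
# Sketch: fit a Gaussian process emulator to a few runs of an "expensive" code,
# then estimate first-order variance contributions cheaply on the emulator.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)

def expensive_code(x):                     # stand-in for the simulator
    return np.sin(3 * x[:, 0]) + 0.3 * x[:, 1] ** 2

X_train = rng.uniform(0, 1, size=(60, 2))  # a small design of code runs
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), normalize_y=True)
gp.fit(X_train, expensive_code(X_train))

# crude double-loop estimate of Var(E[Y|X_i]) / Var(Y) using only the emulator
N, M = 200, 200
var_y = gp.predict(rng.uniform(0, 1, size=(N * M, 2))).var()
for i in range(2):
    Xc = rng.uniform(0, 1, size=(N * M, 2))
    Xc[:, i] = np.repeat(rng.uniform(0, 1, size=N), M)   # freeze X_i per block
    cond_means = gp.predict(Xc).reshape(N, M).mean(axis=1)
    print(f"S_{i+1} ~ {cond_means.var() / var_y:.2f}")
```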

6.
Numerical simulators are widely used to model physical phenomena, and global sensitivity analysis (GSA) aims at studying the global impact of the input uncertainties on the simulator output. To perform GSA, statistical tools based on input/output dependence measures are commonly used. We focus here on the Hilbert–Schmidt independence criterion (HSIC). Sometimes, the probability distributions modeling the uncertainty of the inputs may themselves be uncertain, and it is important to quantify their impact on GSA results. We call this second-level global sensitivity analysis (GSA2). However, GSA2, when performed with a Monte Carlo double loop, requires a large number of model evaluations, which is intractable for CPU-time-expensive simulators. To cope with this limitation, we propose a new statistical methodology based on a Monte Carlo single loop with a limited calculation budget. First, we build a unique sample of inputs and simulator outputs from a well-chosen probability distribution of the inputs. From this sample, we perform GSA for various assumed probability distributions of the inputs by using weighted HSIC measure estimators. Statistical properties of these weighted estimators are demonstrated. Subsequently, we define second-level HSIC-based measures between the distributions of the inputs and the GSA results, which constitute the GSA2 indices. The efficiency of our GSA2 methodology is illustrated on an analytical example, comparing several technical options. Finally, an application to a test case simulating a severe accident scenario in a nuclear reactor is provided.
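For the first-level screening step only, a minimal HSIC estimator with Gaussian kernels and the median bandwidth heuristic is sketched below; the weighted single-loop estimators and the second-level indices of the paper are not reproduced, and the toy inputs and output are assumptions.

```python
# Biased HSIC estimator: HSIC ~ trace(K H L H) / (n - 1)^2 with Gaussian kernels.
import numpy as np

def gram(v, bw):
    d2 = (v[:, None] - v[None, :]) ** 2
    return np.exp(-d2 / (2 * bw ** 2))

def hsic(x, y):
    n = len(x)
    bx = np.median(np.abs(x[:, None] - x[None, :])) or 1.0   # median heuristic
    by = np.median(np.abs(y[:, None] - y[None, :])) or 1.0
    K, L = gram(x, bx), gram(y, by)
    H = np.eye(n) - np.ones((n, n)) / n                      # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(500, 3))
Y = X[:, 0] ** 2 + 0.1 * X[:, 1]           # toy simulator output
for i in range(3):
    print(f"HSIC(X{i+1}, Y) ~ {hsic(X[:, i], Y):.4f}")
```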

7.
An uncertainty-based sensitivity index represents the contribution that uncertainty in model input Xi makes to the uncertainty in model output Y. This paper addresses the situation where the uncertainties in the model inputs are expressed as closed convex sets of probability measures, a situation that exists when inputs are expressed as intervals or sets of intervals with no particular distribution specified over the intervals, or as probability distributions with interval-valued parameters. Three different approaches to measuring uncertainty, and hence uncertainty-based sensitivity, are explored. Variance-based sensitivity analysis (VBSA) estimates the contribution that each uncertain input, acting individually or in combination, makes to the variance of the model output. The partial expected value of perfect information (partial EVPI) quantifies the (financial) value of learning the true numeric value of an input. For both of these sensitivity indices, the generalization to closed convex sets of probability measures yields lower and upper sensitivity indices. Finally, the use of relative entropy as an uncertainty-based sensitivity index is introduced and extended to the imprecise setting, drawing upon recent work on entropy measures for imprecise information.
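To make the idea of lower and upper sensitivity indices concrete, the sketch below computes a first-order variance-based index for a toy model while sweeping an interval-valued distribution parameter of one input and reports the resulting range. The model, distributions and interval are assumptions for illustration only; the partial-EVPI and relative-entropy indices of the paper are not shown.

```python
# First-order Sobol index of X1 for Y = X1^2 + X2, with the mean of X1 only
# known to lie in an interval -> lower/upper bounds on the index.
import numpy as np

rng = np.random.default_rng(4)

def model(x1, x2):
    return x1 ** 2 + x2

def first_order_index(mu1, N=300, M=300):
    var_y = model(rng.normal(loc=mu1, size=N * M), rng.normal(size=N * M)).var()
    x1_fixed = np.repeat(rng.normal(loc=mu1, size=N), M)      # freeze X1 per block
    cond_mean = model(x1_fixed, rng.normal(size=N * M)).reshape(N, M).mean(axis=1)
    return cond_mean.var() / var_y                            # Var(E[Y|X1]) / Var(Y)

indices = [first_order_index(mu) for mu in np.linspace(0.0, 2.0, 9)]  # mu1 in [0, 2]
print(f"S_1 lies approximately in [{min(indices):.2f}, {max(indices):.2f}]")
```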

8.
We conjecture that as the uncertainty of scheduling information increases, one should move from deterministic, through robust, to online scheduling techniques. Previously, extensive mathematical investigations have been carried out on the stability of a deterministic schedule under uncertain operation processing times. In this paper, we use an empirical approach and an entropy measure to justify the transition between deterministic, robust and online scheduling. The use of an entropy measure in our context can be perceived, in a broader sense, as a pro-active approach to dealing with changes in the level of information uncertainty and in the relative importance of each term in the total schedule execution cost. The level of information uncertainty may change due to the performance deterioration of processors (machines or humans) and the replacement of old machines with new ones; the changes in the relative importance of cost elements may be due to changes in shop-floor priorities and pressure from partners in the supply chain network. One can decide on the scheduling strategy to be employed based on the latest entropy value of the information considered and the relative importance of each cost term.
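The sketch below is only a cartoon of the idea, not the paper's entropy measure, cost terms or thresholds: the Shannon entropy of a discretized processing-time distribution stands in as a proxy for information uncertainty, and a hypothetical rule maps low, medium and high entropy to deterministic, robust and online scheduling.

```python
# Shannon entropy of a processing-time distribution as an uncertainty proxy,
# with hypothetical thresholds triggering a change of scheduling strategy.
import numpy as np

def shannon_entropy(probabilities):
    p = np.asarray(probabilities, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# hypothetical discretized processing-time distributions for one operation
scenarios = {
    "stable machine":       [0.90, 0.05, 0.05],
    "ageing machine":       [0.50, 0.30, 0.20],
    "new, unknown machine": [0.34, 0.33, 0.33],
}
for name, dist in scenarios.items():
    h = shannon_entropy(dist)
    # hypothetical switching rule: low entropy -> deterministic schedule,
    # medium -> robust schedule, high -> online (dispatching) rules
    strategy = "deterministic" if h < 0.8 else "robust" if h < 1.5 else "online"
    print(f"{name:22s} entropy={h:.2f} bits -> {strategy} scheduling")
```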

9.
The Epistemic Uncertainty Project of Sandia National Laboratories (NM, USA) proposed two challenge problems intended to assess the applicability and the relevant merits of modern mathematical theories of uncertainty in reliability engineering and risk analysis. This paper proposes a solution to Problem B: the response of a mechanical system with uncertain parameters. Random Set Theory is used to cope with both imprecision and dissonance affecting the available information. Imprecision results in an envelope of CDFs of the system response bounded by an upper CDF and a lower CDF. Different types of parameter discretizations are introduced. It is shown that: (i) when the system response presents extrema in the range of parameters considered, it is better to increase the fineness of the discretization than to invoke a global optimization tool; (ii) the response expectation differed by less than 0.5% when the number of function calls was increased 15.7 times; (iii) larger differences (4–5%) were obtained for the lower tails of the CDFs of the response. Further research is necessary to investigate (i) parameter discretizations aimed at increasing the accuracy of the CDFs (lower) tails; (ii) the role of correlation in combining information.
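A minimal random-set propagation sketch appears below: interval-valued focal elements with probability masses are pushed through a toy response function, and the resulting belief/plausibility bounds give the lower and upper CDFs. The response function, focal elements and discretization are assumptions; the actual Challenge Problem B system is not reproduced.

```python
# Random-set propagation: each focal interval of the parameter is mapped through
# the response, and belief/plausibility give lower/upper CDFs of the response.
import numpy as np

def response(a):                      # toy response standing in for the system
    return a ** 2 + 1.0

# hypothetical focal elements: (lower, upper, probability mass)
focal = [(0.0, 0.5, 0.2), (0.3, 0.8, 0.5), (0.6, 1.0, 0.3)]

images = []
for lo, hi, m in focal:
    grid = np.linspace(lo, hi, 101)   # coarse discretization of the focal interval
    y = response(grid)
    images.append((y.min(), y.max(), m))

def lower_upper_cdf(z):
    # upper CDF (plausibility): mass of focal elements that *can* lie below z
    # lower CDF (belief): mass of focal elements that are *certainly* below z
    upper = sum(m for ymin, ymax, m in images if ymin <= z)
    lower = sum(m for ymin, ymax, m in images if ymax <= z)
    return lower, upper

for z in (1.1, 1.3, 1.6, 2.0):
    lo_cdf, up_cdf = lower_upper_cdf(z)
    print(f"z={z:.1f}: lower CDF={lo_cdf:.2f}, upper CDF={up_cdf:.2f}")
```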

10.
In this article, the problem of choosing from a set of design alternatives based upon multiple, conflicting, and uncertain criteria is investigated. The problem of selection over multiple attributes becomes harder when risky alternatives exist. The overlap measure method developed in this article models two sources of uncertainty: imprecise or risky attribute values provided to the decision maker, and the inability of the decision maker to specify an exact desirable attribute value. The effects of these uncertainties are mitigated using the overlap measure metric. A subroutine to this method, called the robust alternative selection method, ensures that the winning alternative is insensitive to changes in the relative importance of the different design attributes. The overlap measure method can be used to model and handle various sources of uncertainty and can be applied to any number of multiattribute decision-making methods. In this article, it is applied to the hypothetical equivalents and inequivalents method, which is a multiattribute selection method under certainty.

11.
As a novel type of polynomial chaos expansion (PCE), the data-driven PCE (DD-PCE) approach has been developed and has a wide range of potential applications for uncertainty propagation. While research on DD-PCE is still ongoing, its merits compared with existing PCE approaches have yet to be fully understood and explored, and its limitations also need to be addressed. In this article, the Galerkin projection technique in conjunction with the moment-matching equations is employed in DD-PCE for higher-dimensional uncertainty propagation. The enhanced DD-PCE method is then compared with current PCE methods to fully investigate its relative merits through four numerical examples considering different cases of information for the random inputs. It is found that the proposed method can improve the accuracy or, in some cases, lead to comparable results, demonstrating its effectiveness and advantages. Its application to a Mars entry trajectory optimization problem further verifies its effectiveness.
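The heart of the data-driven construction is building orthogonal polynomials directly from raw sample moments; the sketch below shows that step for a single input only. The Galerkin projection and the higher-dimensional treatment of the article are not included, and the bimodal sample is an arbitrary assumption.

```python
# Monic polynomials orthogonal with respect to the raw sample moments of the
# data, whatever its distribution (one-dimensional moment-matching step only).
import numpy as np

rng = np.random.default_rng(5)
data = np.concatenate([rng.normal(-1.0, 0.3, 2000),   # arbitrary bimodal sample
                       rng.normal(1.5, 0.5, 1000)])

def monic_orthogonal_poly(samples, k):
    """Coefficients (c_0..c_k, with c_k = 1) of the degree-k monic polynomial
    orthogonal to x^0..x^{k-1} under the empirical measure of `samples`."""
    mu = [np.mean(samples ** i) for i in range(2 * k)]         # raw moments
    A = np.array([[mu[i + j] for i in range(k)] for j in range(k)])
    b = -np.array([mu[k + j] for j in range(k)])
    c = np.linalg.solve(A, b)                                  # c_0 .. c_{k-1}
    return np.append(c, 1.0)

p2 = monic_orthogonal_poly(data, 2)
vals = np.polyval(p2[::-1], data)           # polyval wants highest degree first
print("E[p2(X)]   ~", np.mean(vals))        # should be close to 0
print("E[X p2(X)] ~", np.mean(data * vals)) # should be close to 0
```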

12.
Graphite isotope ratio method (GIRM) is a technique that uses measurements and computer models to estimate total plutonium (Pu) production in a graphite-moderated reactor. First, isotopic ratios of trace elements in extracted graphite samples from the target reactor are measured. Then, computer models of the reactor relate those ratios to Pu production. Because Pu is controlled under non-proliferation agreements, an estimate of total Pu production is often required, and a declaration of total Pu might need to be verified through GIRM. In some cases, reactor information (such as core dimensions, coolant details, and operating history) are so well documented that computer models can predict total Pu production without the need for measurements. However, in most cases, reactor information is imperfectly known, so a measurement and model-based method such as GIRM is essential. Here, we focus on GIRM's estimation procedure and its associated uncertainty. We illustrate a simulation strategy for a specific reactor that estimates GIRM's uncertainty and determines which inputs contribute most to GIRM's uncertainty, including inputs to the computer models. These models include a “local” code that relates isotopic ratios to the local Pu production, and a “global” code that predicts the Pu production shape over the entire reactor. This predicted shape is included with other 3D basis functions to provide a “hybrid basis set” that is used to fit the local Pu production estimates. The fitted shape can then be integrated over the entire reactor to estimate total Pu production. This GIRM evaluation provides a good example of several techniques of uncertainty analysis and introduces new reasons to fit a function using basis functions in the evaluation of the impact of uncertainty in the true 3D shape.  相似文献   

13.
Uncertainty and sensitivity analysis for models with correlated parameters
When conducting sensitivity and uncertainty analysis, most global sensitivity techniques assume parameter independence. However, it is common for the parameters to be correlated with each other. For models with correlated inputs, we propose that the contribution of an individual parameter to the uncertainty in the model output be divided into two parts: the correlated contribution (from the correlated variations, i.e. variations of a parameter which are correlated with other parameters) and the uncorrelated contribution (from the uncorrelated variations, i.e. the unique variations of a parameter which cannot be explained by any other parameters). So far, only a few studies have been conducted to obtain sensitivity indices for models with correlated inputs, and these studies do not distinguish between the correlated and uncorrelated contributions of a parameter. In this study, we propose a regression-based method to quantitatively decompose the total uncertainty in the model output into partial variances contributed by the correlated variations and partial variances contributed by the uncorrelated variations. The proposed regression-based method is then applied to three test cases. The results show that the regression-based method can successfully measure the uncertainty contribution when the relationship between response and parameters is approximately linear.
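For a linear toy model with two correlated normal inputs, the sketch below illustrates the decomposition idea: the uncorrelated variation of an input is the residual from regressing it on the other input, and simple least-squares fits of the output give the total, uncorrelated and, by difference, correlated contributions. The model, correlation and noise level are assumptions, not the paper's test cases, and the full regression procedure of the paper is not reproduced.

```python
# Regression-based split of each input's variance contribution into
# uncorrelated (residual) and correlated parts for a linear toy model.
import numpy as np

rng = np.random.default_rng(6)
n = 20000
cov = np.array([[1.0, 0.7], [0.7, 1.0]])          # correlated inputs
X = rng.multivariate_normal([0.0, 0.0], cov, size=n)
Y = 2.0 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(0, 0.1, n)

def explained_fraction(y, regressors):
    """Fraction of Var(y) explained by an intercept + least-squares fit."""
    A = np.column_stack([np.ones(len(y))] + list(regressors))
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.var(A @ coef) / np.var(y)

for i in range(2):
    others = [X[:, j] for j in range(2) if j != i]
    # residual of X_i after removing what the other input explains
    A = np.column_stack([np.ones(n)] + others)
    beta, *_ = np.linalg.lstsq(A, X[:, i], rcond=None)
    resid = X[:, i] - A @ beta
    total = explained_fraction(Y, [X[:, i]])      # correlated + uncorrelated
    uncorr = explained_fraction(Y, [resid])       # uncorrelated contribution
    print(f"X{i+1}: total={total:.2f}  uncorrelated={uncorr:.2f}  correlated={total - uncorr:.2f}")
```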

14.
This paper develops a Bayesian methodology for assessing the confidence in model prediction by comparing the model output with experimental data when both are stochastic. The prior distribution of the response is first computed, which is then updated based on experimental observation using Bayesian analysis to compute a validation metric. A model error estimation methodology is then developed to include model form error, discretization error, stochastic analysis error (UQ error), input data error and output measurement error. Sensitivity of the validation metric to various error components and model parameters is discussed. A numerical example is presented to illustrate the proposed methodology.
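A heavily simplified conjugate-normal sketch of the prior-to-posterior updating step is given below; the paper's actual validation metric and its decomposition into model form, discretization, UQ, input and measurement errors are not reproduced, and all numbers are assumptions.

```python
# Conjugate normal-normal update of the response mean, followed by a simple
# metric: posterior probability that the model's prediction error is small.
import numpy as np
from scipy.stats import norm

prior_mean, prior_sd = 10.0, 2.0           # stochastic model prediction (prior)
obs = np.array([10.6, 9.8, 10.9])          # hypothetical experimental data
meas_sd = 0.5                              # assumed output measurement error

n = len(obs)
post_var = 1.0 / (1.0 / prior_sd**2 + n / meas_sd**2)
post_mean = post_var * (prior_mean / prior_sd**2 + obs.sum() / meas_sd**2)

tol = 1.0                                  # acceptable prediction error
metric = norm.cdf(prior_mean + tol, post_mean, np.sqrt(post_var)) - \
         norm.cdf(prior_mean - tol, post_mean, np.sqrt(post_var))
print(f"posterior mean={post_mean:.2f}, sd={np.sqrt(post_var):.2f}")
print(f"P(|true response - model prediction| < {tol}) = {metric:.2f}")
```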

15.
A Monte Carlo method is used for quantitative uncertainty evaluation of a multi-variable impact-force traceability system. The method first establishes characterization models for each uncertain variable of the impact-force traceability system; a fitness function for the sample size is then built from the error bound between adjacent iterative solutions in the sampling space; finally, the fitness function is used to make the input variables of the impact-force measurement model best approximate their population distributions. To verify the effectiveness of the method, it was applied to the uncertainty evaluation of an impact-force traceability system. The results show that, at the 95% confidence level, the relative expanded uncertainty of the impact-force traceability system is better than 0.818%, and that the uncertainty mainly originates from the non-uniform acceleration distribution on the upper surface of the drop hammer and from its lateral sway.
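An adaptive Monte Carlo sketch in the same spirit is shown below: the sample size is grown until successive estimates of the 95 % coverage interval agree within a tolerance, and a relative expanded uncertainty is then reported. The measurement model, input distributions and tolerance are placeholders, not the actual impact-force traceability system.

```python
# Adaptive Monte Carlo: double the sample size until adjacent iterations of the
# 95 % coverage interval agree, then report the relative expanded uncertainty.
import numpy as np

rng = np.random.default_rng(7)

def measurement_model(n):
    # hypothetical force = mass * acceleration with non-uniformity and sway terms
    m = rng.normal(10.0, 0.005, n)               # effective mass, kg
    a = rng.normal(100.0, 0.20, n)               # surface-averaged acceleration, m/s^2
    sway = rng.uniform(-0.002, 0.002, n)         # lateral sway, relative effect
    return m * a * (1.0 + sway)

def coverage_interval(y, p=0.95):
    return np.quantile(y, [(1 - p) / 2, (1 + p) / 2])

n, tol = 10_000, 0.05                            # tolerance on interval endpoints, N
prev = coverage_interval(measurement_model(n))
while True:
    n *= 2
    cur = coverage_interval(measurement_model(n))
    if np.max(np.abs(cur - prev)) < tol:         # adjacent iterations agree
        break
    prev = cur

y = measurement_model(n)
U_rel = (cur[1] - cur[0]) / 2 / y.mean() * 100   # relative expanded uncertainty, %
print(f"converged at n={n}: 95% interval=({cur[0]:.2f}, {cur[1]:.2f}) N, U_rel={U_rel:.3f}%")
```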

16.
For the interpretation of the results of probabilistic risk assessments, it is important to have measures which not only identify the basic events that contribute most to the frequency of the top event but also identify the basic events that are the main contributors to the uncertainty in this frequency. Both types of measures, often called the Importance Measure and the Measure of Uncertainty Importance, respectively, have been the subject of interest for many researchers in the reliability field. The most frequent mode of uncertainty analysis in connection with probabilistic risk assessment has been to propagate the uncertainty of all model parameters up to an uncertainty distribution for the top event frequency. Various uncertainty importance measures have been proposed in order to point out the parameters that in some sense are the main contributors to the top event distribution. The new measure of uncertainty importance suggested here goes a step further in that it has been developed within a decision theory framework, thereby providing an indication of the basic event on which it would be most valuable, from the decision-making point of view, to procure more information.

17.
Generalizing the safety factor approach
Safety factors (uncertainty factors) are used to avoid failure in a wide variety of practices and disciplines, in particular engineering design and toxicology. Although these two areas face similar problems in their use of safety factors, there are no signs of previous communication between the two disciplines. The present contribution aims at initiating such communication by pointing out parallel practices and joint issues between the two disciplines. These include the distinction between probabilistic variability and epistemic uncertainty, the importance of distribution tails, and the problem of countervailing risks. In conclusion, it is proposed that future research in this area should be interdisciplinary and make use of experience from the various areas in which safety factors are used.

18.
This work presents a framework for predicting the unknown probability distributions of input parameters, starting from scarce experimental measurements of other input parameters and of the Quantity of Interest (QoI), as well as a computational model of the system. This problem is relevant to aeronautics, an example being the calculation of the material properties of carbon fibre composites, which are often inferred from experimental measurements of the full-field response. The method presented here builds a probability distribution for the missing inputs with an approach based on probabilistic equivalence. The missing inputs are represented with a multi-modal Polynomial Chaos Expansion (mmPCE), a formulation which enables the algorithm to handle multi-modal experimental data efficiently. The parameters of the mmPCE are found through an optimisation process. The mmPCE is used to produce a dataset for the missing inputs, and the input uncertainties are then propagated through the computational model of the system using arbitrary Polynomial Chaos (aPC) in order to produce a probability distribution for the QoI. This is in addition to an estimate of the QoI's probability distribution arising from experimental measurements. The coefficients of the mmPCE are adjusted such that the statistical distance between the two estimates of the probability distribution of the QoI is minimised. The algorithm has two key aspects: the metric used to quantify the statistical distance between distributions and the aPC formulation used to propagate the input uncertainties. In this work the Kolmogorov–Smirnov (KS) distance was used to quantify the distance between probability distributions for the QoI, as it allows high-order statistical moments to be matched and is non-parametric. The framework for back-calculating unknown input distributions was demonstrated using a dataset comprising scarce experimental measurements of the material properties of a batch of carbon fibre coupons. The ability of the algorithm to back-calculate a distribution for the shear and compression strength of the composite, based on limited experimental data, was demonstrated. It was found that it was possible to recover reasonably accurate probability distributions for the missing material properties, even when an extremely scarce data set and a fairly simplistic computational model were used.
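The sketch below mimics the back-calculation loop in a heavily simplified form: a plain two-parameter normal distribution stands in for the multi-modal PCE of the missing input, a toy product model replaces the composite model and the aPC propagation, and scipy's two-sample Kolmogorov–Smirnov statistic is minimized. All names and numbers are assumptions, not the paper's data or implementation.

```python
# Back-calculate the distribution of a missing input by minimizing the KS
# distance between simulated and (scarce) measured QoI samples.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import ks_2samp

rng = np.random.default_rng(8)

def model(missing_input, known_input):          # toy computational model
    return missing_input * known_input

known_input = rng.normal(1.0, 0.05, 2000)       # well-characterized input
qoi_measured = rng.normal(50.0, 6.0, 30)        # scarce experimental QoI data
base = rng.normal(size=2000)                    # common random numbers for stability

def ks_distance(params):
    mu, log_sigma = params
    missing = mu + np.exp(log_sigma) * base     # candidate missing-input sample
    return ks_2samp(model(missing, known_input), qoi_measured).statistic

res = minimize(ks_distance, x0=[40.0, np.log(10.0)], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"back-calculated missing input: mean ~ {mu_hat:.1f}, sd ~ {sigma_hat:.1f}")
```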

19.
Imaging plays a major role in the management of patients, whether hospitalized or not, and a variety of imaging modalities is available depending on the patient's clinical problem. Radiology is the branch of medical science dealing with medical imaging. It may use X-ray machines or other radiation devices, as well as techniques that do not involve radiation, such as magnetic resonance imaging (MRI) and ultrasound (US). Commonly used modalities include plain radiography, computed tomography (CT), MRI, US, and nuclear imaging. Each of these modalities has strengths and limitations that dictate its use in diagnosis, and the choice of modality for a particular problem must be reviewed with regard to how the image is generated, its cost, its strengths and weaknesses, and the associated risks. Interest in image retrieval stems from the rapid growth in image acquisition: physicians and radiologists benefit from retrieval techniques when explaining findings to patients, searching current and past records in large databases, and reaching faster and more accurate decisions in surgery and medicine. Similarity measures, also termed distance metrics, play an important role in content-based image retrieval (CBIR) and content-based medical image retrieval (CBMIR): they quantify the visual similarity between the query image and the images in the database, which are then ranked by their similarity to the query. Because different similarity measures affect a retrieval system very differently, it is important to identify the best distance metric for a CBIR system. In this article, various distance methods are compared for effective medical image retrieval using a double-step approach. Easily computable distance measures are applied to features such as probability, mean, standard deviation, skew, energy, and entropy. The distance metrics used are Euclidean, Manhattan, Mahalanobis, Canberra, Bray-Curtis, squared chord, and squared chi-squared, and two decision rules, precision and accuracy, are used to assess retrieval. A dataset was created from several imaging modalities, including CT, MRI, and US images. The results show that each distance metric, combined with each measure, retrieves medical images with different precision and recall. The Euclidean and Mahalanobis metrics perform best with the probability measure (75% precision, 30% recall), whereas the Manhattan, Canberra, Bray-Curtis, squared chord, and squared chi-squared metrics perform best with the mean measure (50% precision, 20% recall). These results indicate that these easily computable similarity distance measures have a wide variety of medical image retrieval applications.
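For concreteness, the sketch below implements the distance metrics named in the abstract over placeholder feature vectors (random numbers standing in for the probability, mean, standard deviation, skew, energy and entropy features) and ranks a hypothetical database against a query; it is not the article's double-step retrieval system.

```python
# Distance metrics from the abstract applied to placeholder image feature vectors.
import numpy as np

rng = np.random.default_rng(9)
database = rng.uniform(0.01, 1.0, size=(100, 6))    # 100 "images", 6 features each
query = rng.uniform(0.01, 1.0, size=6)
VI = np.linalg.inv(np.cov(database, rowvar=False))  # inverse covariance for Mahalanobis

def euclidean(a, b):   return np.sqrt(np.sum((a - b) ** 2))
def manhattan(a, b):   return np.sum(np.abs(a - b))
def mahalanobis(a, b): d = a - b; return np.sqrt(d @ VI @ d)
def canberra(a, b):    return np.sum(np.abs(a - b) / (np.abs(a) + np.abs(b)))
def bray_curtis(a, b): return np.sum(np.abs(a - b)) / np.sum(np.abs(a + b))
def squared_chord(a, b):       return np.sum((np.sqrt(a) - np.sqrt(b)) ** 2)
def squared_chi_squared(a, b): return np.sum((a - b) ** 2 / (a + b))

metrics = {"Euclidean": euclidean, "Manhattan": manhattan, "Mahalanobis": mahalanobis,
           "Canberra": canberra, "Bray-Curtis": bray_curtis,
           "squared chord": squared_chord, "squared chi-squared": squared_chi_squared}

for name, metric in metrics.items():
    distances = np.array([metric(query, img) for img in database])
    print(f"{name:20s} 5 nearest images: {np.argsort(distances)[:5]}")
```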

20.
This paper presents results from research aimed at developing a real-time, ‘look ahead’ reliability measure of risk relative to tool wear or breakage. The objective is to present the results of a successful research effort that has developed a real-time conditional reliability (of individual tool survival) output from two inputs: (1) real-time thrust data from an individual tool, and (2) an expected thrust performance population model. The resulting conditional tool reliability measure is compatible with real-time tool management. It is used to estimate the conditional probability of tool survival in a drilling operation.
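As a purely illustrative sketch (the paper's thrust population model and reliability formulation are not reproduced), the code below computes a conditional Weibull survival probability for the next drilling interval and degrades the life scale when the measured thrust exceeds the expected population thrust; all parameter values are assumptions.

```python
# Conditional reliability R(t + dt | survived to t) = R(t + dt) / R(t) with a
# Weibull tool-life model whose scale shrinks as measured thrust rises above
# the expected population thrust.
import numpy as np

beta, eta_nominal = 3.0, 500.0          # hypothetical Weibull shape / scale (holes)

def weibull_reliability(t, eta):
    return np.exp(-(t / eta) ** beta)

def conditional_reliability(t_now, look_ahead, thrust_ratio):
    # crude adjustment: higher-than-expected thrust shortens the effective life
    eta = eta_nominal / max(thrust_ratio, 1.0)
    return weibull_reliability(t_now + look_ahead, eta) / weibull_reliability(t_now, eta)

for ratio in (1.0, 1.1, 1.3):           # measured thrust / expected thrust
    r = conditional_reliability(t_now=300, look_ahead=50, thrust_ratio=ratio)
    print(f"thrust ratio {ratio:.1f}: P(survive next 50 holes | survived 300) = {r:.3f}")
```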
