Similar Literature
20 similar articles found (search time: 12 ms)
1.
The 1996 performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP) maintains a separation between stochastic (i.e. aleatory) and subjective (i.e. epistemic) uncertainty, with stochastic uncertainty arising from the possible disruptions that could occur at the WIPP over the 10,000-yr regulatory period specified by the US Environmental Protection Agency (40 CFR 191, 40 CFR 194) and subjective uncertainty arising from an inability to uniquely characterize many of the inputs required in the 1996 WIPP PA. The characterization of subjective uncertainty is discussed, including assignment of distributions, uncertain variables selected for inclusion in analysis, correlation control, sample size, statistical confidence on mean complementary cumulative distribution functions, generation of Latin hypercube samples, sensitivity analysis techniques, and scenarios involving stochastic and subjective uncertainty.
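As a point of reference for the sampling technique used throughout these abstracts, here is a minimal sketch of Latin hypercube sample generation in Python/NumPy. The uniform marginals, sample size, and function names are illustrative assumptions, not the 1996 WIPP PA implementation.

```python
import numpy as np

def latin_hypercube(n_samples, n_vars, rng=None):
    """Generate an n_samples x n_vars Latin hypercube sample on [0, 1)^n_vars.

    Each variable's range is split into n_samples equal strata; exactly one
    point falls in each stratum, and strata are paired at random across variables.
    """
    rng = np.random.default_rng(rng)
    # One random point inside each of the n_samples strata, per variable.
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_vars))) / n_samples
    # Randomly permute the strata independently for each variable.
    for j in range(n_vars):
        u[:, j] = rng.permutation(u[:, j])
    return u

# Example: a sample of size 100 over 3 uniform [0, 1) inputs.
lhs = latin_hypercube(100, 3, rng=42)
```

Non-uniform marginals would then be obtained by pushing each column through the inverse CDF of the assigned distribution.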

2.
Uncertainty and sensitivity analysis results obtained with random and Latin hypercube sampling are compared. The comparison uses results from a model for two-phase fluid flow obtained with three independent random samples of size 100 each and three independent Latin hypercube samples (LHSs) of size 100 each. Uncertainty and sensitivity analysis results with the two sampling procedures are similar and stable across the three replicated samples. Poor performance of regression-based sensitivity analysis procedures for some analysis outcomes results more from the inappropriateness of the procedure for the nonlinear relationships between model input and model results than from an inadequate sample size. Kendall's coefficient of concordance (KCC) and the top down coefficient of concordance (TDCC) are used to assess the stability of sensitivity analysis results across replicated samples, with the TDCC providing a more informative measure of analysis stability than KCC. A new sensitivity analysis procedure based on replicated samples and the TDCC is introduced.
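The TDCC can be computed from Savage scores, following Iman and Conover's construction. The sketch below, applied to hypothetical rank data, is one plausible implementation rather than the authors' code.

```python
import numpy as np

def savage_scores(ranks):
    """Savage scores for integer ranks 1..n (rank 1 = most important)."""
    n = len(ranks)
    # S(r) = sum_{k=r}^{n} 1/k, so top-ranked variables get the largest scores.
    tail = np.cumsum(1.0 / np.arange(n, 0, -1))[::-1]  # tail[r-1] = S(r)
    return tail[np.asarray(ranks) - 1]

def tdcc(rank_matrix):
    """Top down coefficient of concordance for a (b replicates x n variables)
    matrix of importance ranks (Iman & Conover, 1987)."""
    R = np.asarray(rank_matrix)
    b, n = R.shape
    S = np.apply_along_axis(savage_scores, 1, R)   # Savage-score each replicate
    col_sums = S.sum(axis=0)
    harmonic = (1.0 / np.arange(1, n + 1)).sum()
    return (np.sum(col_sums**2) - b**2 * n) / (b**2 * (n - harmonic))

# Hypothetical importance ranks of 5 variables from 3 replicated samples.
ranks = [[1, 2, 3, 4, 5],
         [1, 3, 2, 4, 5],
         [2, 1, 3, 5, 4]]
print(tdcc(ranks))  # a high value indicates a stable ranking of the top variables
```

Because Savage scores weight the top ranks most heavily, disagreement about unimportant variables barely affects the TDCC, which is what makes it more informative than the KCC for this purpose.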

3.
Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis were used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The following results were obtained in tests to check the robustness of the analysis techniques: two independent Latin hypercube samples produced similar uncertainty and sensitivity analysis results; setting important variables to best-estimate values produced substantial reductions in uncertainty, while setting the less important variables to best-estimate values had little effect on uncertainty; similar sensitivity analysis results were obtained when the original uniform and loguniform distributions assigned to the 34 imprecisely known input variables were changed to left-triangular distributions and then to right-triangular distributions; and analyses with rank-transformed and logarithmically-transformed data produced similar results and substantially outperformed analyses with raw (i.e., untransformed) data.
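One of the techniques named above, the partial rank correlation coefficient, admits a compact implementation: rank-transform all data, then correlate the residuals of the input and the output after regressing out the other inputs. The toy model below is a placeholder, not the MACCS analysis.

```python
import numpy as np

def ranks(a):
    """Integer ranks 1..n (ties ignored; fine for continuous samples)."""
    return np.argsort(np.argsort(a)) + 1.0

def prcc(X, y, j):
    """Partial rank correlation coefficient between input column j and output y,
    controlling for the remaining inputs by linear regression on the ranks."""
    Xr = np.apply_along_axis(ranks, 0, np.asarray(X))
    yr = ranks(np.asarray(y))
    A = np.column_stack([np.ones(len(yr)), np.delete(Xr, j, axis=1)])
    # Residuals of rank(x_j) and rank(y) after removing the other inputs' effect.
    rx = Xr[:, j] - A @ np.linalg.lstsq(A, Xr[:, j], rcond=None)[0]
    ry = yr - A @ np.linalg.lstsq(A, yr, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

# Toy check on a monotone nonlinear model: y = exp(x0) + 0.1 * x2.
rng = np.random.default_rng(1)
X = rng.random((200, 3))
y = np.exp(X[:, 0]) + 0.1 * X[:, 2]
print([round(prcc(X, y, j), 2) for j in range(3)])  # x0 dominant, x1 near zero
```

The rank transform is what lets this linear machinery cope with monotone nonlinear input-output relationships, consistent with the finding above that rank-transformed analyses outperform raw-data analyses.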

4.
A cumulative distribution function (CDF)-based method has been used to perform sensitivity analysis on a computer model that conducts total system performance assessment of the proposed high-level nuclear waste repository at Yucca Mountain, and to identify the most influential input parameters affecting the output of the model. The performance assessment computer model, referred to as the TPA code, was recently developed by the US Nuclear Regulatory Commission (NRC) and the Center for Nuclear Waste Regulatory Analyses (CNWRA) to evaluate the performance assessments conducted by the US Department of Energy (DOE) in support of their license application. The model uses a probabilistic framework implemented through Monte Carlo or Latin hypercube sampling (LHS) to permit the propagation of uncertainties associated with model parameters, conceptual models, and future system states. The problem involves more than 246 uncertain parameters (also referred to as random variables), of which the ones that have significant influence on the response or the uncertainty of the response must be identified and ranked. The CDF-based approach identifies and ranks important parameters based on the sensitivity of the response CDF to the input parameter distributions. Based on a reliability sensitivity concept [AIAA Journal 32 (1994) 1717], the response CDF is defined as the integral of the joint probability density function of the input parameters, with a domain of integration that is defined by a subset of the samples. The sensitivity analysis does not require explicit knowledge of any specific relationship between the response and the input parameters, and the sensitivity is dependent upon the magnitude of the response. The method allows for calculating sensitivity over a wide range of the response and is not limited to the mean value.
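The flavor of such a sampling-based CDF sensitivity can be conveyed with a score-function identity in the spirit of the reliability sensitivity concept cited above: for an input with density f(x; theta), dF_Y(y)/d(theta) = E[1{Y <= y} * d ln f(X; theta)/d(theta)]. The sketch below uses a placeholder response model and a single normal input; it is not the TPA implementation.

```python
import numpy as np

# Sensitivity of the response CDF F_Y(y) to a distribution parameter,
# estimated from one Monte Carlo sample via the score-function identity:
#   dF_Y(y)/d(theta) = E[ 1{Y <= y} * d ln f(X; theta)/d(theta) ].
rng = np.random.default_rng(0)
mu, sigma, n = 1.0, 0.5, 100_000
x = rng.normal(mu, sigma, n)
y = x**2 + np.sin(x)                      # placeholder response model

y0 = np.quantile(y, 0.9)                  # probe the CDF in its upper range
indicator = (y <= y0).astype(float)
score_mu = (x - mu) / sigma**2            # d ln f / d mu for a normal input
dF_dmu = np.mean(indicator * score_mu)    # sensitivity of F_Y(y0) w.r.t. mu
print(f"F_Y(y0)={indicator.mean():.3f}, dF/dmu={dF_dmu:.3f}")
```

Because y0 can be placed anywhere along the response range, the sensitivity can indeed be mapped over a wide range of the response rather than only at the mean.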

5.
Local and global uncertainty analysis of complex chemical kinetic systems
Computer modelling plays a crucial part in the understanding of complex chemical reactions. Parameters of elementary chemical and physical processes are usually determined in independent experiments and are always associated with uncertainties. Two typical examples of complex chemical kinetic systems are the combustion of gases and the photochemical processes in the atmosphere. In this study, local uncertainty analysis, the Morris method, and Monte Carlo analysis with Latin hypercube sampling were applied to an atmospheric and to a combustion model. These models had 45 and 37 variables along with 141 and 212 uncertain parameters, respectively. The toolkit used here consists of complementary methods and is able to map both the sources and the magnitudes of uncertainties. In the case of the combustion model, the global uncertainties of the local sensitivity coefficients were also investigated, and the order of parameter importance based on local sensitivities was found to be almost independent of the parameter values within their range of uncertainty.
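A minimal sketch of the Morris (elementary effects) screening method named above; the grid layout and the placeholder model standing in for an atmospheric or combustion simulation are illustrative assumptions.

```python
import numpy as np

def morris(model, k, r=20, levels=4, rng=None):
    """Elementary-effects (Morris) screening on [0, 1]^k.

    Returns mu_star (mean absolute elementary effect) and sigma (their
    standard deviation) for each of the k inputs, from r one-at-a-time
    trajectories on a grid with the given number of levels.
    """
    rng = np.random.default_rng(rng)
    delta = levels / (2.0 * (levels - 1))          # standard Morris step
    ee = np.zeros((r, k))
    for t in range(r):
        # Random grid base point low enough that +delta stays inside [0, 1].
        x = rng.integers(0, levels // 2, k) / (levels - 1)
        fx = model(x)
        for i in rng.permutation(k):               # perturb inputs one at a time
            x2 = x.copy()
            x2[i] += delta
            fx2 = model(x2)
            ee[t, i] = (fx2 - fx) / delta
            x, fx = x2, fx2
    return np.abs(ee).mean(axis=0), ee.std(axis=0)

# Placeholder model standing in for a kinetic simulation.
mu_star, sigma = morris(lambda x: np.sin(x[0]) + 5 * x[1] ** 2 + 0.1 * x[2], k=3)
print(mu_star)   # large mu_star => influential; large sigma => nonlinear/interacting
```

Morris screening sits between the local and the full Monte Carlo analyses in cost, which is why the three methods are complementary.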

6.
7.
Epistemic uncertainty analysis is an essential feature of any model application subject to ‘state of knowledge’ uncertainties. Such analysis is usually carried out on the basis of a Monte Carlo simulation sampling the epistemic variables and performing the corresponding model runs. In situations, however, where aleatory uncertainties are also present in the model, an adequate treatment of both types of uncertainties would require a two-stage nested Monte Carlo simulation, i.e. sampling the epistemic variables (‘outer loop’) and nested sampling of the aleatory variables (‘inner loop’). It is clear that for complex and long running codes the computational effort to perform all the resulting model runs may be prohibitive. Therefore, an approach of approximate epistemic uncertainty analysis is suggested which is based solely on two simple Monte Carlo samples: (a) joint sampling of both epistemic and aleatory variables simultaneously, and (b) sampling of the aleatory variables alone with the epistemic variables held fixed at their reference values. The applications of this approach to dynamic reliability analyses presented in this paper look quite promising and suggest that performing such an approximate epistemic uncertainty analysis is preferable to the alternative of not performing any.
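The run-count argument can be made concrete with a toy model: a full two-stage nested simulation costs n_e x n_a model runs, while the approximate approach needs only two simple samples of size n. Everything below (the model, the distributions, the reference value, the comparison of spreads) is an illustrative assumption, not the paper's combination rule.

```python
import numpy as np

rng = np.random.default_rng(3)

def model(e, a):
    """Placeholder for a long-running code: epistemic parameter e, aleatory draw a."""
    return e * a + a**2

E_REF = 1.0                      # assumed reference value of the epistemic variable

# Full treatment: two-stage nested Monte Carlo (n_e * n_a model runs).
n_e, n_a = 100, 100
outer = rng.uniform(0.5, 1.5, n_e)                 # epistemic sample ('outer loop')
nested = np.array([
    np.mean(model(e, rng.normal(0, 1, n_a)))       # aleatory sample ('inner loop')
    for e in outer
])

# Approximate treatment from the abstract: two simple samples of size n.
n = 100
e_joint = rng.uniform(0.5, 1.5, n)                 # (a) epistemic and aleatory jointly
y_joint = model(e_joint, rng.normal(0, 1, n))
y_ref = model(E_REF, rng.normal(0, 1, n))          # (b) aleatory only, epistemic fixed

# 2 * n runs instead of n_e * n_a: the extra spread of the joint sample over the
# reference-value sample is one rough indication of the epistemic contribution.
print(nested.std(), y_joint.std(), y_ref.std())
```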

8.
Probabilistic sensitivities provide important insight in reliability analysis and are often crucial to understanding the physical behaviour underlying failure and to modifying the design to mitigate and manage risk. This article presents a new computational approach for calculating stochastic sensitivities of mechanical systems with respect to distribution parameters of random variables. The method involves high dimensional model representation and score functions associated with the probability distribution of a random input. The proposed approach facilitates first- and second-order approximation of stochastic sensitivity measures and statistical simulation. The formulation is general such that any simulation method can be used for the computation, such as Monte Carlo, importance sampling, Latin hypercube, etc. Both the probabilistic response and its sensitivities can be estimated from a single probabilistic analysis, without requiring gradients of the performance function. Numerical results indicate that the proposed method provides accurate and computationally efficient estimates of sensitivities of statistical moments or reliability of structural systems.
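The single-analysis claim can be illustrated with the score-function identity: the same sample that estimates a response moment also estimates its derivative with respect to a distribution parameter, with no gradients of the performance function. The cubic response and the normal input below are placeholders, not the paper's HDMR formulation.

```python
import numpy as np

# From one Monte Carlo sample we estimate both a response moment and its
# sensitivity to a distribution parameter:
#   dE[Y]/d(mu) = E[ Y * d ln f(X; mu, sigma)/d(mu) ]   (score-function identity).
rng = np.random.default_rng(7)
mu, sigma, n = 0.0, 1.0, 200_000
x = rng.normal(mu, sigma, n)
y = x**3 + 2 * x                          # placeholder performance function

score_mu = (x - mu) / sigma**2            # d ln f / d mu for a normal input
print("E[Y]       ~", y.mean())
print("dE[Y]/dmu  ~", np.mean(y * score_mu))   # exact value is 3*sigma**2 + 2 = 5
```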

9.
The influence of uncertainty in packaging-system parameters on vibration reliability is studied, and the sensitivity of the vibration reliability index to each uncertain parameter is analyzed. A Karhunen-Loève expansion is used to represent stationary random vibration with a prescribed spectral characteristic in a space of standard normal random variables, and the first-order reliability method is applied to analyze the vibration reliability index of a linear packaging system. Four random parameters are considered: the elastic and damping characteristics of the cushioning material, and the elastic and damping characteristics between the main body of the product and its fragile component...
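A minimal sketch of a discrete Karhunen-Loève expansion, representing a stationary Gaussian process in independent standard normal variables as the abstract describes; the exponential covariance, correlation length, and truncation criterion are illustrative assumptions.

```python
import numpy as np

# Discrete Karhunen-Loeve expansion of a zero-mean stationary Gaussian process,
# expressed in independent standard normal variables xi_k.
t = np.linspace(0.0, 1.0, 200)
corr_len = 0.2                                             # assumed correlation length
C = np.exp(-np.abs(t[:, None] - t[None, :]) / corr_len)    # covariance matrix

lam, phi = np.linalg.eigh(C)                # eigenpairs, ascending order
lam, phi = lam[::-1], phi[:, ::-1]          # sort descending
m = np.searchsorted(np.cumsum(lam) / lam.sum(), 0.99) + 1  # terms for 99% energy

rng = np.random.default_rng(5)
xi = rng.standard_normal(m)                 # standard normal coordinates
x = phi[:, :m] @ (np.sqrt(lam[:m]) * xi)    # one truncated-KL realization
print(m, x.shape)
```

Once the excitation lives in the standard normal space, first-order reliability machinery (search for the most probable failure point, reliability index) applies directly.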

10.
A Monte Carlo-based analysis method considering random initial imperfections
The various direct and indirect methods for specifying initial imperfections adopted in current design codes do not properly account for the random distribution of member imperfections and therefore exaggerate the influence of initial geometric imperfections. Given that measured initial imperfections in real structures are randomly distributed, this study uses the Monte Carlo method to simulate the random initial geometric imperfections of structural members. To further improve sampling efficiency, Latin hypercube sampling with sampling memory is adopted, which avoids repeated sampling within the sampling space and effectively reduces the number of simulations. The results show that the proposed analysis method, implemented in a purpose-written program and accounting for the random initial geometric imperfections of steel frames, can simulate the random distribution and combination of initial imperfections in the advanced analysis of multi-story and high-rise steel frame structures, laying a foundation for accurate and effective advanced analysis and design research.

11.
12.
Z. Li, L. B. Duan, T. Chen & W. Yao, Engineering Optimization, 2019, 51(8): 1393-1411
Design optimization plays an important role in electric vehicle (EV) design. However, fluctuations in design variables and noise factors during the forming process affect the stability of optimization results. This study uses six-sigma robust design optimization to explore the lightweight design and crashworthiness of EVs under uncertainty. A full-scale finite element model of an EV is established. Then, multi-objective design optimization is performed by integrating optimal Latin hypercube sampling, radial basis functions and the non-dominated sorting genetic algorithm-II to achieve minimum peak acceleration and mass. Finally, six-sigma robust optimization designs are applied to improve the reliability and sigma level. Robust optimization using adaptive importance sampling is shown to be more efficient than that using Monte Carlo sampling. Moreover, deformation of the battery compartment and peak acceleration of the B-pillar are greatly decreased. The EV's safety performance is improved and the weight reduction is substantial, demonstrating the engineering practicability of the method.

13.
Uncertainty and sensitivity analysis results obtained in the 1996 performance assessment for the Waste Isolation Pilot Plant are presented for two-phase flow in the vicinity of the repository under undisturbed conditions. Techniques based on Latin hypercube sampling, examination of scatterplots, stepwise regression analysis, partial correlation analysis and rank transformations are used to investigate brine inflow, gas generation, repository pressure, brine saturation, and brine and gas outflow. Of the variables under study, repository pressure is potentially the most important due to its influence on spallings and direct brine releases, with the uncertainty in its value being dominated by the extent to which the microbial degradation of cellulose takes place, the rate at which the corrosion of steel takes place, and the amount of brine that drains from the surrounding disturbed rock zone into the repository.
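Forward stepwise regression on rank-transformed data, one of the techniques listed above, can be sketched as follows; the stopping threshold and the toy model are illustrative assumptions, not the WIPP analysis settings.

```python
import numpy as np

def r_squared(A, y):
    """R^2 of an ordinary least-squares fit of y on the columns of A (plus intercept)."""
    A1 = np.column_stack([np.ones(len(y)), A])
    beta, *_ = np.linalg.lstsq(A1, y, rcond=None)
    res = y - A1 @ beta
    return 1.0 - res.var() / y.var()

def stepwise(X, y, min_gain=0.02):
    """Forward stepwise regression: greedily add the input that most increases
    R^2, stopping when the gain drops below min_gain. Returns selected columns."""
    selected, r2 = [], 0.0
    remaining = list(range(X.shape[1]))
    while remaining:
        gains = [(r_squared(X[:, selected + [j]], y), j) for j in remaining]
        best_r2, best_j = max(gains)
        if best_r2 - r2 < min_gain:
            break
        selected.append(best_j)
        remaining.remove(best_j)
        r2 = best_r2
    return selected, r2

# Rank-transform inputs and output first, then select (placeholder model).
rng = np.random.default_rng(4)
X = rng.random((300, 6))
y = np.exp(2 * X[:, 0]) + X[:, 3] + 0.01 * rng.standard_normal(300)
rank = lambda a: np.argsort(np.argsort(a, axis=0), axis=0) + 1.0
print(stepwise(rank(X), rank(y)))   # expect columns 0 and 3 selected
```

The order in which variables enter, and the incremental R^2 at each step, provide the importance ranking used in sensitivity studies of this kind.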

14.
The 1996 performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP) maintains a separation between stochastic (i.e. aleatory) and subjective (i.e. epistemic) uncertainty, with stochastic uncertainty arising from the possible disruptions that could occur at the WIPP over the 10,000-yr regulatory period specified by the US Environmental Protection Agency (40 CFR 191, 40 CFR 194) and subjective uncertainty arising from an inability to uniquely characterize many of the inputs required in the 1996 WIPP PA. The characterization of stochastic uncertainty is discussed, including drilling intrusion time, drilling location, penetration of excavated/nonexcavated areas of the repository, penetration of pressurized brine beneath the repository, borehole plugging patterns, activity level of waste, and occurrence of potash mining. Additional topics discussed include sampling procedures, generation of individual 10,000-yr futures for the WIPP, construction of complementary cumulative distribution functions (CCDFs), mechanistic calculations carried out to support CCDF construction, the Kaplan/Garrick ordered triple representation for risk, and determination of scenarios and scenario probabilities.
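Given a sample of 10,000-yr futures and a total release computed for each, the CCDF construction itself is simple; the sketch below uses placeholder lognormal "releases" rather than WIPP results.

```python
import numpy as np

# Empirical CCDF from sampled futures: for each release level R, the fraction
# of sampled futures whose total release exceeds R.
rng = np.random.default_rng(11)
releases = rng.lognormal(mean=-3.0, sigma=2.0, size=10_000)   # placeholder data

r_sorted = np.sort(releases)
exceed_prob = 1.0 - np.arange(1, len(r_sorted) + 1) / len(r_sorted)
# (r_sorted, exceed_prob) is the empirical CCDF: prob(Release > r_sorted[i]).
print(r_sorted[-5:], exceed_prob[-5:])
```

In the full analysis, one such CCDF arises for each element of the epistemic (subjective) sample, giving a family of CCDFs rather than a single curve.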

15.
Three applications of sampling-based sensitivity analysis in conjunction with evidence theory representations for epistemic uncertainty in model inputs are described: (i) an initial exploratory analysis to assess model behavior and provide insights for additional analysis; (ii) a stepwise analysis showing the incremental effects of uncertain variables on complementary cumulative belief functions and complementary cumulative plausibility functions; and (iii) a summary analysis showing a spectrum of variance-based sensitivity analysis results that derive from probability spaces that are consistent with the evidence space under consideration.
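For readers unfamiliar with the evidence-theory quantities, the sketch below computes complementary cumulative belief and plausibility functions from focal elements whose output ranges are assumed known; the intervals and basic probability assignments are illustrative, not drawn from the paper.

```python
# Each focal element of the evidence space maps to an interval [lo, hi] of
# possible model outputs with basic probability assignment m (weights sum to 1).
focal = [  # (lo, hi, m) -- illustrative assumptions
    (0.0, 2.0, 0.3),
    (1.0, 3.0, 0.4),
    (2.5, 5.0, 0.3),
]

def ccbf(y):
    """Belief that the output exceeds y: focal elements entirely above y."""
    return sum(m for lo, hi, m in focal if lo > y)

def ccpf(y):
    """Plausibility that the output exceeds y: focal elements overlapping (y, inf)."""
    return sum(m for lo, hi, m in focal if hi > y)

for y in (0.5, 1.5, 2.75, 4.0):
    print(y, ccbf(y), ccpf(y))   # CCBF <= any consistent CCDF <= CCPF
```

The gap between the two curves is the expression of epistemic imprecision; any probability space consistent with the evidence space yields a CCDF lying between them.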

16.
Development of probabilistic sensitivities is frequently considered an essential component of a probabilistic analysis and is often critical to understanding the physical mechanisms underlying failure and to modifying the design to mitigate and manage risk. One useful sensitivity is the partial derivative of the probability-of-failure and/or the system response with respect to the parameters of the independent input random variables. Calculation of these partial derivatives has been established in terms of an expected value operation (sometimes called the score function or likelihood ratio method). The partial derivatives can be computed with typically insignificant additional computational cost given the failure samples and kernel functions, which are the partial derivatives of the log of the probability density function (PDF) with respect to the parameters of the distribution. The formulation is general such that any sampling method can be used for the computation, such as Monte Carlo, importance sampling, Latin hypercube, etc. In this paper, useful universal properties of the kernel functions that must be satisfied for all two-parameter independent distributions are derived. These properties are then used to develop distribution-free analytical expressions of the partial derivatives of the response moments (mean and standard deviation) with respect to the PDF parameters for linear and quadratic response functions. These universal properties can be used to facilitate development and verification of the required kernel functions and to develop an improved understanding of the model for design considerations.
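As a concrete instance, the kernel functions of the two-parameter normal distribution are written out below, together with a numerical check of one universal property of such kernels (zero mean under the input PDF, which follows because the PDF integrates to one for all parameter values) and of the mean-derivative result for a linear response under a normal input. The parameter values are arbitrary.

```python
import numpy as np

# Kernel functions for a normal input: partial derivatives of ln f(x; mu, sigma)
# with respect to the two distribution parameters:
#   d ln f / d mu    = (x - mu) / sigma**2
#   d ln f / d sigma = ((x - mu)**2 - sigma**2) / sigma**3
rng = np.random.default_rng(9)
mu, sigma = 2.0, 0.5
x = rng.normal(mu, sigma, 500_000)

k_mu = (x - mu) / sigma**2
k_sigma = ((x - mu) ** 2 - sigma**2) / sigma**3
print(k_mu.mean(), k_sigma.mean())          # both ~ 0 (zero-mean property)

# Sensitivity of the mean of a linear response y = a + b*x via the kernel:
a, b = 1.0, 3.0
y = a + b * x
print(np.mean(y * k_mu))                    # ~ b, since E[Y] = a + b*mu
```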

17.
This paper focuses on the computation of statistical moments of strains and stresses in a random system model where uncertainty is modeled by a stochastic finite element method based on the polynomial chaos expansion. It identifies the cases where this objective can be achieved by analytical means using the orthogonality property of the chaos polynomials and those where it requires a numerical integration technique. To this effect, the applicability and efficiency of several numerical integration schemes are considered. These include Gauss–Hermite quadrature with the direct tensor product (also known as the Kronecker product), Smolyak's approximation of such a tensor product, Monte Carlo sampling, and the Latin hypercube sampling method. An algorithm for reducing the dimensionality of integration under a direct tensor product is also explored for optimizing the computational cost and complexity. The convergence rate and algorithmic complexity of all of these methods are discussed and illustrated with the non-deterministic linear stress analysis of a plate.
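A small numerical illustration of the direct tensor-product Gauss–Hermite rule against Monte Carlo sampling for a two-dimensional response moment; the response function is a placeholder, not the plate stress analysis.

```python
import numpy as np

# Mean of g(X1, X2) for independent standard normal inputs via a tensor-product
# Gauss-Hermite rule, compared with Monte Carlo. NumPy's hermgauss targets the
# integral of exp(-x**2)*f(x), so nodes are rescaled by sqrt(2) and weights by
# 1/sqrt(pi) to match the standard normal density.
g = lambda x1, x2: np.exp(0.3 * x1) + x2**2          # placeholder response

nodes, weights = np.polynomial.hermite.hermgauss(8)  # 8 points per dimension
z = np.sqrt(2.0) * nodes
w = weights / np.sqrt(np.pi)

# Direct (Kronecker) tensor product over the two dimensions: 8**2 = 64 evaluations.
Z1, Z2 = np.meshgrid(z, z)
W = np.outer(w, w)
mean_gh = np.sum(W * g(Z1, Z2))

rng = np.random.default_rng(2)
x = rng.standard_normal((100_000, 2))
mean_mc = g(x[:, 0], x[:, 1]).mean()
print(mean_gh, mean_mc)   # exact value: exp(0.045) + 1
```

The 8^d growth of the tensor-product point count with dimension d is precisely what motivates Smolyak sparse grids and the dimension-reduction algorithm mentioned above.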

18.
Numerical simulators are widely used to model physical phenomena, and global sensitivity analysis (GSA) aims at studying the global impact of the input uncertainties on the simulator output. To perform GSA, statistical tools based on input/output dependence measures are commonly used. We focus here on the Hilbert–Schmidt independence criterion (HSIC). Sometimes, the probability distributions modeling the uncertainty of inputs may themselves be uncertain, and it is important to quantify their impact on GSA results. We call this second-level global sensitivity analysis (GSA2). However, GSA2, when performed with a Monte Carlo double loop, requires a large number of model evaluations, which is intractable with CPU-time-expensive simulators. To cope with this limitation, we propose a new statistical methodology based on a Monte Carlo single loop with a limited calculation budget. First, we build a unique sample of inputs and simulator outputs from a well-chosen probability distribution of inputs. From this sample, we perform GSA for various assumed probability distributions of inputs by using weighted HSIC measure estimators. Statistical properties of these weighted estimators are demonstrated. Subsequently, we define second-level HSIC-based measures between the distributions of inputs and GSA results, which constitute GSA2 indices. The efficiency of our GSA2 methodology is illustrated on an analytical example, thereby comparing several technical options. Finally, an application to a test case simulating a severe accident scenario in a nuclear reactor is provided.
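A plain (unweighted) biased V-statistic estimator of HSIC with Gaussian kernels conveys the dependence measure at the heart of the method; the weighted estimators proposed in the paper are not reproduced here, and the bandwidth heuristic and toy data are assumptions.

```python
import numpy as np

def gaussian_gram(a, bandwidth):
    """Gram matrix K[i, j] = exp(-(a_i - a_j)**2 / (2 * bandwidth**2)) for 1-D data."""
    d2 = (a[:, None] - a[None, :]) ** 2
    return np.exp(-d2 / (2.0 * bandwidth**2))

def hsic(x, y):
    """Biased V-statistic estimator of HSIC(x, y) with Gaussian kernels and
    the common median-distance bandwidth heuristic."""
    n = len(x)
    bw = lambda a: np.median(np.abs(a[:, None] - a[None, :])) + 1e-12
    K, L = gaussian_gram(x, bw(x)), gaussian_gram(y, bw(y))
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    return np.trace(K @ H @ L @ H) / n**2

# Toy check: a dependent pair versus an independent pair.
rng = np.random.default_rng(6)
x = rng.standard_normal(300)
print(hsic(x, x**2), hsic(x, rng.standard_normal(300)))  # first is much larger
```

HSIC vanishes (in the population) only under independence, which is what makes it attractive as a GSA screening measure even for non-monotone input/output relationships.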

19.
The paper presents a model that extends the stochastic finite element method to the modelling of the transitional energetic–statistical size effect in unnotched quasibrittle structures of positive geometry (i.e. failing at the start of macro-crack growth), and to the low-probability tail of the structural strength distribution, important for safe design. For small structures, the model captures the energetic (deterministic) part of the size effect and, for large structures, it converges to the Weibull statistical size effect required by the weakest-link model of extreme value statistics. Prediction of tails of extremely low probability such as one in a million, which need to be known for safe design, is made feasible by the fact that the form of the cumulative distribution function (cdf) of a quasibrittle structure of any size has been established analytically in previous work. Thus, it is not necessary to turn to sophisticated methods such as importance sampling, and it suffices to calibrate only the mean and variance of this cdf. Two kinds of stratified sampling of strength in a finite element code are studied. One is Latin hypercube sampling of the strength of each element considered as an independent random variable, and the other is the Latin square design, in which the strength of each element is sampled from one overall cdf of random material strength. The former is found to give a closer estimate of variance, while the latter gives a cdf with smaller scatter and a better mean for the same number of simulations. For large structures, the number of simulations required to obtain the mean size effect is greatly reduced by adopting the previously proposed method of random property blocks. Each block is assumed to have a homogeneous random material strength, the mean and variance of which are scaled down according to the block size using the weakest-link model for a finite number of links. To check whether the theoretical cdf is followed at least up to the tail beginning at a failure probability of about 0.01, a hybrid of stratified sampling and Monte Carlo simulation in the lowest probability stratum is used. With the present method, the probability distribution of strength of quasibrittle structures of positive geometry can be easily estimated for any structure size.
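The weakest-link scaling that the random property blocks rely on can be shown in a few lines: for N independent links, the strength cdf is P_N(s) = 1 - (1 - P_1(s))^N. The Weibull form and the parameters of the single-block cdf below are illustrative assumptions.

```python
import numpy as np

# Weakest-link model for a finite number of links: a structure of N blocks
# fails when its weakest block fails, so its strength cdf is
#   P_N(s) = 1 - (1 - P_1(s))**N.
s = np.linspace(0.0, 2.0, 400)
s0, m = 1.0, 12.0                          # scale and Weibull modulus (assumed)
P1 = 1.0 - np.exp(-((s / s0) ** m))        # single-block strength cdf

for N in (1, 10, 100, 1000):
    PN = 1.0 - (1.0 - P1) ** N
    # E[S] = integral of the survival function; mean strength drops with size,
    # which is the statistical part of the size effect.
    mean_strength = np.sum(1.0 - PN) * (s[1] - s[0])
    print(N, round(mean_strength, 3))
```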

20.
The Waste Isolation Pilot Plant (WIPP) is under development by the US Department of Energy (DOE) for the geologic disposal of transuranic waste. The construction of complementary cumulative distribution functions (CCDFs) for total radionuclide release from the WIPP to the accessible environment is described. The resultant CCDFs (i) combine releases due to cuttings and cavings, spallings, direct brine release, and long-term transport in flowing groundwater; (ii) fall substantially to the left of the boundary line specified by the US Environmental Protection Agency's (EPA's) standard 40 CFR 191 for the geologic disposal of radioactive waste; and (iii) constitute an important component of the DOE's successful Compliance Certification Application to the EPA for the WIPP. Insights and perspectives gained in the performance assessment (PA) that led to these CCDFs are described, including the importance of: (i) an iterative approach to PA; (ii) uncertainty and sensitivity analysis; (iii) a clear conceptual model for the analysis; (iv) the separation of stochastic (i.e. aleatory) and subjective (i.e. epistemic) uncertainty; (v) quality assurance procedures; (vi) early involvement of peer reviewers, regulators, and stakeholders; (vii) avoidance of conservative assumptions; and (viii) adequate documentation.
