Similar Documents
1.
The following techniques for uncertainty and sensitivity analysis are briefly summarized: Monte Carlo analysis, differential analysis, response surface methodology, Fourier amplitude sensitivity test, Sobol' variance decomposition, and fast probability integration. Desirable features of Monte Carlo analysis in conjunction with Latin hypercube sampling are described in discussions of the following topics: (i) properties of random, stratified and Latin hypercube sampling, (ii) comparisons of random and Latin hypercube sampling, (iii) operations involving Latin hypercube sampling (i.e. correlation control, reweighting of samples to incorporate changed distributions, replicated sampling to test reproducibility of results), (iv) uncertainty analysis (i.e. cumulative distribution functions, complementary cumulative distribution functions, box plots), (v) sensitivity analysis (i.e. scatterplots, regression analysis, correlation analysis, rank transformations, searches for nonrandom patterns), and (vi) analyses involving stochastic (i.e. aleatory) and subjective (i.e. epistemic) uncertainty.
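A minimal sketch of the random-versus-Latin-hypercube comparison described above, using scipy.stats.qmc; the two-input model f, the sample size, and the unit-cube input ranges are illustrative assumptions, not from the paper.

```python
# Sketch: propagate input uncertainty through a stand-in model with simple
# random sampling and with Latin hypercube sampling (LHS). The model f and
# the sample size are assumptions for illustration only.
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(0)
n, d = 100, 2  # sample size and number of uncertain inputs

def f(x):
    # stand-in model: any deterministic function of the sampled inputs
    return x[:, 0] ** 2 + 3.0 * x[:, 1]

x_rand = rng.random((n, d))                # simple random sample on [0, 1)^d
sampler = qmc.LatinHypercube(d=d, seed=0)  # one point per equal-probability stratum
x_lhs = sampler.random(n)

for label, x in (("random", x_rand), ("LHS", x_lhs)):
    y = f(x)
    # summary statistics feeding CDF/CCDF/box-plot presentations of uncertainty
    print(f"{label:6s} mean={y.mean():.3f}  std={y.std(ddof=1):.3f}")
```

The stratification in LHS typically yields more stable estimates of the output distribution at a given sample size, which is the property the comparisons in the paper examine.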

2.
In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. This presentation discusses and illustrates the conceptual and computational basis of QMU in analyses that use computational models to predict the behavior of complex systems. The following topics are considered: (i) the role of aleatory and epistemic uncertainty in QMU, (ii) the representation of uncertainty with probability, (iii) the probabilistic representation of uncertainty in QMU analyses involving only epistemic uncertainty, and (iv) the probabilistic representation of uncertainty in QMU analyses involving aleatory and epistemic uncertainty.
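As a numerical illustration of the margin-to-uncertainty comparison that QMU formalizes, the sketch below uses one common convention (margin M from the best-estimate response to the requirement, uncertainty U from the best estimate to an extreme quantile); the bound, the distribution, and the choice of M/U convention are assumptions, not the presentation's definitions.

```python
# Sketch: a QMU-style margin/uncertainty ratio for a "stay below the bound"
# requirement. The bound, the response distribution, and the use of the
# 95th percentile to quantify U are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
bound = 10.0  # performance requirement: response must remain below this value

# sampled model predictions of the system response
response = rng.normal(loc=7.0, scale=0.8, size=10_000)

M = bound - response.mean()                        # margin to the requirement
U = np.quantile(response, 0.95) - response.mean()  # uncertainty toward the bound
print(f"M = {M:.2f}, U = {U:.2f}, M/U = {M / U:.2f}")
# M/U > 1: the 95th-percentile response still falls below the requirement
```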

3.
Effects of uncertainties in gas damping models, geometry and mechanical properties on the dynamics of micro-electro-mechanical systems (MEMS) capacitive switches are studied. A sample of typical capacitive switches has been fabricated and characterized at Purdue University. High-fidelity simulations of gas damping on planar microbeams are developed and verified under relevant conditions. This and other gas damping models are then applied to study the dynamics of a single closing event for switches with experimentally measured properties. It is demonstrated that although all damping models considered predict a similar damping quality factor and agree well in predictions of closing time, the models differ by a factor of two or more in predicting the impact velocity and acceleration at contact. Implications of parameter uncertainties for key reliability-related parameters such as the pull-in voltage, closing time and impact velocity are discussed. A notable effect of uncertainty is that the nominal switch, i.e. the switch with the average properties, does not actuate at the mean actuation voltage. Additionally, the device-to-device variability leads to significant differences in dynamics. For example, the mean impact velocity for switches actuated at the 90%-actuation voltage (about 150 V), i.e. the voltage required to actuate 90% of the sample, is about 129 cm/s and increases to 173 cm/s at the 99%-actuation voltage (about 173 V). Response surfaces relating impact velocity and closing time to five input variables were constructed using the Smolyak sparse grid algorithm. The sensitivity analysis showed that impact velocity is most sensitive to the damping coefficient, whereas the closing time is most affected by geometric parameters such as gap and beam thickness.
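The paper builds its response surfaces with the Smolyak sparse-grid algorithm; as a simplified stand-in, the sketch below fits an ordinary least-squares quadratic surface to samples of a toy closing-time model and ranks the two inputs with standardized regression coefficients. The input ranges and the toy model are illustrative assumptions.

```python
# Sketch: response-surface surrogate plus a regression-based sensitivity
# ranking. A least-squares quadratic replaces the paper's Smolyak sparse
# grid; the toy closing-time model and input ranges are assumptions.
import numpy as np

rng = np.random.default_rng(2)
n = 200
gap = rng.uniform(2.0, 4.0, n)    # gap (um), assumed range
thick = rng.uniform(0.5, 1.5, n)  # beam thickness (um), assumed range

# toy model standing in for the full MEMS closing-event simulation
t_close = 5.0 + 2.0 * gap**2 / thick + rng.normal(0.0, 0.1, n)

# quadratic surface: t ~ b0 + b1*g + b2*h + b3*g^2 + b4*h^2 + b5*g*h
X = np.column_stack([np.ones(n), gap, thick, gap**2, thick**2, gap * thick])
beta, *_ = np.linalg.lstsq(X, t_close, rcond=None)

g, h = 3.0, 1.0  # cheap surrogate evaluation at a new design point
t_hat = np.array([1.0, g, h, g * g, h * h, g * h]) @ beta

# standardized regression coefficients as a simple sensitivity measure
A = np.column_stack([gap, thick])
A_std = (A - A.mean(0)) / A.std(0)
y_std = (t_close - t_close.mean()) / t_close.std()
src, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), A_std]), y_std, rcond=None)
print(f"t_hat({g}, {h}) = {t_hat:.2f}; SRC (gap, thickness) = {np.round(src[1:], 3)}")
```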

4.
In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, “Quantification of Margins and Uncertainties: Conceptual and Computational Basis,” describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.
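A minimal evidence-theory sketch of the belief/plausibility bracketing that the presentation contrasts with a single probability; the interval focal elements and basic probability assignments are assumed for illustration.

```python
# Sketch: belief and plausibility of the event "x <= bound" under evidence
# (Dempster-Shafer) theory. The focal elements and basic probability
# assignments (BPAs) are illustrative assumptions.

# body of evidence: (interval, BPA) pairs; the BPAs sum to 1
evidence = [((0.0, 2.0), 0.3), ((1.0, 4.0), 0.5), ((3.0, 6.0), 0.2)]
bound = 3.5

bel = sum(m for (lo, hi), m in evidence if hi <= bound)  # wholly inside the event
pl = sum(m for (lo, hi), m in evidence if lo <= bound)   # intersects the event

print(f"Bel(x <= {bound}) = {bel:.2f}, Pl(x <= {bound}) = {pl:.2f}")
# the [Bel, Pl] interval brackets the imprecisely known probability, in
# contrast to the single number a probabilistic representation assigns
```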

5.
In 2001, the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy (DOE) in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories) initiated development of a process designated quantification of margins and uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, “Quantification of Margins and Uncertainties: Conceptual and Computational Basis,” describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples. The basic ideas and challenges that underlie NNSA's mandate for QMU are present, and have been successfully addressed, in a number of past analyses for complex systems. To provide perspective on the implementation of a requirement for QMU in the analysis of a complex system, three past analyses are presented as examples: (i) the probabilistic risk assessment carried out for the Surry Nuclear Power Station as part of the U.S. Nuclear Regulatory Commission's (NRC's) reassessment of the risk from commercial nuclear power in the United States (i.e., the NUREG-1150 study), (ii) the performance assessment for the Waste Isolation Pilot Plant carried out by the DOE in support of a successful compliance certification application to the U.S. Environmental Protection Agency, and (iii) the performance assessment for the proposed high-level radioactive waste repository at Yucca Mountain, Nevada, carried out by the DOE in support of a license application to the NRC. Each of the preceding analyses involved a detailed treatment of uncertainty and produced results used to establish compliance with specific numerical requirements on the performance of the system under study. As a result, these studies illustrate the determination of both margins and the uncertainty in margins in real analyses.

6.
A deep geologic repository for high-level radioactive waste is under development by the US Department of Energy (DOE) at Yucca Mountain (YM), Nevada. As mandated in the Energy Policy Act of 1992, the US Environmental Protection Agency has promulgated public health and safety standards (i.e., 40 CFR Part 197) for the YM repository, and the US Nuclear Regulatory Commission has promulgated licensing standards (i.e., 10 CFR Parts 2, 19, 20, etc.) consistent with 40 CFR Part 197 that the DOE must establish are met in order for the YM repository to be licensed for operation. Important requirements in 40 CFR Part 197 and 10 CFR Parts 2, 19, 20, etc. relate to the determination of expected (i.e., mean) dose to a reasonably maximally exposed individual (RMEI) and the incorporation of uncertainty into this determination. This paper is the first part of a two-part presentation and describes how general and typically nonquantitative statements in 40 CFR Part 197 and 10 CFR Parts 2, 19, 20, etc. can be given a formal mathematical structure that facilitates both the calculation of expected dose to the RMEI and the appropriate separation in this calculation of aleatory uncertainty (i.e., randomness in the properties of future occurrences such as igneous and seismic events) and epistemic uncertainty (i.e., lack of knowledge about quantities that are imprecisely known but assumed to have constant values in the calculation of expected dose to the RMEI). The second part of this presentation is contained in the following paper, “Computational Implementation of Sampling-Based Approaches to the Calculation of Expected Dose in Performance Assessments for the Proposed High-Level Radioactive Waste Repository at Yucca Mountain, Nevada,” and both describes and illustrates sampling-based procedures for the estimation of expected dose and the determination of the uncertainty in estimates for expected dose.
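A minimal sketch of the aleatory/epistemic separation the paper formalizes: an outer loop samples quantities that are imprecisely known but constant in any one future, and an inner loop averages over random future occurrences. The dose model, time window, and distributions are illustrative assumptions, not the performance-assessment model.

```python
# Sketch: nested (double-loop) sampling separating epistemic and aleatory
# uncertainty in an expected-dose calculation. The event-rate and
# dose-per-event distributions and the 10^4-yr window are invented.
import numpy as np

rng = np.random.default_rng(3)
n_epi, n_ale = 100, 1_000

expected_dose = np.empty(n_epi)
for i in range(n_epi):
    # epistemic: imprecisely known, but constant within any one realization
    rate = rng.uniform(0.1, 1.0)          # event rate (1/yr), assumed range
    dose_per_event = rng.lognormal(0.0, 0.5)
    # aleatory: randomness in the occurrence of future events
    n_events = rng.poisson(rate * 1.0e4, size=n_ale)
    expected_dose[i] = (n_events * dose_per_event).mean()

# each entry is an expected (aleatory-averaged) dose conditional on one
# epistemic realization; the spread across entries is epistemic uncertainty
print(f"epistemic mean of expected dose: {expected_dose.mean():.1f}")
```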

7.
A deep geologic repository for high-level radioactive waste is under development by the US Department of Energy (DOE) at Yucca Mountain (YM), Nevada. As mandated in the Energy Policy Act of 1992, the US Environmental Protection Agency has promulgated public health and safety standards (i.e., 40 CFR Part 197) for the YM repository, and the US Nuclear Regulatory Commission has promulgated licensing standards (i.e., 10 CFR Parts 2, 19, 20, etc.) consistent with 40 CFR Part 197 that the DOE must establish are met in order for the YM repository to be licensed for operation. Important requirements in 40 CFR Part 197 and 10 CFR Parts 2, 19, 20, etc. relate to the determination of expected (i.e., mean) dose to a reasonably maximally exposed individual (RMEI) and the incorporation of uncertainty into this determination. This paper is the second part of a two-part presentation on the determination of expected dose to the RMEI in the context of 40 CFR Part 197 and 10 CFR Parts 2, 19, 20, etc. The first part of this presentation is contained in the preceding paper, “Conceptual Basis for the Definition and Calculation of Expected Dose in Performance Assessments for the Proposed High-Level Radioactive Waste Repository at Yucca Mountain, Nevada”, and describes how general and typically nonquantitative statements in 40 CFR Part 197 and 10 CFR Parts 2, 19, 20, etc. can be given a formal mathematical structure that facilitates both the calculation of expected dose to the RMEI and the appropriate separation in this calculation of aleatory uncertainty (i.e., randomness in the properties of future occurrences such as igneous and seismic events) and epistemic uncertainty (i.e., lack of knowledge about quantities that are poorly known but assumed to have constant values in the calculation of expected dose to the RMEI). The present paper describes and illustrates sampling-based procedures for the estimation of expected dose and the determination of the uncertainty in estimates for expected dose.
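Continuing in the same spirit as the sketch after item 6, the epistemic sample of conditional expected doses produced by a nested calculation can be summarized with empirical quantiles; the stand-in sample below is an assumption, not YM results.

```python
# Sketch: summarizing epistemic uncertainty in an expected-dose estimate.
# The stand-in sample plays the role of the conditional expected doses from
# a nested aleatory/epistemic calculation; all values are invented.
import numpy as np

rng = np.random.default_rng(4)
expected_dose = rng.lognormal(mean=1.0, sigma=0.5, size=100)

q05, q50, q95 = np.quantile(expected_dose, [0.05, 0.50, 0.95])
print(f"median expected dose {q50:.2f}; 90% epistemic interval [{q05:.2f}, {q95:.2f}]")
```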

8.
The Waste Isolation Pilot Plant (WIPP) is under development by the US Department of Energy (DOE) for the geologic disposal of transuranic waste. The construction of complementary cumulative distribution functions (CCDFs) for total radionuclide release from the WIPP to the accessible environment is described. The resultant CCDFs (i) combine releases due to cuttings and cavings, spallings, direct brine release, and long-term transport in flowing groundwater; (ii) fall substantially to the left of the boundary line specified by the US Environmental Protection Agency's (EPA's) standard 40 CFR 191 for the geologic disposal of radioactive waste; and (iii) constitute an important component of the DOE's successful Compliance Certification Application to the EPA for the WIPP. Insights and perspectives gained in the performance assessment (PA) that led to these CCDFs are described, including the importance of: (i) an iterative approach to PA; (ii) uncertainty and sensitivity analysis; (iii) a clear conceptual model for the analysis; (iv) the separation of stochastic (i.e. aleatory) and subjective (i.e. epistemic) uncertainty; (v) quality assurance procedures; (vi) early involvement of peer reviewers, regulators, and stakeholders; (vii) avoidance of conservative assumptions; and (viii) adequate documentation.
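A minimal sketch of assembling an empirical CCDF for total release from sampled futures and checking it against a stepwise boundary; the four release-mode distributions are invented, and the two boundary points only mimic the form of the 40 CFR 191 containment requirements.

```python
# Sketch: empirical CCDF of total normalized release versus a stepwise
# boundary. The release-mode distributions are invented; the boundary
# points mimic the form of the 40 CFR 191 containment requirements.
import numpy as np

rng = np.random.default_rng(5)
n = 10_000
release = (rng.lognormal(-3.0, 1.0, n)     # cuttings and cavings
           + rng.lognormal(-4.0, 1.2, n)   # spallings
           + rng.lognormal(-4.5, 1.5, n)   # direct brine release
           + rng.lognormal(-6.0, 2.0, n))  # long-term groundwater transport

def ccdf(samples, x):
    # P(total release > x), estimated from the sampled futures
    return float((samples > x).mean())

for r, limit in [(1.0, 0.1), (10.0, 0.001)]:
    p = ccdf(release, r)
    verdict = "left of boundary" if p <= limit else "exceeds boundary"
    print(f"P(release > {r:>4}) = {p:.4f} vs limit {limit}: {verdict}")
```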

9.
Uncertainty and sensitivity analysis results obtained with random and Latin hypercube sampling are compared. The comparison uses results from a model for two-phase fluid flow obtained with three independent random samples of size 100 each and three independent Latin hypercube samples (LHSs) of size 100 each. Uncertainty and sensitivity analysis results with the two sampling procedures are similar and stable across the three replicated samples. Poor performance of regression-based sensitivity analysis procedures for some analysis outcomes results more from the inappropriateness of the procedure for the nonlinear relationships between model input and model results than from an inadequate sample size. Kendall's coefficient of concordance (KCC) and the top-down coefficient of concordance (TDCC) are used to assess the stability of sensitivity analysis results across replicated samples, with the TDCC providing a more informative measure of analysis stability than KCC. A new sensitivity analysis procedure based on replicated samples and the TDCC is introduced.
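A minimal sketch of the TDCC computed from Savage scores over replicated variable rankings, following the Iman-Conover construction; the three replicate rankings of five variables are assumed for illustration.

```python
# Sketch: top-down coefficient of concordance (TDCC) across replicated
# sensitivity analyses, using Savage scores so that agreement on the
# top-ranked variables is weighted most heavily. Rankings are invented.
import numpy as np

def savage_scores(k):
    # Savage score for rank i (1 = most important): sum_{j=i}^{k} 1/j
    inv = 1.0 / np.arange(1, k + 1)
    return np.cumsum(inv[::-1])[::-1]

# rankings[r, v] = rank of variable v in replicate r (1 = most important)
rankings = np.array([[1, 2, 3, 4, 5],
                     [1, 3, 2, 4, 5],
                     [2, 1, 3, 5, 4]])
b, k = rankings.shape

scores = savage_scores(k)[rankings - 1]  # Savage score per replicate/variable
col = scores.sum(axis=0)                 # per-variable score totals over replicates
tdcc = (np.sum(col**2) - b**2 * k) / (b**2 * (k - np.sum(1.0 / np.arange(1, k + 1))))
print(f"TDCC = {tdcc:.3f}  (1 = perfect agreement on the most important variables)")
```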
