Similar Documents
20 similar records found
1.
In 2001, the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy (DOE) in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories) initiated development of a process designated quantification of margins and uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, “Quantification of Margins and Uncertainties: Conceptual and Computational Basis,” describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples. The basic ideas and challenges that underlie NNSA's mandate for QMU are present, and have been successfully addressed, in a number of past analyses for complex systems. To provide perspective on the implementation of a requirement for QMU in the analysis of a complex system, three past analyses are presented as examples: (i) the probabilistic risk assessment carried out for the Surry Nuclear Power Station as part of the U.S. Nuclear Regulatory Commission's (NRC's) reassessment of the risk from commercial nuclear power in the United States (i.e., the NUREG-1150 study), (ii) the performance assessment for the Waste Isolation Pilot Plant carried out by the DOE in support of a successful compliance certification application to the U.S. Environmental Protection Agency, and (iii) the performance assessment for the proposed high-level radioactive waste repository at Yucca Mountain, Nevada, carried out by the DOE in support of a license application to the NRC. Each of the preceding analyses involved a detailed treatment of uncertainty and produced results used to establish compliance with specific numerical requirements on the performance of the system under study. As a result, these studies illustrate the determination of both margins and the uncertainty in margins in real analyses.

2.
In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, “Quantification of Margins and Uncertainties: Conceptual and Computational Basis,” describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.
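As a minimal illustration of the evidence-theory representation of epistemic uncertainty mentioned in the abstract above (a sketch, not the paper's own examples), the snippet below computes belief and plausibility for an interval event from interval-valued focal elements; the evidence values are invented.

```python
def belief_plausibility(focal_elements, event):
    """Belief/plausibility of the event [a, b] under interval-valued evidence.

    focal_elements : list of ((lo, hi), mass) pairs, masses summing to 1.
    event          : (a, b) interval of interest.
    """
    a, b = event
    # Belief: total mass of focal elements entirely contained in the event.
    bel = sum(m for (lo, hi), m in focal_elements if a <= lo and hi <= b)
    # Plausibility: total mass of focal elements that intersect the event.
    pl = sum(m for (lo, hi), m in focal_elements if lo <= b and hi >= a)
    return bel, pl

# Hypothetical expert evidence about a margin-controlling parameter.
evidence = [((0.0, 2.0), 0.5), ((1.0, 3.0), 0.3), ((2.5, 4.0), 0.2)]
bel, pl = belief_plausibility(evidence, event=(0.0, 2.0))
print(bel, pl)  # belief <= plausibility brackets the epistemic uncertainty
```

The gap between belief and plausibility, here 0.5 versus 0.8, is exactly the kind of non-probabilistic uncertainty statement the abstract contrasts with a single probability.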

3.
In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. This presentation discusses and illustrates the conceptual and computational basis of QMU in analyses that use computational models to predict the behavior of complex systems. The following topics are considered: (i) the role of aleatory and epistemic uncertainty in QMU, (ii) the representation of uncertainty with probability, (iii) the probabilistic representation of uncertainty in QMU analyses involving only epistemic uncertainty, and (iv) the probabilistic representation of uncertainty in QMU analyses involving aleatory and epistemic uncertainty.
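The computational core of QMU reduces to a margin M (distance from the best-estimate performance to a requirement) and an uncertainty U (spread of the prediction), compared via their ratio. The sketch below is a minimal illustration of that idea, not the NNSA procedure; the performance quantity, requirement value, and sample data are all invented.

```python
import statistics

def qmu_confidence_ratio(samples, requirement):
    """Compute a simple margin-to-uncertainty ratio M/U.

    samples     : model-run samples of the predicted performance quantity
                  (here, larger values are worse).
    requirement : the threshold the system must stay below.
    """
    best_estimate = statistics.mean(samples)
    margin = requirement - best_estimate      # M: distance to the threshold
    uncertainty = statistics.stdev(samples)   # U: spread of the prediction
    return margin / uncertainty

# Hypothetical peak-temperature predictions from repeated model runs.
runs = [412.0, 418.5, 407.2, 415.1, 410.9, 413.6]
ratio = qmu_confidence_ratio(runs, requirement=450.0)
print(ratio > 1.0)  # M/U > 1 is the usual minimal confidence criterion
```

The separation of aleatory and epistemic uncertainty in topics (i) and (iv) refines this by computing such a ratio conditional on each epistemic realization, yielding a distribution of M/U rather than a single number.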

4.
The current challenge of nuclear weapon stockpile certification is to assess the reliability of complex, high-consequence, and aging systems without the benefit of full-system test data. In the absence of full-system testing, disparate kinds of information are used to inform certification assessments, such as archival data, experimental data on partial systems, data on related or similar systems, computer models and simulations, and expert knowledge. In some instances, data can be scarce and information incomplete. The challenge of Quantification of Margins and Uncertainties (QMU) is to develop a methodology to support decision-making in this informational context. Given the difficulty presented by mixed and incomplete information, we contend that the uncertainty representation for the QMU methodology should be expanded to include more general characterizations that reflect imperfect information. One type of generalized uncertainty representation, known as probability bounds analysis, constitutes the union of probability theory and interval analysis, where a class of distributions is defined by two bounding distributions. This has the advantage of rigorously bounding the uncertainty when inputs are imperfectly known. We argue for the inclusion of probability bounds analysis as one of many tools that are relevant for QMU and demonstrate its usefulness as compared to other methods in a reliability example with imperfect input information.
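A probability box can be sketched very simply: when only an interval on a distribution parameter is known, the two extreme parameter values generate bounding distributions, and any probability of interest is reported as an interval. The example below assumes a normal family with an interval-valued mean; the numbers are invented for illustration.

```python
from statistics import NormalDist

def pbox_failure_bounds(mu_lo, mu_hi, sigma, threshold):
    """Bounds on P(X > threshold) when only an interval on the mean is known.

    The class of distributions is {Normal(mu, sigma) : mu in [mu_lo, mu_hi]};
    the extreme means give rigorous lower/upper failure probabilities.
    """
    p_fail_lo = 1.0 - NormalDist(mu_lo, sigma).cdf(threshold)
    p_fail_hi = 1.0 - NormalDist(mu_hi, sigma).cdf(threshold)
    return p_fail_lo, p_fail_hi

lo, hi = pbox_failure_bounds(mu_lo=95.0, mu_hi=105.0, sigma=5.0, threshold=115.0)
print(lo, hi)  # the true failure probability lies in [lo, hi]
```

The wide gap between the bounds (here roughly three orders of magnitude) is not a defect: it is an honest statement of what the imperfect input information actually supports, which is the paper's argument for the method.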

5.
Performance assessment of complex systems is ideally done through full system-level testing, which is seldom available for high-consequence systems. Further, a reality of engineering practice is that some features of system behavior are known not from experimental data but only from expert assessment. On the other hand, data on the individual components that make up the full system are more readily available. The lack of system-level data and the complexity of the system lead to a need to build computational models of a system in a hierarchical or building-block approach (from simple components to the full system). The models are then used for performance prediction in lieu of experiments, to estimate the confidence in the performance of these systems. Central to this are the need to quantify the uncertainties present in the system and the need to compare the system response to an expected performance measure. This is the basic idea behind Quantification of Margins and Uncertainties (QMU). QMU is applied in decision making in the presence of many uncertainties, caused both by inherent variability (aleatory) in materials, configurations, environments, etc., and by lack of information (epistemic) in the models for the deterministic and random variables that influence system behavior and performance. This paper proposes a methodology to quantify margins and uncertainty in the presence of both aleatory and epistemic uncertainty. It presents a framework based on Bayes networks to use available data at multiple levels of complexity (i.e., components, subsystems, etc.) and demonstrates a method to incorporate epistemic uncertainty given in terms of intervals on a model parameter.
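As a toy version of the building-block idea, and not the Bayes-network framework of the paper itself, the sketch below propagates component-level test data to a series-system reliability estimate using Beta posteriors and Monte Carlo sampling; the component test counts are invented.

```python
import random

def series_system_reliability(component_tests, n_draws=5000, seed=1):
    """Propagate component test data (successes, trials) to system reliability.

    Each component reliability gets a Beta(1 + s, 1 + f) posterior (uniform
    prior); a series system multiplies component reliabilities per draw.
    Returns the posterior mean and a 5th-percentile epistemic lower bound.
    """
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        r = 1.0
        for s, n in component_tests:
            r *= rng.betavariate(1 + s, 1 + (n - s))
        draws.append(r)
    draws.sort()
    mean = sum(draws) / n_draws
    lo = draws[int(0.05 * n_draws)]
    return mean, lo

# Hypothetical data: two components, no full-system tests available.
mean, lo = series_system_reliability([(48, 50), (29, 30)])
print(mean, lo)
```

The gap between the posterior mean and the lower percentile plays the role of the epistemic uncertainty band that, per the abstract, must be compared against the required performance measure.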

6.
Quantification of margins and uncertainties (QMU) was originally introduced as a framework for assessing confidence in nuclear weapons, and has since been extended to more general complex systems. We show that when uncertainties are strictly bounded, QMU is equivalent to a graphical model, provided confidence is identified with reliability one. In the more realistic case that uncertainties have long tails, we find that QMU confidence is not always a good proxy for reliability, as computed from the graphical model. We explore the possibility of defining QMU in terms of the graphical model, rather than through the original procedures. The new formalism, which we call probabilistic QMU, or pQMU, is fully probabilistic and mathematically consistent, and shows how QMU may be interpreted within the framework of system reliability theory.

7.
8.
Influenza pandemics present a global threat owing to their potential mortality and substantial economic impacts. Stockpiling antiviral drugs to manage a pandemic is an effective strategy to offset these negative impacts; however, little is known about the long-term optimal size of the stockpile under uncertainty and under the differing characteristics of countries. Using an epidemic–economic model we studied the effect of antiviral stockpile size on total mortality and costs for Brazil, China, Guatemala, India, Indonesia, New Zealand, Singapore, the UK, the USA and Zimbabwe. In the model, antiviral stockpiling considerably reduced mortality. There was greater potential avoidance of expected costs in the higher-resourced countries (e.g. from $55 billion to $27 billion over a 30 year time horizon for the USA) and large avoidance of fatalities in those less resourced (e.g. from 11.4 to 2.3 million in Indonesia). Under perfect allocation, higher-resourced countries should aim to store antiviral stockpiles able to cover at least 15 per cent of their population, rising to 25 per cent with 30 per cent misallocation, to minimize fatalities and economic costs. Stockpiling is estimated not to be cost-effective for two-thirds of the world's population under current antiviral pricing. Lower prices and international cooperation are necessary to make the life-saving potential of antivirals cost-effective in resource-limited countries.

9.
Uncertainty, probability and information-gaps
This paper discusses two main ideas. First, we focus on info-gap uncertainty, as distinct from probability. Info-gap theory is especially suited for modelling and managing uncertainty in system models: we invest all our knowledge in formulating the best possible model; this leaves the modeller with very faulty and fragmentary information about the variation of reality around that optimal model. Second, we examine the interdependence between uncertainty modelling and decision-making. Good uncertainty modelling requires contact with the end-use, namely, with the decision-making application of the uncertainty model. The most important avenue of uncertainty propagation is from initial data- and model-uncertainties into uncertainty in the decision domain. Two questions arise. Is the decision robust to the initial uncertainties? Is the decision prone to opportune windfall success? We apply info-gap robustness and opportunity functions to the analysis of representation and propagation of uncertainty in several of the Sandia Challenge Problems.
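The info-gap robustness question ("is the decision robust to the initial uncertainties?") can be caricatured as: how large an uncertainty horizon around the nominal model can we tolerate before the performance requirement fails? The sketch below assumes a monotone one-parameter model, so the worst case lies at the interval endpoints; the model and all numbers are invented.

```python
def robustness(nominal, requirement, performance, alpha_grid):
    """Info-gap robustness: largest horizon alpha such that the requirement
    holds for every parameter value within alpha of the nominal value.

    performance(u) is the system model; requirement is a lower bound on it.
    For a monotone model (assumed here) the worst case over the interval
    [nominal - alpha, nominal + alpha] occurs at an endpoint.
    """
    best = 0.0
    for alpha in alpha_grid:
        worst = min(performance(nominal - alpha), performance(nominal + alpha))
        if worst >= requirement:
            best = alpha
        else:
            break
    return best

# Hypothetical monotone model: output degrades as the uncertain input u falls.
perf = lambda u: 10.0 - 2.0 * max(0.0, 5.0 - u)
alphas = [0.1 * k for k in range(1, 31)]
print(robustness(nominal=6.0, requirement=8.0, performance=perf, alpha_grid=alphas))
```

A robustness of zero would flag a fragile decision; the opportunity function mentioned in the abstract is the dual question, asking how little the model must deviate for windfall success.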

10.
In this paper, we present an application of sensitivity analysis for design verification of nuclear turbosets. Before the acquisition of a turbogenerator, energy power operators perform an independent design assessment in order to assure safe operating conditions of the new machine in its environment. Variables of interest are related to the vibration behaviour of the machine: its eigenfrequencies and its dynamic sensitivity to unbalance. In the framework of design verification, epistemic uncertainties are preponderant. This lack of knowledge is due to nonexistent or imprecise information about the design as well as to the interaction of the rotating machinery with supporting structures and sub-structures. Sensitivity analysis enables the analyst to rank sources of uncertainty with respect to their importance and, possibly, to screen out insignificant sources of uncertainty. Further studies, if necessary, can then focus on the predominant parameters. In particular, the constructor can be asked for detailed information only about the most significant parameters.
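A minimal sketch of the ranking-and-screening workflow is shown below, using squared sample correlation between each input and the output as a crude first-order importance proxy (not the specific method of the paper). The vibration surrogate model and parameter names are invented.

```python
import random

def rank_by_importance(model, ranges, n=2000, seed=0):
    """Rank uncertain inputs by squared correlation with the model output.

    Inputs are sampled independently and uniformly over their ranges; for a
    near-linear model, corr^2 approximates the first-order variance share.
    """
    rng = random.Random(seed)
    names = list(ranges)
    X = {k: [rng.uniform(*ranges[k]) for _ in range(n)] for k in names}
    y = [model({k: X[k][i] for k in names}) for i in range(n)]

    def corr2(xs):
        mx, my = sum(xs) / n, sum(y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(xs, y))
        sxx = sum((a - mx) ** 2 for a in xs)
        syy = sum((b - my) ** 2 for b in y)
        return (sxy * sxy) / (sxx * syy)

    return sorted(names, key=lambda k: corr2(X[k]), reverse=True)

# Hypothetical vibration surrogate: eigenfrequency dominated by one stiffness.
model = lambda p: 3.0 * p["bearing_stiffness"] + 0.5 * p["foundation_mass"] + 0.1 * p["damping"]
ranges = {"bearing_stiffness": (0, 1), "foundation_mass": (0, 1), "damping": (0, 1)}
print(rank_by_importance(model, ranges))  # most influential parameter first
```

Parameters at the bottom of such a ranking are the ones the abstract suggests screening out, so that detailed design information need only be requested for the top of the list.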

11.
This paper discusses the characteristics and limitations of traditional human reliability analysis methods and introduces a newer method, ATHEANA: its basic ideas and concepts, underlying model, analysis framework, implementation steps, and distinguishing features. Finally, an example application of the ATHEANA method at a nuclear power plant is given.

12.
It has been shown (McBride, 2006) that under a scale model stockpile a central pressure dip, or 'M' pressure, is formed during both the filling and refilling processes of a stockpile containing a central reclaim channel. This article presents the results from further experiments with a noncentral reclaim channel and clearly illustrates that a central pressure dip exists under a conical stockpile with an offset reclaim channel. Additional data were recorded regarding lateral load transfer within the stockpile on initiation of flow, although more experimental work is required to fully understand this aspect. A method for predicting conical stockpile base pressures is presented along with an overpressure theory to account for the vertical load instability induced by the onset of material discharge.

13.
The operability limits of a supersonic combustion engine for an air-breathing hypersonic vehicle are characterized using numerical simulations and an uncertainty quantification methodology. The time-dependent compressible flow equations with heat release are solved in a simplified configuration. Verification, calibration and validation are carried out to assess the ability of the model to reproduce the flow/thermal interactions that occur when the engine unstarts due to thermal choking. Quantification of margins and uncertainty (QMU) is used to determine the safe operating region for a range of fuel flow rates and combustor geometries.

14.
Research status of the water-vapor corrosion behavior of uranium metal
秦建伟, 罗丽珠, 帅茂兵. Materials Review (《材料导报》), 2017, 31(13): 17-24, 32
Uranium metal is an important nuclear material widely used in the defense industry and in energy systems. Because of its high chemical reactivity, it corrodes readily when trace water vapor is present in the storage environment, which degrades its service performance. To better understand the corrosion and aging of uranium metal in water-containing environments, researchers have carried out extensive studies. This review summarizes the main domestic and international findings on the reaction of uranium metal with water, covering corrosion products, corrosion kinetics, and corrosion mechanisms, and outlines future research directions concerning the defect structure of uranium oxide, the influence of the uranium microstructure on the reaction process, and the mechanism by which O2 inhibits oxidation.

15.
J.M. Williams. Cryogenics (《低温学》), 1975, 15(6): 307-322
The basic ideas and difficulties associated with nuclear gamma resonance absorption are discussed, culminating in Mössbauer's great discovery of recoilless gamma emission and absorption. The extremely high inherent resolution of the Mössbauer γ-rays provides a simple and powerful method of detecting extremely small changes in energy. In this review the applications of Mössbauer spectroscopy to problems in magnetism, superconductivity, the Kondo effect, nuclear polarization and relativity are discussed, with particular regard to the use of very low temperatures and the information thus obtained. Basic Mössbauer systems and cryostats which enable temperatures down to ~30 mK to be maintained for long periods are described. An interesting application discussed is the use of the Mössbauer effect itself as an absolute thermometer incorporating its own internal calibration.

16.
We describe the use of Bayesian inference to include prior information about the value of the measurand in the calculation of measurement uncertainty. Typical examples show this can, in effect, reduce the expanded uncertainty by up to 85 %. The application of the Bayesian approach to proving workpiece conformance to specification (as given by international standard ISO 14253-1) is presented and a procedure for increasing the conformance zone by modifying the expanded uncertainty guard bands is discussed.
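The mechanism behind the reduction described above can be sketched with the conjugate normal-normal update: a prior on the measurand combines with a measurement of known standard uncertainty, and the posterior standard deviation (hence the expanded uncertainty U = k·u) shrinks. The numbers below are invented and do not reproduce the paper's 85 % figure.

```python
import math

def bayes_normal_update(prior_mean, prior_sd, meas_value, meas_sd):
    """Conjugate update of a normal prior on the measurand with one normal
    measurement; precisions (inverse variances) add, so the posterior
    standard deviation is smaller than either input alone."""
    w_prior = 1.0 / prior_sd ** 2
    w_meas = 1.0 / meas_sd ** 2
    post_var = 1.0 / (w_prior + w_meas)
    post_mean = post_var * (w_prior * prior_mean + w_meas * meas_value)
    return post_mean, math.sqrt(post_var)

# Hypothetical gauge-block length: tolerance-based prior vs. one measurement.
mean, sd = bayes_normal_update(prior_mean=10.000, prior_sd=0.002,
                               meas_value=10.003, meas_sd=0.004)
k = 2.0  # coverage factor for the expanded uncertainty U = k * u
print(mean, k * sd, k * 0.004)  # expanded uncertainty vs. measurement alone
```

Here the expanded uncertainty falls from 0.008 (measurement alone) to about 0.0036, illustrating how prior knowledge widens the conformance zone in the guard-band procedure the abstract refers to.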

17.
Stockpiles are commonly used for the storage of bulk solids in many industrial sectors. One interesting phenomenon is that there is a significant dip in the base pressure beneath the apex of the pile, which may have significant implications for the design of stockpile facilities and related support structures. This paper presents a numerical and experimental study of this phenomenon. Experiments were conducted to measure the base pressure distribution under a stockpile formed with iron ore pellets, and a significant central stress minimum was revealed. Continuum analysis using the finite element method (FEM) was conducted to simulate the stress distribution in the test pile. It showed the critical importance of progressive pile development and nonlinear constitutive models. To investigate the underlying mechanisms further, simulations using the discrete element method (DEM) were conducted, which agreed well with the FEM predictions and revealed key aspects of the inter-particle force patterns. Both the FEM and DEM predictions show a dip in the base pressure distribution, in agreement with the test results, and reveal new key features of the mechanics of such piles.

18.
Building a three-dimensional (3-D) computer-based representation of a large or complex scene from a set of range images requires assembling the range images so that they are spatially registered. This paper presents a technique for finding a homogeneous transformation between a pair of range images while making use of both range and intensity information obtained using a laser range scanner. A demonstrated application featuring this technique consists of building a 3-D representation of a nuclear dumpsite, which then serves to help control a teleoperated nuclear waste remediation system.
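Registration with a homogeneous transformation means mapping every point of one scan through a single 4x4 matrix combining rotation and translation. The sketch below, with an invented pose (a yaw rotation plus a translation), shows the mechanics; it is not the paper's estimation technique, only the representation it solves for.

```python
import math

def homogeneous_transform(yaw, tx, ty, tz):
    """4x4 homogeneous transform: rotation about z by `yaw`, then translation."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, -s, 0.0, tx],
            [s,  c, 0.0, ty],
            [0.0, 0.0, 1.0, tz],
            [0.0, 0.0, 0.0, 1.0]]

def apply(T, p):
    """Map a 3-D point through T using homogeneous coordinates (w = 1)."""
    x, y, z = p
    v = (x, y, z, 1.0)
    return tuple(sum(T[i][j] * v[j] for j in range(4)) for i in range(3))

# Register a point from scan B into scan A's frame (hypothetical pose).
T = homogeneous_transform(yaw=math.pi / 2, tx=1.0, ty=0.0, tz=0.5)
print(apply(T, (1.0, 0.0, 0.0)))  # rotated onto the y-axis, then shifted
```

Once such a T is estimated for each pair of overlapping scans, all range images can be chained into one common scene frame.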

19.
The United Nations High Commissioner for Refugees (UNHCR) establishes and maintains refugee camps to meet the needs of 34.5 million people affected by disaster or war worldwide. Like other international humanitarian organizations, UNHCR maintains central stockpiles which supply these refugee operations. Management at UNHCR seeks to improve the timeliness and quality of its disaster response subject to its budget constraints. We develop an inventory model to analyze the interaction between a stockpile and a downstream refugee camp or relief operation. We consider two inventory decisions: first, how to partition a fixed budget between stockpiling and shipping costs in order to best meet the needs of beneficiaries; and second, given the shipping budget determined by the budget partition, how to ship relief items from the stockpile to a downstream relief operation in an efficient manner. We solve for the shipment policy using dynamic programming, then determine the optimal stockpile size given knowledge of the optimal shipment policy. The optimization balances a key tradeoff: a larger stockpile is costly to procure and maintain, but enhances a humanitarian organization’s ability to respond to relief operation demands. We provide insights into shipment strategies and stockpile size. We also develop a spreadsheet model to help humanitarian organizations in their operational decision-making, leading to improved response to beneficiaries. Humanitarian organizations must use their financial resources wisely to carry out their mandates, and models of this type can help them make the best use of their limited response resources.
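The budget-partition tradeoff described above can be caricatured with a one-period grid search: items help beneficiaries only if the budget both procured them for the stockpile and can still pay to ship them. This is a much simpler model than the paper's dynamic program; the unit costs and demand scenarios are invented, not UNHCR data.

```python
def best_partition(budget, unit_stock_cost, unit_ship_cost, demand_scenarios):
    """Grid-search the budget split between stockpile procurement and shipping.

    Deliverable quantity is limited by both the stockpile size and what the
    leftover budget can ship; expected shortfall is evaluated over equally
    likely demand scenarios.
    """
    best = None
    for stock in range(int(budget // unit_stock_cost) + 1):
        ship_budget = budget - stock * unit_stock_cost
        shippable = int(ship_budget // unit_ship_cost)
        deliverable = min(stock, shippable)
        shortfall = sum(max(0, d - deliverable) for d in demand_scenarios)
        shortfall /= len(demand_scenarios)
        if best is None or shortfall < best[1]:
            best = (stock, shortfall)
    return best  # (stockpile size, expected unmet demand)

# Hypothetical relief-kit problem: budget 100, kits cost 2 to stock, 3 to ship.
stock, shortfall = best_partition(100, 2, 3, demand_scenarios=[10, 20, 30])
print(stock, shortfall)
```

The optimum sits where neither constraint dominates: stocking more kits than the remaining budget can ship wastes procurement money, which is exactly the tradeoff the paper's model balances dynamically.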

20.
As the responsibilities and needs of land administration departments continue to evolve and deepen, applying GIS software to land management has increasingly demonstrated its value. Building on an explanation of the basic concepts of GIS and land management information systems and the relationship between the two, this paper takes the Pudong New Area land management information system developed by 上海市丝绸之路有限公司 (Shanghai Silk Road Co., Ltd.) as an example and describes in detail the system's design philosophy, functional modules, and key technologies, thereby giving a comprehensive, detailed, multi-faceted demonstration of the strong advantages and notable results of GIS in land management.
