Similar Documents
20 similar documents found.
1.
The challenge problems for the Epistemic Uncertainty Workshop at Sandia National Laboratories provide common ground for comparing different mathematical theories of uncertainty, referred to as General Information Theories (GITs). These problems also present the opportunity to discuss the use of expert knowledge as an important constituent of uncertainty quantification. More specifically, how do the principles and methods of eliciting and analyzing expert knowledge apply to these problems and similar ones encountered in complex technical problem solving and decision making? We will address this question, demonstrating how the elicitation issues and the knowledge that experts provide can be used to assess the uncertainty in outputs that emerge from a black box model or computational code represented by the challenge problems. In our experience, the rich collection of GITs provides an opportunity to capture the experts' knowledge and associated uncertainties consistent with their thinking, problem solving, and problem representation. The elicitation process is rightly treated as part of an overall analytical approach, and the information elicited is not simply a source of data. In this paper, we detail how the elicitation process itself impacts the analyst's ability to represent, aggregate, and propagate uncertainty, as well as how to interpret uncertainties in outputs. While this approach does not advocate a specific GIT, answers under uncertainty do result from the elicitation.  相似文献   

2.
The problem of ranking and weighting experts' performances when quantitative judgments are being elicited for decision support is considered. A new scoring model, the Expected Relative Frequency model, is presented, based on the closeness between central values provided by the expert and known values used for calibration. Using responses from experts in five different elicitation datasets, a cross-validation technique is used to compare this new approach with the Cooke Classical Model, the Equal Weights model, and individual experts. The analysis is performed using alternative reward schemes designed to capture proficiency either in quantifying uncertainty, or in estimating true central values. Results show that although there is only a limited probability that one approach is consistently better than another, the Cooke Classical Model is generally the most suitable for assessing uncertainties, whereas the new ERF model should be preferred if the goal is central value estimation accuracy.  相似文献   
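A minimal sketch of the kind of calibration-based weighting such scoring models rely on: each expert is scored by how close their central estimates fall to known seed values, and the scores are normalized into aggregation weights. The distance-based score, expert names and seed values below are illustrative assumptions, not the actual ERF or Classical Model formulas.

```python
import numpy as np

# Hypothetical seed (calibration) questions with known true values.
true_values = np.array([12.0, 0.35, 410.0, 7.2])

# Central (median) estimates from three hypothetical experts.
estimates = {
    "expert_A": np.array([11.0, 0.30, 450.0, 6.9]),
    "expert_B": np.array([20.0, 0.10, 200.0, 9.5]),
    "expert_C": np.array([12.5, 0.40, 395.0, 7.0]),
}

def closeness_score(est, truth):
    # Simple relative-error score in [0, 1]; 1 means a perfect match.
    # Illustrative stand-in for a closeness-based scoring rule.
    rel_err = np.abs(est - truth) / np.abs(truth)
    return float(np.mean(np.clip(1.0 - rel_err, 0.0, 1.0)))

scores = {name: closeness_score(est, true_values) for name, est in estimates.items()}
total = sum(scores.values())
weights = {name: s / total for name, s in scores.items()}

# Weighted aggregation of the experts' answers to a target question.
target_estimates = {"expert_A": 3.1, "expert_B": 5.0, "expert_C": 2.8}
combined = sum(weights[n] * target_estimates[n] for n in weights)
print(weights, combined)
```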

3.
Quantifying uncertainty during risk analysis has become an important part of effective decision-making and health risk assessment. However, most risk assessment studies struggle with uncertainty analysis and yet uncertainty with respect to model parameter values is of primary importance. Capturing uncertainty in risk assessment is vital in order to perform a sound risk analysis. In this paper, an approach to uncertainty analysis based on the fuzzy set theory and the Monte Carlo simulation is proposed. The question then arises as to how these two modes of representation of uncertainty can be combined for the purpose of estimating risk. The proposed method is applied to a propylene oxide polymerisation reactor. It takes into account both stochastic and epistemic uncertainties in the risk calculation. This study explores areas where random and fuzzy logic models may be applied to improve risk assessment in industrial plants with a dynamic system (change over time). It discusses the methodology and the process involved when using random and fuzzy logic systems for risk management.  相似文献   
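A minimal sketch of one common way the two representations are combined: an alpha-cut loop over a fuzzy (epistemic) parameter wrapped around Monte Carlo sampling of a stochastic (aleatory) parameter, producing an interval of the risk percentile at each membership level. The toy risk function, distributions and triangular membership below are assumptions, not the paper's reactor model.

```python
import numpy as np

rng = np.random.default_rng(0)

def risk_model(failure_rate, release_fraction):
    # Toy consequence model: expected release per year (illustrative only).
    return failure_rate * release_fraction * 1.0e3

# Epistemic parameter as a triangular fuzzy number (low, mode, high).
tri = (0.05, 0.10, 0.20)   # release fraction

def alpha_cut(tri, alpha):
    low, mode, high = tri
    return low + alpha * (mode - low), high - alpha * (high - mode)

results = {}
for alpha in (0.0, 0.5, 1.0):
    lo, hi = alpha_cut(tri, alpha)
    bounds = []
    for release_fraction in (lo, hi):
        # Aleatory parameter: lognormal failure rate, sampled by Monte Carlo.
        failure_rate = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=10_000)
        risk = risk_model(failure_rate, release_fraction)
        bounds.append(np.percentile(risk, 95))   # 95th percentile of the aleatory risk
    results[alpha] = (min(bounds), max(bounds))  # interval of that percentile at this alpha level

print(results)   # fuzzy interval of the risk percentile, one interval per alpha-cut
```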

4.
Expert elicitation approach for performing ATHEANA quantification
An expert elicitation approach has been developed to estimate probabilities for unsafe human actions (UAs) based on error-forcing contexts (EFCs) identified through the ATHEANA (A Technique for Human Event Analysis) search process. The expert elicitation approach integrates the knowledge of informed analysts to quantify UAs and treat uncertainty (‘quantification-including-uncertainty’). The analysis focuses on (a) the probabilistic risk assessment (PRA) sequence EFCs for which the UAs are being assessed, (b) the knowledge and experience of analysts (who should include trainers, operations staff, and PRA/human reliability analysis experts), and (c) facilitated translation of information into probabilities useful for PRA purposes. Rather than simply asking the analysts their opinion about failure probabilities, the approach emphasizes asking the analysts what experience and information they have that is relevant to the probability of failure. The facilitator then leads the group in combining the different kinds of information into a consensus probability distribution. This paper describes the expert elicitation process, presents its technical basis, and discusses the controls that are exercised to use it appropriately. The paper also points out the strengths and weaknesses of the approach and how it can be improved. Specifically, it describes how generalized contextually anchored probabilities (GCAPs) can be developed to serve as reference points for estimates of the likelihood of UAs and their distributions.  相似文献   
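One simple way such a facilitated consensus step is often represented numerically is as a linear opinion pool of the individual analysts' judged distributions. The lognormal (median, error factor) judgments and equal weights below are illustrative assumptions, not ATHEANA data or the paper's facilitation procedure.

```python
import numpy as np
from scipy import stats

# Each analyst's judged distribution for the unsafe-action probability,
# expressed as a lognormal (median, error factor) -- illustrative values.
judgments = [
    {"median": 3e-3, "error_factor": 5.0, "weight": 1.0},
    {"median": 1e-3, "error_factor": 10.0, "weight": 1.0},
    {"median": 5e-3, "error_factor": 3.0, "weight": 1.0},
]

def lognormal_from_median_ef(median, ef):
    # Error factor = 95th percentile / median  =>  sigma = ln(EF) / 1.645
    sigma = np.log(ef) / 1.645
    return stats.lognorm(s=sigma, scale=median)

rng = np.random.default_rng(1)
weights = np.array([j["weight"] for j in judgments], dtype=float)
weights /= weights.sum()

# Linear opinion pool: sample each analyst's distribution in proportion to its weight.
n = 90_000
samples = np.concatenate([
    lognormal_from_median_ef(j["median"], j["error_factor"]).rvs(size=int(w * n), random_state=rng)
    for j, w in zip(judgments, weights)
])

print(np.percentile(samples, [5, 50, 95]))   # pooled quantiles usable in a PRA
```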

5.
Expert judgments are frequently used in probabilistic safety assessments (PSA). However, the methods employed in practice are very crude, and a large gap exists between the theoretical methods available and actual practice. A taxonomy of issues related to the use of expert judgments in PSA was considered necessary to identify the needs of practitioners and the applicability of existing models. In this paper, such a taxonomy is systematically reviewed with examples from case studies. Issues surrounding the expert judgment process can be classified into two categories—(a) elicitation, and (b) the use of expert judgments. Various elements of these categories, such as model and parameter uncertainty, decomposition, the use of multiple experts, the selection of experts, expert training, elicitation, the effect of information provided to experts, expert calibration, availability of evidence, opinion aggregation and dependence, are then discussed. The issues of expert bias, calibration and dependence are of special concern. Sources of expert bias and dependence are discussed, with some thoughts on overcoming them, using examples from selected case studies.

6.
Applied avalanche models are based on parameters which cannot be measured directly. As a consequence, these models are associated with large uncertainties, which must be addressed in risk assessment. To this end, we present an integral probabilistic framework for the modelling of avalanche hazards. The framework is based on a deterministic dynamic avalanche model, which is combined with an explicit representation of the different parameter uncertainties. The probability distribution of these uncertainties is then determined from observations of avalanches in the area under investigation through Bayesian inference. This framework facilitates the consistent combination of physical and empirical avalanche models with the available observations and expert knowledge. The resulting probabilistic spatial model can serve as a basis for hazard mapping and spatial risk assessment. In this paper, the new model is applied to a case study in a test area located in the Swiss Alps.
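A minimal sketch of the Bayesian updating idea, assuming a toy one-parameter runout model and a simple grid-based posterior; the model form, prior range, observation error and runout data are illustrative assumptions, not the dynamic avalanche model or the Swiss case-study data.

```python
import numpy as np

# Toy deterministic "avalanche" model: runout distance as a function of a
# friction parameter mu (purely illustrative, not an actual dynamic model).
def runout_model(mu):
    return 1800.0 / (1.0 + 10.0 * mu)     # metres

# Observed runout distances in the study area (hypothetical data).
observations = np.array([1150.0, 1230.0, 1080.0, 1190.0])
sigma_obs = 80.0                           # assumed observation/model error (m)

# Prior for mu from expert knowledge (uniform on a plausible range).
mu_grid = np.linspace(0.1, 1.0, 500)
prior = np.ones_like(mu_grid)

# Gaussian likelihood of the observations given the model prediction.
pred = runout_model(mu_grid)[:, None]
log_like = -0.5 * np.sum(((observations - pred) / sigma_obs) ** 2, axis=1)

posterior = prior * np.exp(log_like - log_like.max())
posterior /= np.trapz(posterior, mu_grid)

mean_mu = np.trapz(mu_grid * posterior, mu_grid)
print(mean_mu)   # posterior mean of the friction parameter
```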

7.
Expert judgments are involved in many aspects of scientific research, either formally or informally. In order to combine the different opinions elicited, simple aggregation methods have often been used, with the result that expert biases, inter-expert dependencies and other factors which might affect the judgments of the experts are often ignored. A more comprehensive approach, based on the analytic hierarchy process, is proposed in this paper to account for the large variety of factors influencing the experts. A structured hierarchy is constructed to decompose the overall problem into the elementary factors that can influence the experts' judgments. The importance of the different elements of the hierarchy is then assessed by pairwise comparison. The overall approach is simple, systematic and offers a good degree of flexibility. It provides the decision maker with a tool to quantitatively measure the significance of the judgments provided by the different experts involved in the elicitation. The resulting values can be used as weights in an aggregation scheme such as, for example, the simple weighted averaging scheme. Two applications of the approach are presented with reference to case studies of formal expert judgment elicitation previously analyzed in the literature: the elicitation of the pressure increment in the containment building of the Sequoyah nuclear power plant following reactor vessel breach, and the prediction of future changes in precipitation in the vicinity of Yucca Mountain.
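A minimal sketch of the pairwise-comparison step: the principal eigenvector of a Saaty-style comparison matrix gives the expert weights, which then feed a weighted average. The 3x3 matrix and the elicited estimates are illustrative assumptions, not values from the Sequoyah or Yucca Mountain studies.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three experts on one criterion
# (Saaty's 1-9 scale): entry [i, j] says how much more credible expert i is than expert j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Principal right eigenvector gives the priority (weight) vector.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check (random index RI = 0.58 for a 3x3 matrix).
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
cr = ci / 0.58
print("weights:", w, "consistency ratio:", cr)

# Use the weights in a simple weighted-average aggregation of the experts' estimates.
estimates = np.array([0.15, 0.25, 0.40])   # hypothetical elicited values
print("aggregated estimate:", float(w @ estimates))
```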

8.
This paper proposes a new methodology for incorporating uncertainties into conventional risk assessment frameworks using fuzzy concepts. It also introduces new forms of fuzzy membership curves, designed to cover the uncertainty range that represents the degree of uncertainty involved in both probabilistic parameter estimates and subjective judgments, since it is often difficult or even impossible to estimate precisely the occurrence rate of an event as one single crisp probability. It is to be noted that simple linguistic variables such as ‘High/Low’ and ‘Good/Bad’ have limitations in quantifying the various risks inherent in construction projects and adequately capture only subjective mental cognition. Therefore, in this paper, statements that include some quantification by giving a specific value or scale, such as ‘Close to any value’ or ‘Higher/Lower than analyzed value’, are used in order to overcome these limitations. The proposed methodology is expected to be very useful for the systematic and rational risk assessment of construction projects.
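A minimal sketch of what such quantified linguistic statements can look like as membership functions: a triangular curve for "close to value v" and a ramp for "higher than the analyzed value v". The shapes, spreads and example values are assumptions, not the membership curves defined in the paper.

```python
import numpy as np

def close_to(v, spread):
    """Triangular membership for 'close to value v' with a given spread."""
    def mu(x):
        return np.clip(1.0 - np.abs(np.asarray(x, dtype=float) - v) / spread, 0.0, 1.0)
    return mu

def higher_than(v, transition):
    """Ramp membership for 'higher than the analyzed value v'."""
    def mu(x):
        return np.clip((np.asarray(x, dtype=float) - v) / transition, 0.0, 1.0)
    return mu

close_to_02 = close_to(0.2, spread=0.1)            # "occurrence rate close to 0.2"
higher_than_02 = higher_than(0.2, transition=0.1)  # "rate higher than the analyzed 0.2"

x = np.array([0.10, 0.18, 0.20, 0.25, 0.35])
print(close_to_02(x))     # [0.  0.8 1.  0.5 0. ]
print(higher_than_02(x))  # [0.  0.  0.  0.5 1. ]
```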

9.
A sound methodology for the elicitation of subjective expert judgement is a pre-requisite for specifying prior distributions for the parameters of reliability growth models. In this paper, we describe an elicitation process that is developed to ensure valid data are collected by suggesting how possible bias might be identified and managed. As well as discussing the theory underpinning the elicitation process, the paper gives practical guidance concerning its implementation during reliability growth testing. The collection of subjective data using the proposed elicitation process is embedded within a Bayesian reliability growth modelling framework and reflections upon its practical use are described.  相似文献   

10.
A Bayesian seismic risk assessment method is proposed that jointly accounts for uncertainties in the seismic hazard model, the input ground-motion records, the structural parameters and the demand model, and it is discussed in detail using earthquake data recorded in the Dali region of Yunnan between 1970 and 2017. Building on conventional probabilistic seismic hazard analysis, a Bayesian hazard analysis method is proposed in which the posterior distributions of the unknown parameters of the seismic probability model are determined through Bayesian updating. A probabilistic seismic demand model is also established within the Bayesian framework, and the epistemic uncertainty of the demand model is incorporated into the fragility analysis. A 42-storey steel frame–RC core tube building is used as a case study for risk assessment under earthquake loading. The results show that the Bayesian hazard analysis yields a more reasonable hazard model; that neglecting parameter uncertainty in the demand model leads to erroneous estimates of structural seismic fragility; and that different loading cases have a significant effect on the seismic risk of tall buildings. The proposed probabilistic risk assessment method provides an effective way to account for both aleatory and epistemic uncertainties, and it supports the development of seismic resilience evaluation and design theory for high-performance structures.
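A minimal sketch of one Bayesian updating step of the kind described: a conjugate gamma–Poisson update of the annual occurrence rate of events above a magnitude threshold. The catalogue counts and the prior are hypothetical placeholders, not the actual Dali 1970–2017 data or the paper's full hazard model.

```python
from scipy import stats

# Hypothetical catalogue: number of M >= 5 events observed in the study region
# over an observation window (illustrative, not the actual Dali data).
n_events = 12
years = 48.0

# Gamma prior on the annual occurrence rate (from regional/expert information).
alpha_prior, beta_prior = 2.0, 10.0        # prior mean 0.2 events/yr

# Conjugate Bayesian update for a Poisson occurrence model.
alpha_post = alpha_prior + n_events
beta_post = beta_prior + years
rate_posterior = stats.gamma(a=alpha_post, scale=1.0 / beta_post)

print("posterior mean rate:", rate_posterior.mean())
print("90% credible interval:", rate_posterior.interval(0.9))
```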

11.
Addressing long-term potential human exposures to, and health risks from contaminants in the subsurface environment requires the use of models. Because these models must project contaminant behavior into the future, and make use of highly variable landscape properties, there is uncertainty associated with predictions of long-term exposure. Many parameters used in both subsurface contaminant transport simulation and health risk assessment have variance owing to uncertainty and/or variability. These parameters are best represented by ranges or probability distributions rather than single values. Based on a case study with information from an actual site contaminated with trichloroethylene (TCE), we demonstrate the propagation of variance in the simulation of risk using a complex subsurface contaminant transport simulation model integrated with a multi-pathway human health risk model. Ranges of subsurface contaminant concentrations are calculated with the subsurface transport simulator T2VOC (using the associated code ITOUGH2 for uncertainty analysis) for a three-dimensional system in which TCE migrates in both the vadose and saturated zones over extended distances and time scales. The subsurface TCE concentration distributions are passed to CalTOX, a multimedia, multi-pathway exposure model, which is used to calculate risk through multiple exposure pathways based on inhalation, ingestion and dermal contact. Monte Carlo and linear methods are used for the propagation of uncertainty owing to parameter variance. We demonstrate how rank correlation can be used to evaluate contributions to overall uncertainty from each model system. In this sample TCE case study, we find that although exposure model uncertainties are significant, subsurface transport uncertainties are dominant.  相似文献   
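A minimal sketch of the rank-correlation idea used to attribute overall uncertainty to individual inputs: sample the uncertain parameters, run a (here, toy) risk function, and compute Spearman correlations between each input and the risk output. The three parameters and the one-line risk model are illustrative stand-ins, not T2VOC/ITOUGH2 or CalTOX.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)
n = 5_000

# Toy uncertain inputs standing in for transport and exposure parameters.
source_conc = rng.lognormal(np.log(50.0), 0.8, n)    # mg/L at the source
dilution = rng.uniform(0.001, 0.05, n)               # transport attenuation factor
intake = rng.lognormal(np.log(1.5), 0.3, n)          # L/day drinking-water intake

# Toy risk metric (illustrative, not the multi-pathway exposure model).
slope_factor = 0.011                                  # (mg/kg-day)^-1
dose = source_conc * dilution * intake / 70.0         # mg/kg-day for a 70 kg adult
risk = slope_factor * dose

for name, x in [("source_conc", source_conc), ("dilution", dilution), ("intake", intake)]:
    rho, _ = spearmanr(x, risk)
    print(f"{name:12s} rank correlation with risk: {rho:+.2f}")
```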

12.
This paper presents a procedure for modeling uncertainties in the spectral fatigue analysis of offshore structures with reference to reliability assessment. Uncertainties in the fatigue damage are generally embedded in the response characteristics of the stress process and in the damage model used. Besides the commonly accepted uncertainties in offshore structural analysis, which are associated with the modeling of structures and the random wave environment, there are also uncertainties arising from joint flexibilities that occur during the response, from the wave–current interaction, and from the water–structure interaction. Uncertainties in joint flexibilities are associated with degradation of member connectivities during a response process. Uncertainties introduced by the wave–current interaction are related to the modeling of a random sea state, applied wave loads and water–structure interaction effects in general. The water–structure interaction, which is an important phenomenon to be considered in the analysis of dynamically sensitive structures, introduces some added hydrodynamic damping. The associated uncertainties enter the response analysis via the damping term; therefore, in a quasi-static response analysis, these uncertainties disappear. In the spectral fatigue damage, in addition to the uncertainties in the statistical characteristics of the stress, there are other uncertainties associated with the damage model used. These uncertainties are related to experimentally determined fatigue data and to the configurations of selected joints at which damage is likely to occur due to high stress concentrations. This paper presents these uncertainty issues with emphasis on their application in a reliability assessment. Some other uncertainties arise from approximations inherent in the model; they are assumed to be either comparatively negligible or captured within the current uncertainty models, so they are not treated further in this paper. In the calculation of the fatigue damage, a non-narrow-banded stress process is used.

13.
It is suggested that in future PSAs much more attention be given, in general, to modeling uncertainties, in particular those that are due to the man-machine interface configuration. Because little has been done so far, this paper puts forward some basic considerations concerning this topic. In particular, some aspects concerning the treatment of uncertainties are discussed. Three typical areas where modeling procedures prevail are fault tree methods, risk monitors and process-oriented expert systems. These areas were chosen because they cover risk assessment for design qualification as well as operator support systems concerning optimal risk reduction strategies during plant operation and optimal process control procedures. Because all modeling applications are realized by means of computers, other special areas requiring attention are the ergonomic aspects of the man-machine interface and general software reliability. In conclusion, some recommendations and suggestions concerning modeling uncertainties in the above areas are given.

14.
Risk management is a process that includes several steps, from vulnerability analysis to the formulation of a risk mitigation plan that selects countermeasures to be adopted. With reference to an information infrastructure, we present a risk management strategy that considers a sequence of hierarchical models, each describing dependencies among infrastructure components. A dependency exists anytime a security-related attribute of a component depends upon the attributes of other components. We discuss how this notion supports the formal definition of risk mitigation plan and the evaluation of the infrastructure robustness. A hierarchical relation exists among models that are analyzed because each model increases the level of details of some components in a previous one. Since components and dependencies are modeled through a hypergraph, to increase the model detail level, some hypergraph nodes are replaced by more and more detailed hypergraphs. We show how critical information for the assessment can be automatically deduced from the hypergraph and define conditions that determine cases where a hierarchical decomposition simplifies the assessment. In these cases, the assessment has to analyze the hypergraph that replaces the component rather than applying again all the analyses to a more detailed, and hence larger, hypergraph. We also show how the proposed framework supports the definition of a risk mitigation plan and discuss some indicators of the overall infrastructure robustness. Lastly, the development of tools to support the assessment is discussed.  相似文献   
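A minimal sketch of the dependency idea on which such an assessment rests: represent each hyperedge as "a security attribute of a target component depends on the attributes of a set of source components", and deduce automatically which components are affected once one is compromised. The component names, the hyperedges and the simple any-source propagation rule are illustrative assumptions, not the paper's hierarchical models or robustness indicators.

```python
# Each hyperedge: a security attribute of 'target' depends on the attributes of all 'sources'.
hyperedges = [
    ({"router", "dns"}, "web_frontend"),
    ({"web_frontend", "auth_service"}, "customer_portal"),
    ({"db_cluster"}, "auth_service"),
    ({"san"}, "db_cluster"),
]

def affected_components(initially_compromised):
    """Propagate compromise through dependencies until a fixed point is reached."""
    affected = set(initially_compromised)
    changed = True
    while changed:
        changed = False
        for sources, target in hyperedges:
            # A target is considered at risk if any source is affected;
            # other rules (e.g. all sources required) are equally easy to plug in.
            if target not in affected and sources & affected:
                affected.add(target)
                changed = True
    return affected

print(affected_components({"san"}))
# {'san', 'db_cluster', 'auth_service', 'customer_portal'}
```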

15.
Concurrent engineering aims to incorporate the overlapping of processes in order to reduce time-to-market and thereby sustain the existence of organizations in increasingly competitive times. Although faster product design, development, and delivery are the intended outcomes of concurrent engineering, one undesirable by-product is an increase in risks as a consequence of uncertainties between interdependent processes. Hence, the risks need to be identified, assessed, and mitigated together with concurrent engineering considerations in order to eliminate the ‘domino effect’ within risk management. This paper concentrates primarily on the knowledge elicitation techniques that were used to provide information to the Intelligent Risk Mapping and Assessment System (IRMAS™) to identify, prioritise, analyse, and assist project managers to manage perceived sources of CE risks. Techniques such as expert interviews, brainstorming, the Delphi technique, and the analogy process are discussed in relation to compiling the knowledge used for this expert system. A total of 589 risk items were identified for different project types, and information on 4372 items and 136 lessons learned was collected from experts at HdH. The core of the research is a reasoning methodology used for knowledge elicitation for a risk mapping and assessment system, which will not only support the decision-making process of the user but also aid the knowledge retrieval, storage, sharing, and updating processes of manufacturing organizations. This research provides a systematic engineering approach to risk management of concurrent product and process development.

16.
By means of several examples from a recent comprehensive space nuclear risk analysis of the Cassini mission, a scenario and consequence representational framework is presented for risk analysis of space nuclear power systems in the context of epistemic and aleatory uncertainties. The framework invites the use of probabilistic models for the calculation of both event probabilities and scenario consequences. Each scenario is associated with a frequency that may include both aleatory and epistemic uncertainties. The outcome of each scenario is described in terms of an end state vector. The outcome of each scenario is also characterized by a source term. In this paper, the source term factors of interest are number of failed clads in the space nuclear power system, amount of fuel released and amount of fuel that is potentially respirable. These are also subject to uncertainties. The 1990 work of Apostolakis is found to be a useful formalism from which to derive the relevant probabilistic models. However, an extension to the formalism was necessary to accommodate the situation in which aleatory uncertainty is represented by changes in the form of the probability function itself, not just its parameters. Event trees that show reasonable alternative accident scenarios are presented. A grouping of probabilities and consequences is proposed as a useful structure for thinking about uncertainties. An example of each category is provided. Concluding observations are made about the judgments involved in this analysis of uncertainties and the effect of distinguishing between aleatory and epistemic uncertainties.  相似文献   
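A minimal sketch of the separation of epistemic and aleatory uncertainty described here, using a nested (two-loop) Monte Carlo over a highly simplified scenario: the outer loop samples uncertain model parameters (state of knowledge), the inner loop samples scenario outcomes given those parameters. The branch probabilities, clad count and release model are illustrative assumptions, not the Cassini analysis.

```python
import numpy as np

rng = np.random.default_rng(3)
n_epistemic, n_aleatory = 200, 2_000

released_fuel_percentiles = []
for _ in range(n_epistemic):
    # Epistemic loop: sample uncertain parameters (state-of-knowledge uncertainty).
    p_event = rng.beta(2, 50)              # probability of the initiating accident scenario
    p_clad_fail = rng.beta(3, 10)          # per-clad failure probability given the event
    n_clads = 18                           # fixed design parameter (illustrative)

    # Aleatory loop: sample scenario outcomes given those parameters.
    event_occurs = rng.random(n_aleatory) < p_event
    failed_clads = rng.binomial(n_clads, p_clad_fail, n_aleatory) * event_occurs
    fuel_released = failed_clads * rng.lognormal(np.log(5.0), 0.4, n_aleatory)  # grams per failed clad

    released_fuel_percentiles.append(np.percentile(fuel_released, 99))

# Epistemic uncertainty about the 99th-percentile (aleatory) release.
print(np.percentile(released_fuel_percentiles, [5, 50, 95]))
```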

17.
A rupture risk assessment is critical to the clinical treatment of abdominal aortic aneurysm (AAA) patients. The biomechanical AAA rupture risk assessment quantitatively integrates many known AAA rupture risk factors, but the variability of risk predictions due to model input uncertainties remains a challenging limitation. This study derives a probabilistic rupture risk index (PRRI). Specifically, the uncertainties in AAA wall thickness and wall strength were considered, and wall stress was predicted with a state-of-the-art deterministic biomechanical model. The discriminative power of the PRRI was tested in a diameter-matched cohort of ruptured (n = 7) and intact (n = 7) AAAs and compared to alternative risk assessment methods. Computed PRRI at 1.5 times the mean arterial pressure was significantly (p = 0.041) higher in ruptured AAAs (20.21% (s.d. 14.15%)) than in intact AAAs (3.71% (s.d. 5.77%)). The PRRI showed high sensitivity and specificity (discriminative power of 0.837) in discriminating between ruptured and intact AAA cases. The underlying statistical representation of stochastic data on wall thickness, wall strength and peak wall stress had only negligible effects on the PRRI computations. Uncertainties in AAA wall stress predictions, the wide range of reported wall strength and the stochastic nature of failure motivate a probabilistic rupture risk assessment. Advanced AAA biomechanical modelling, paired with a probabilistic rupture index definition as known from engineering risk assessment, seems to be superior to a purely deterministic approach.
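A minimal sketch of the stress-versus-strength interference idea behind such a probabilistic index: sample wall thickness and wall strength, scale the deterministic stress prediction, and take the index as the Monte Carlo probability that stress exceeds strength. The distributions and the inverse-thickness stress scaling are illustrative assumptions, not the study's finite-element model or reported statistics.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Deterministic peak wall stress computed at a nominal wall thickness (kPa).
stress_nominal = 180.0
thickness_nominal = 1.9   # mm

# Uncertain inputs (illustrative distributions only).
thickness = rng.normal(1.9, 0.4, n).clip(min=0.5)    # mm
strength = rng.lognormal(np.log(250.0), 0.25, n)     # kPa

# Assume stress scales inversely with local wall thickness (thin-wall argument).
stress = stress_nominal * thickness_nominal / thickness

prri = np.mean(stress > strength)
print(f"Probabilistic rupture risk index: {prri:.1%}")
```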

18.
Integrating human health and ecological concerns in risk assessments
The interconnections between ecosystems, human health and welfare have been increasingly recognized by the US government, academia, and the public. This paper continues this theme by addressing the use of risk assessment to integrate people into a single assessment. In a broad overview of the risk assessment process we stress the need to build a conceptual model of the whole system including multiple species (humans and other ecological entities), stressors, and cumulative effects. We also propose converging landscape ecology and evaluation of ecosystem services with risk assessment to address these cumulative responses. We first look at how this integration can occur within the problem formulation step in risk assessment where the system is defined, a conceptual model created, a subset of components and functions selected, and the analytical framework decided in a context that includes the management decisions. A variety of examples of problem formulations (salmon, wild insects, hyporheic ecosystems, ultraviolet (UV) radiation, nitrogen fertilization, toxic chemicals, and oil spills) are presented to illustrate how treating humans as components of the landscape can add value to risk assessments. We conclude that the risk assessment process should help address the urgent needs of society in proportion to importance, to provide a format to communicate knowledge and understanding, and to inform policy and management decisions.  相似文献   

19.
The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework calls for a manufactured reality from which “experimental” data are created (possibly with experimental error), an imperfect model (with uncertain inputs) from which simulation results are created (possibly with numerical error), the application of a system for quantifying uncertainties in model predictions, and an assessment of how accurately those uncertainties are quantified. The application presented in this paper manufactures a particle-transport “universe”, models it using diffusion theory with uncertain material parameters, and applies both Gaussian process and Bayesian MARS algorithms to make quantitative predictions about new “experiments” within the manufactured reality. The results of this preliminary study indicate that, even in a simple problem, the improper application of a specific UQ method or unrealized effects of a modeling assumption may produce inaccurate predictions. We conclude that the validation framework presented in this paper is a powerful and flexible tool for the investigation and understanding of UQ methodologies.  相似文献   
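A minimal sketch of the framework's logic in one dimension: define a hidden "manufactured truth" that generates experiments, fit a Gaussian process emulator to results from a deliberately imperfect model, then check how often the emulator's prediction bands actually cover new experiments. The sine/linear toy functions stand in for the particle-transport universe and the diffusion model, and are assumptions of this sketch.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# "Manufactured universe": the hidden truth from which experiments are drawn.
def manufactured_truth(x):
    return np.sin(3 * x) + 0.3 * x

# Imperfect simulation model (systematically biased), run at the training inputs.
def imperfect_model(x):
    return np.sin(3 * x)

x_train = np.linspace(0.0, 2.0, 12)[:, None]
y_train = imperfect_model(x_train).ravel()

gp = GaussianProcessRegressor(kernel=RBF(0.5) + WhiteKernel(1e-4), normalize_y=True)
gp.fit(x_train, y_train)

# New "experiments" drawn from the manufactured reality with measurement noise.
x_test = rng.uniform(0.0, 2.0, 50)[:, None]
y_exp = manufactured_truth(x_test).ravel() + rng.normal(0.0, 0.05, 50)

mean, std = gp.predict(x_test, return_std=True)
coverage = np.mean(np.abs(y_exp - mean) < 1.96 * std)
print(f"Fraction of experiments inside the 95% prediction band: {coverage:.2f}")
# A low fraction exposes the unquantified model bias -- the kind of failure the framework is built to reveal.
```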

20.
The problem addressed is how to combine event experience data from multiple source plants to estimate common cause failure (CCF) rates for a target plant. Alternative models are considered for transforming CCF parameters from systems with different numbers of similar components to obtain CCF-rates for a specific group of components. Two sets of rules are reviewed and compared for transforming rates and assessment uncertainties from larger to smaller systems, i.e. mapping down. Mapping down equations are presented also for the alpha-factors and for the variances of CCF rates. Consistent rules are developed for mapping up CCF-rates and uncertainties from smaller to larger systems. These mapping up rules are not limited to a binomial CCF model. It is shown how consistency requirements set certain limits to possible parametric values. Empirical alpha factors are used to estimate robust mapping parameters, and mapping up equations are derived for alpha factors as well. An assessment uncertainty procedure is presented for treating incomplete or vague information when estimating CCF-rates. Numerical studies illustrate mapping rules and procedures. Recommendations are made for practical applications.  相似文献   
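A minimal sketch of mapping down, using the commonly quoted relation Q_k^(m-1) = ((m-k)/m)·Q_k^(m) + ((k+1)/m)·Q_{k+1}^(m) applied repeatedly; whether this matches the specific rule sets compared in the paper is an assumption, and the numerical CCF probabilities are hypothetical.

```python
def map_down_one(q):
    """Map CCF basic-event probabilities from a group of m components to m-1.

    q[k-1] holds Q_k^(m), the probability of a basic event failing exactly k
    specific components in an m-component group.  Uses the commonly quoted
    relation Q_k^(m-1) = ((m-k)/m) * Q_k^(m) + ((k+1)/m) * Q_{k+1}^(m).
    """
    m = len(q)
    q_next = lambda k: q[k] if k < m else 0.0   # Q_{k+1}^(m); zero beyond k = m
    return [(m - k) / m * q[k - 1] + (k + 1) / m * q_next(k) for k in range(1, m)]

# Hypothetical CCF probabilities for a 4-component group: Q1..Q4.
q4 = [1.0e-3, 5.0e-5, 2.0e-5, 1.0e-5]

q3 = map_down_one(q4)          # 4 -> 3 components
q2 = map_down_one(q3)          # 3 -> 2 components
print(q3)                      # [7.75e-4, 4.0e-5, 1.5e-5]
print(q2)
```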
