Similar Articles
20 similar articles found.
1.
There are difficulties with probability as a representation of uncertainty. However, we argue that there is an important distinction between principle and practice. In principle, probability is uniquely appropriate for the representation and quantification of all forms of uncertainty; it is in this sense that we claim that ‘probability is perfect’. In practice, people find it difficult to express their knowledge and beliefs in probabilistic form, so that elicitation of probability distributions is a far from perfect process. We therefore argue that there is no need for alternative theories, but that any practical elicitation of expert knowledge must fully acknowledge imprecision in the resulting distribution. We outline a recently developed Bayesian technique that allows the imprecision in elicitation to be formulated explicitly, and apply it to some of the challenge problems.
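As a rough illustration of acknowledging elicitation imprecision (not the authors' specific Bayesian technique), the following sketch fits a Beta distribution to hypothetical elicited quantiles and propagates a stated tolerance on those quantiles into bounds on a derived probability; all numbers are invented:

```python
import numpy as np
from scipy import optimize, stats

# Hypothetical elicited judgments: median and 90% credible interval for a
# proportion theta, each stated with an acknowledged imprecision +/- tol.
median, lo90, hi90, tol = 0.30, 0.10, 0.55, 0.02

def fit_beta(q05, q50, q95):
    """Fit a Beta(a, b) whose 5/50/95% quantiles best match the statements."""
    def loss(p):
        a, b = np.exp(p)  # keep parameters positive
        q = stats.beta.ppf([0.05, 0.50, 0.95], a, b)
        return np.sum((q - np.array([q05, q50, q95])) ** 2)
    res = optimize.minimize(loss, x0=[0.0, 0.0], method="Nelder-Mead")
    return np.exp(res.x)

# Propagate elicitation imprecision: refit for perturbed statements and
# report the spread in a derived quantity, here P(theta > 0.5).
probs = []
for d in np.linspace(-tol, tol, 5):
    a, b = fit_beta(lo90 + d, median + d, hi90 + d)
    probs.append(1 - stats.beta.cdf(0.5, a, b))
print(f"P(theta > 0.5) ranges over [{min(probs):.3f}, {max(probs):.3f}]")
```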

2.
Expert elicitation approach for performing ATHEANA quantification
An expert elicitation approach has been developed to estimate probabilities for unsafe human actions (UAs) based on error-forcing contexts (EFCs) identified through the ATHEANA (A Technique for Human Event Analysis) search process. The expert elicitation approach integrates the knowledge of informed analysts to quantify UAs and treat uncertainty (‘quantification-including-uncertainty’). The analysis focuses on (a) the probabilistic risk assessment (PRA) sequence EFCs for which the UAs are being assessed, (b) the knowledge and experience of analysts (who should include trainers, operations staff, and PRA/human reliability analysis experts), and (c) facilitated translation of information into probabilities useful for PRA purposes. Rather than simply asking the analysts their opinion about failure probabilities, the approach emphasizes asking the analysts what experience and information they have that is relevant to the probability of failure. The facilitator then leads the group in combining the different kinds of information into a consensus probability distribution. This paper describes the expert elicitation process, presents its technical basis, and discusses the controls that are exercised to use it appropriately. The paper also points out the strengths and weaknesses of the approach and how it can be improved. Specifically, it describes how generalized contextually anchored probabilities (GCAPs) can be developed to serve as reference points for estimates of the likelihood of UAs and their distributions.
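The facilitated consensus step is a human process rather than an algorithm, but a minimal mechanical stand-in is equal-weight pooling of the analysts' judgments. The sketch below assumes each analyst states a median and an error factor for one unsafe action (all values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical judgments: each analyst states a median human error
# probability and an error factor (95th/50th percentile ratio) for one
# unsafe action in a given error-forcing context.
judgments = [(1e-2, 3.0), (3e-2, 5.0), (5e-3, 10.0)]

# Equal-weight pooling: sample each analyst's implied lognormal and mix.
samples = []
for median, ef in judgments:
    sigma = np.log(ef) / 1.645          # EF = exp(1.645 * sigma)
    samples.append(rng.lognormal(np.log(median), sigma, 20_000))
pooled = np.concatenate(samples)

for q in (0.05, 0.50, 0.95):
    print(f"{q:.0%} quantile of pooled HEP: {np.quantile(pooled, q):.2e}")
```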

3.
This article looks at a new approach to expert elicitation that combines basic elements of conventional expert elicitation protocols with formal survey methods and larger, heterogeneous expert panels. This approach is appropriate where the hazard-estimation task requires a wide range of expertise and professional experience. The ability to judge when to rely on alternative data sources is often critical for successful risk management. We show how a large, heterogeneous sample can support internal validation of not only the experts' assessments but also prior information that is based on limited historical data. We illustrate the use of this new approach to expert elicitation by addressing a fundamental problem in US food safety management: obtaining comparable food system-wide estimates of foodborne illness by pathogen–food pair and by food. The only comprehensive basis for food-level hazard analysis throughout the US food supply currently available is outbreak data (i.e., when two or more people become ill from the same food source), but there is good reason to question the portrayal that outbreak data alone give of food risk. In this paper, we compare food and pathogen–food incidence estimates based on expert judgment and based on outbreak data, and we demonstrate a suite of uncertainty measures that allow for a fuller understanding of the results.
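A minimal sketch of the internal-validation idea: bootstrap a hypothetical expert panel's attribution estimates and check whether the outbreak-data-based figure falls inside the resulting interval. The panel values, pathogen–food pair, and outbreak figure are all invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical panel: each expert's estimate of the % of illnesses from
# one pathogen attributable to one food (pair chosen for illustration).
panel = np.array([55, 60, 72, 40, 65, 58, 80, 50, 62, 68], dtype=float)
outbreak_based = 30.0  # share suggested by outbreak data alone (invented)

# Bootstrap the panel mean to get a simple uncertainty measure.
boot = np.array([rng.choice(panel, panel.size).mean() for _ in range(10_000)])
lo, hi = np.quantile(boot, [0.025, 0.975])
print(f"panel mean {panel.mean():.1f}%, 95% CI [{lo:.1f}, {hi:.1f}]")
print("outbreak-based estimate outside CI:", not lo <= outbreak_based <= hi)
```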

4.
The quantification of a risk assessment model often requires the elicitation of expert judgments about quantities that cannot be precisely measured. The aims of the model being quantified provide important guidance as to the types of questions that should be asked of the experts. The uncertainties underlying a quantity may be classified as aleatory or epistemic according to the goals of the risk process. This paper discusses the nature of such a classification and how it affects the probability elicitation process and implementation of the resulting judgments. Examples from various areas of risk assessment are used to show the practical implications of how uncertainties are treated. An extended example from hazardous waste disposal is given.
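The aleatory/epistemic split is commonly implemented as a two-loop (nested) Monte Carlo: epistemic parameters are sampled in an outer loop, aleatory variability in an inner loop. A sketch under invented flood-risk numbers:

```python
import numpy as np

rng = np.random.default_rng(1)

# Aleatory: year-to-year flood level ~ Gumbel(mu, beta).
# Epistemic: mu and beta are uncertain because of limited records.
n_epi, n_ale = 200, 10_000
failures = np.empty(n_epi)

for i in range(n_epi):
    mu = rng.normal(4.0, 0.3)                  # epistemic draw (outer loop)
    beta = abs(rng.normal(0.8, 0.1))
    levels = rng.gumbel(mu, beta, n_ale)       # aleatory draws (inner loop)
    failures[i] = np.mean(levels > 6.0)        # P(overtop | mu, beta)

print(f"mean P(failure): {failures.mean():.4f}")
print("epistemic 90% band:", np.round(np.quantile(failures, [0.05, 0.95]), 4))
```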

5.
The Bayesian framework for statistical inference offers the possibility of taking expert opinions into account, and is therefore attractive in practical problems concerning the reliability of technical systems. Probability is the only language in which uncertainty can be consistently expressed, and this requires the use of prior distributions for reporting expert opinions. In this paper an extension of the standard Bayesian approach based on the theory of imprecise probabilities and intervals of measures is developed. It is shown that this is necessary to take the nature of experts' knowledge into account. The application of this approach in reliability theory is outlined. The concept of imprecise probabilities allows us to accept a range of possible probabilities from an expert for events of interest and thus makes the elicitation of prior information simpler and clearer. The method also provides a consistent way for combining the opinions of several experts.
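A minimal sketch of the intervals-of-measures idea in a conjugate setting: the expert supplies a range of prior means rather than a single prior, and the posterior is reported as an interval. The Beta-binomial model, equivalent sample size, and data are all assumed for illustration:

```python
from scipy import stats

# Hypothetical reliability data: 2 failures in 50 demands.
failures, demands = 2, 50

# The expert states only a *range* for the prior failure probability,
# here encoded as a set of Beta(a, b) priors with mean in [0.01, 0.10]
# and a fixed equivalent sample size a + b = 20.
priors = [(20 * m, 20 * (1 - m)) for m in (0.01, 0.03, 0.05, 0.10)]

post_means = []
for a, b in priors:
    a_post, b_post = a + failures, b + demands - failures
    post_means.append(a_post / (a_post + b_post))

print(f"posterior mean varies over [{min(post_means):.4f}, {max(post_means):.4f}]")
```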

6.
Complex natural phenomena are increasingly investigated by the use of complex computer simulators. To leverage the advantages of simulators, observational data need to be incorporated in a probabilistic framework so that uncertainties can be quantified. A popular framework for such experiments is the statistical computer model calibration experiment. A limitation often encountered in current statistical approaches for such experiments is the difficulty in modeling high-dimensional observational datasets and simulator outputs as well as high-dimensional inputs. As the complexity of simulators seems only to grow, this challenge will continue unabated. In this article, we develop a Bayesian statistical calibration approach that is ideally suited for such challenging calibration problems. Our approach leverages recent ideas from Bayesian additive regression tree (BART) models to construct a random basis representation of the simulator outputs and observational data. The approach can flexibly handle high-dimensional datasets, high-dimensional simulator inputs, and calibration parameters while quantifying important sources of uncertainty in the resulting inference. We demonstrate our methodology on a CO2 emissions rate calibration problem and on a complex simulator of subterranean radionuclide dispersion, which simulates the spatio-temporal diffusion of radionuclides released during nuclear bomb tests at the Nevada Test Site. Supplementary computer code and datasets are available online.
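The sketch below mimics the structure of such a calibration with a toy problem: build a low-rank basis of simulator outputs (here via SVD rather than the article's BART-based random basis), project the observations onto it, and compute a grid posterior for one calibration parameter. Everything here, including the simulator, is invented:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "simulator" with a functional (high-dimensional) output driven by a
# single calibration parameter theta; all names are illustrative.
t = np.linspace(0, 1, 200)
def simulator(theta):
    return np.sin(2 * np.pi * theta * t) * np.exp(-theta * t)

# Ensemble of runs and a low-rank output basis via SVD.
thetas = np.linspace(0.5, 3.0, 40)
runs = np.stack([simulator(th) for th in thetas])
mean_run = runs.mean(axis=0)
basis = np.linalg.svd(runs - mean_run, full_matrices=False)[2][:3]

# Synthetic field data at an unknown "true" theta = 1.7, with noise.
y_obs = simulator(1.7) + rng.normal(0, 0.02, t.size)
coef_obs = basis @ (y_obs - mean_run)

# Grid posterior over theta, comparing reduced-basis coefficients and
# assuming Gaussian noise with known sigma = 0.02.
grid = np.linspace(0.5, 3.0, 500)
loglik = []
for g in grid:
    coef_g = basis @ (simulator(g) - mean_run)
    loglik.append(-0.5 * np.sum((coef_obs - coef_g) ** 2) / 0.02 ** 2)
loglik = np.array(loglik)
post = np.exp(loglik - loglik.max())
post /= post.sum()
print(f"posterior mean of theta: {np.sum(grid * post):.3f}")
```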

7.
Topology optimization using stress constraints and considering uncertainties is a serious challenge, since a reliability problem has to be solved for each stress constraint, for each element in the mesh. In this paper, an alternative way of solving this problem is used, where uncertainty quantification is performed through the first-order perturbation approach, with proper validation by Monte Carlo simulation. Uncertainties are considered in the loading magnitude and direction. The minimum volume problem subjected to local stress constraints is formulated as a robust problem, where the stress constraints are written as a weighted average of their expected value and standard deviation. The augmented Lagrangian method is used for handling the large set of local stress constraints, whereas a gradient-based algorithm is used for handling the bound constraints. It is shown that even in the presence of small uncertainties in loading direction, different topologies are obtained when compared to a deterministic approach. The effect of correlation between uncertainties in loading magnitude and direction on optimal topologies is also studied, where the main observed result is loss of symmetry in the optimal topologies.
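A sketch of the robust stress measure on a toy stress function: first-order perturbation gives the mean and standard deviation of stress under uncertain load magnitude and direction, validated against Monte Carlo. The stress model and statistics are invented; real topology optimization applies this per element:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stress response: axial stress as a function of load magnitude P and
# direction angle a (radians); A is a member cross-section (invented).
A = 1e-4
def stress(P, a):
    return P * np.cos(a) / A

# Uncertain load: means and standard deviations (assumed values).
mP, sP, ma, sa = 1e3, 50.0, 0.1, 0.05

# First-order perturbation: var(s) ~ (ds/dP)^2 sP^2 + (ds/da)^2 sa^2.
h = 1e-6
dsdP = (stress(mP + h, ma) - stress(mP - h, ma)) / (2 * h)
dsda = (stress(mP, ma + h) - stress(mP, ma - h)) / (2 * h)
mean_fo = stress(mP, ma)
std_fo = np.sqrt(dsdP ** 2 * sP ** 2 + dsda ** 2 * sa ** 2)

# Robust constraint value in the paper's weighted form, with weight k = 2.
print(f"robust measure E[s] + 2*std[s]: {mean_fo + 2 * std_fo:.3e}")

# Monte Carlo validation of the first-order numbers.
s_mc = stress(rng.normal(mP, sP, 100_000), rng.normal(ma, sa, 100_000))
print(f"MC mean {s_mc.mean():.3e} vs FO {mean_fo:.3e}; "
      f"MC std {s_mc.std():.3e} vs FO {std_fo:.3e}")
```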

8.
We discuss why coherent lower previsions provide a good uncertainty model for solving generic uncertainty problems involving possibly conflicting expert information. We study various ways of combining expert assessments on different domains, such as natural extension, independent natural extension and the type-I product, as well as on common domains, such as conjunction and disjunction. We provide each of these with a clear interpretation, and we study how they are related. Observing that in combining expert assessments no information is available about the order in which they should be combined, we suggest that the final result should be independent of the order of combination. The rules of combination we study here satisfy this requirement.
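For events (rather than general gambles), conjunction and disjunction of interval-valued expert assessments reduce to intersections and envelopes of intervals, and both are independent of the order of combination. A minimal illustration:

```python
# Two experts give lower/upper probabilities for the same event A.
# Conjunction (assume both are right) narrows the interval; disjunction
# (at least one is right) widens it. This is a special case only; the
# article works with coherent lower previsions on gambles, not just events.
e1 = (0.20, 0.60)
e2 = (0.35, 0.80)

conj = (max(e1[0], e2[0]), min(e1[1], e2[1]))   # intersection of intervals
disj = (min(e1[0], e2[0]), max(e1[1], e2[1]))   # envelope of intervals

print("conjunction :", conj)   # (0.35, 0.60) -- order-independent
print("disjunction :", disj)   # (0.20, 0.80)
if conj[0] > conj[1]:
    print("experts conflict: conjunction is empty")
```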

9.
The problem of ranking and weighting experts' performances when quantitative judgments are being elicited for decision support is considered. A new scoring model, the Expected Relative Frequency (ERF) model, is presented, based on the closeness between central values provided by the expert and known values used for calibration. Using responses from experts in five different elicitation datasets, a cross-validation technique is used to compare this new approach with the Cooke Classical Model, the Equal Weights model, and individual experts. The analysis is performed using alternative reward schemes designed to capture proficiency either in quantifying uncertainty or in estimating true central values. Results show that although there is only a limited probability that one approach is consistently better than another, the Cooke Classical Model is generally the most suitable for assessing uncertainties, whereas the new ERF model should be preferred if the goal is central value estimation accuracy.
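A simplified stand-in for ERF-style scoring (not the model's exact form): score each expert by the closeness of central estimates to known calibration values, then normalize scores into weights. All data are invented:

```python
import numpy as np

# Calibration data: each expert's central estimate for questions with
# known true values (all numbers illustrative).
truths = np.array([10.0, 2.5, 300.0, 0.8])
central = {
    "expert_A": np.array([12.0, 2.0, 250.0, 1.0]),
    "expert_B": np.array([30.0, 2.4, 310.0, 0.5]),
}

def erf_like_score(est, true, tol=0.5):
    """Reward closeness of the central value to the truth: 1 when exact,
    decaying to 0 once relative error exceeds `tol`. A simplified
    stand-in for the ERF scoring rule, not its published form."""
    rel = np.abs(est - true) / np.abs(true)
    return np.clip(1 - rel / tol, 0, 1).mean()

scores = {name: erf_like_score(est, truths) for name, est in central.items()}
total = sum(scores.values())
weights = {name: s / total for name, s in scores.items()}
print(weights)
```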

10.
This paper addresses the challenge of design optimization under uncertainty when the designer only has limited data to characterize uncertain variables. We demonstrate that the error incurred when estimating a probability distribution from limited data affects the out-of-sample performance (i.e., performance under the true distribution) of optimized designs. We demonstrate how this can be mitigated by reformulating the engineering design problem as a distributionally robust optimization (DRO) problem. We present computationally efficient algorithms for solving the resulting DRO problem. The performance of the DRO approach is explored in a practical setting by applying it to an acoustic horn design problem. The DRO approach is compared against traditional approaches to optimization under uncertainty, namely sample-average approximation and multiobjective optimization incorporating a risk reduction objective. In contrast with the multiobjective approach, the proposed DRO approach does not use an explicit risk reduction objective but rather specifies a so-called ambiguity set of possible distributions and optimizes against the worst-case distribution in this set. Our results show that the DRO designs, in some cases, significantly outperform those designs found using the sample-average or the multiobjective approach.
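A minimal DRO sketch under an assumed finite ambiguity set: optimize a toy design against the worst of several candidate distributions and compare with the sample-average design. Real ambiguity sets are typically richer (e.g., moment- or distance-based):

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy design problem: choose x to minimize expected cost (x - w)^2 where
# the uncertain parameter w has an imperfectly known distribution.
# Ambiguity set: a few candidate discrete distributions consistent with
# the limited data (all values illustrative).
support = np.array([0.0, 1.0, 2.0])
ambiguity = [np.array([0.2, 0.5, 0.3]),
             np.array([0.4, 0.4, 0.2]),
             np.array([0.1, 0.6, 0.3])]

def worst_case_cost(x):
    return max(p @ (x - support) ** 2 for p in ambiguity)

res = minimize_scalar(worst_case_cost, bounds=(0, 2), method="bounded")
print(f"DRO design x* = {res.x:.3f}, worst-case cost = {res.fun:.3f}")

# Compare with the sample-average design under one nominal distribution.
nominal = ambiguity[0]
res_saa = minimize_scalar(lambda x: nominal @ (x - support) ** 2,
                          bounds=(0, 2), method="bounded")
print(f"SAA design x* = {res_saa.x:.3f}, "
      f"its worst-case cost = {worst_case_cost(res_saa.x):.3f}")
```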

11.
Combinatorial optimization problems are often too complex to be solved within reasonable time limits by exact methods, in spite of the theoretical guarantee that such methods will ultimately obtain an optimal solution. Instead, heuristic methods, which do not offer a convergence guarantee but which have greater flexibility to take advantage of special properties of the search space, are commonly a preferred alternative. The standard procedure is to craft a heuristic method to suit the particular characteristics of the problem at hand, exploiting to the extent possible the structure available. Such tailored methods, however, typically have limited usefulness in other problem domains. An alternative to this problem-specific solution approach is a more general methodology that recasts a given problem into a common modeling format, permitting solutions to be derived by a common, rather than tailor-made, heuristic method. Because such general-purpose heuristic approaches forego the opportunity to capitalize on domain-specific knowledge, they are characteristically unable to provide the effectiveness or efficiency of special-purpose approaches. Indeed, they are typically regarded to have little value except for dealing with small or simple problems. This paper reports on recent work that calls this commonly held view into question. We describe how a particular unified modeling framework, coupled with the latest advances in heuristic search methods, makes it possible to solve problems from a wide range of important model classes.
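One widely used unified format for such recasting is the unconstrained binary quadratic model (QUBO); whether it matches this paper's exact framework is an assumption. The sketch recasts a small max-cut instance as x'Qx and applies a generic one-flip local search:

```python
import numpy as np

rng = np.random.default_rng(4)

# Random weighted graph for a small max-cut instance.
n = 12
W = rng.integers(0, 4, (n, n))
W = np.triu(W, 1)
W = W + W.T

# QUBO form of max-cut: cut(x) = x' Q x with Q_ii = sum_j w_ij and
# Q_ij = -2 w_ij for i < j (x binary).
Q = -2.0 * np.triu(W, 1)
np.fill_diagonal(Q, W.sum(axis=1))

def value(x):
    return x @ Q @ x

# Generic one-flip improvement search on the QUBO, problem-agnostic.
x = rng.integers(0, 2, n)
best = value(x)
improved = True
while improved:
    improved = False
    for i in range(n):
        x[i] ^= 1
        v = value(x)
        if v > best:
            best, improved = v, True
        else:
            x[i] ^= 1  # undo the flip
print("cut value found:", best)
```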

12.
13.
Many engineering optimization problems include unavoidable uncertainties in parameters or variables. Ignoring such uncertainties when solving the optimization problems may lead to inferior solutions that may even violate problem constraints. Another challenge in most engineering optimization problems is having different conflicting objectives that cannot be minimized simultaneously. Finding a balanced trade-off between these objectives is a complex and time-consuming task. In this paper, an optimization framework is proposed to address both of these challenges. First, we exploit a self-calibrating multi-objective framework to achieve a balanced trade-off between the conflicting objectives. Then, we develop the robust counterpart of the uncertainty-aware self-calibrating multi-objective optimization framework. The significance of this framework is that it does not need any manual tuning by the designer. We also give a mathematical demonstration of the objective scale-invariance property of the proposed framework. The engineering problem considered in this paper to illustrate the effectiveness of the proposed framework is a popular sizing problem in digital integrated circuit design. However, the proposed framework can be applied to any uncertain multi-objective optimization problem that can be formulated in the geometric programming format. We propose to consider variations in the sizes of circuit elements during the optimization process by employing an ellipsoidal uncertainty model. For validation, several industrial clock networks are sized by the proposed framework. The results show a significant reduction in one objective (power, by 38% on average) as well as a significant increase in the robustness of solutions to the variations. This is achieved with no significant degradation in the other objective (the timing metrics of the circuit) or reduction in its standard deviation, which demonstrates a more robust solution.
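A scenario-based sketch of a robust sizing geometric program in cvxpy, offered as a crude stand-in for the paper's ellipsoidal uncertainty model and self-calibrating framework: the posynomial delay bound must hold for every sampled value of an uncertain load coefficient. All coefficients are invented:

```python
import cvxpy as cp

# Toy sizing GP: pick gate sizes x1, x2 to minimize power subject to a
# posynomial delay bound; coefficients are invented, not the paper's.
x1 = cp.Variable(pos=True)
x2 = cp.Variable(pos=True)
power = 2.0 * x1 + 3.0 * x2

# Robustness via scenarios on an uncertain load coefficient c: the delay
# bound must hold for every scenario (a conservative stand-in for an
# ellipsoidal uncertainty set).
constraints = [x1 >= 1, x2 >= 1]
for c in (0.9, 1.0, 1.15):
    delay = c * (1 / x1 + x1 / x2 + 0.2 * x2)
    constraints.append(delay <= 2.5)

prob = cp.Problem(cp.Minimize(power), constraints)
prob.solve(gp=True)
print(f"x1={x1.value:.3f}, x2={x2.value:.3f}, power={prob.value:.3f}")
```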

14.
In this paper we develop an expert system for multiple-criteria facility layout problems. The facility layout problem is identified as an ill-structured problem; our approach for solving it is based on expert systems and multiple-criteria decision making (MCDM). The expert system interacts with the decision maker (DM), and reflects the DM's preferences in the selection of rules and priorities. The inference engine is a forward-chaining reasoning procedure which is discussed in detail. The approach consists of two parts: (a) construction of a layout based on a set of rules and restrictions, and (b) improvement of the layout based on interaction with the decision maker. The MCDM expert system approach considers and incorporates the multiple criteria in these two parts as follows. In (a) it uses priorities on the selection of rules, adjacency of departments, and departments for construction purposes. In (b) it uses different objectives such as materials handling cost, flexibility, and materials handling time for paired comparison of generated layouts for improvement purposes. Some experiments with the developed computer package are reported and an example is solved.
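A minimal forward-chaining sketch of the construction phase: rules fire on facts about departments until nothing new can be concluded. The rules and facts are invented; the paper's rule base and priority handling are far richer:

```python
# Facts about departments and a placement order built by rule firings.
facts = {("high_flow", "A", "B"), ("high_flow", "B", "C"), ("noisy", "D")}
placed = []

rules = [  # (condition over facts/placed, action) pairs
    (lambda: ("high_flow", "A", "B") in facts and "A" not in placed,
     lambda: placed.extend(["A", "B"])),
    (lambda: ("high_flow", "B", "C") in facts and "B" in placed
             and "C" not in placed,
     lambda: placed.append("C")),
    (lambda: ("noisy", "D") in facts and "D" not in placed,
     lambda: placed.append("D")),  # noisy departments are placed last
]

# Forward chaining: keep firing applicable rules until none changes state.
fired = True
while fired:
    fired = False
    for cond, action in rules:
        if cond():
            action()
            fired = True
print("construction order:", placed)
```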

15.
Risk-based decision making often relies upon expert probability assessments, particularly for the consequences of disruptive events and especially when such events are extreme or catastrophic in nature. Naturally, such expert-elicited probability distributions can be fraught with errors, as they describe events which occur very infrequently and for which only sparse data exist. This paper presents a quantitative framework, the extreme event uncertainty sensitivity impact method (EE-USIM), for measuring the sensitivity of extreme event consequences to uncertainties in the parameters of the underlying probability distribution. The EE-USIM is demonstrated with the Inoperability input-output model (IIM), a model with which to evaluate the propagation of inoperability throughout an interdependent set of economic and infrastructure sectors. The EE-USIM also makes use of a two-sided power distribution function generated by expert elicitation of extreme event consequences.
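The two-sided power (TSP) distribution has a closed-form inverse CDF, so elicited consequence distributions can be sampled directly. The sketch below samples a TSP with invented loss parameters and shows how an extreme-event metric (the 99th percentile) responds to the elicited shape parameter:

```python
import numpy as np

rng = np.random.default_rng(5)

def tsp_sample(a, m, b, n, size, rng):
    """Inverse-CDF sampling from the two-sided power distribution with
    support [a, b], mode m, and shape n (n = 2 gives the triangular)."""
    u = rng.uniform(size=size)
    p = (m - a) / (b - a)
    left = u < p
    x = np.empty(size)
    x[left] = a + (m - a) * (u[left] / p) ** (1 / n)
    x[~left] = b - (b - m) * ((1 - u[~left]) / (1 - p)) ** (1 / n)
    return x

# Sensitivity of an extreme-event consequence metric to the elicited
# shape parameter n (distribution parameters are illustrative $M losses).
for n in (1.5, 2.0, 4.0):
    losses = tsp_sample(a=1.0, m=5.0, b=60.0, n=n, size=200_000, rng=rng)
    print(f"n={n}: mean {losses.mean():6.2f}, "
          f"99th pct {np.quantile(losses, 0.99):6.2f}")
```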

16.
Injection mould design generally lies on the critical path of new product development, so design efficiency has a significant impact on the overall lead time of a new product. This paper presents a prototype injection mould-design system using a hybrid case-based reasoning (HCBR) approach. Case-based reasoning (CBR) is a problem-solving paradigm that uses previous episodes of solving similar problems as the basis for solving the problem at hand (the new problem). In this hybrid system, CBR is combined with generalized design knowledge, providing a flexible and comprehensive model of design. The knowledge base of the system is accessed by mould designers through interactive programs, so that their own intelligence and experience can also be incorporated into the total mould design. The approach provides a workable model of a mould design system with CBR and knowledge-based expert system support, which can quickly suggest good and proven design solutions to new design problems for the plastic products manufacturing industry, avoiding the time needed to create those designs from scratch.
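A minimal sketch of the CBR retrieval step: weighted nearest-neighbour similarity over normalized part features. Features, weights, and cases are invented; adaptation and the knowledge-based checks are where the hybrid system does most of its work:

```python
import numpy as np

# Case features: (part volume cm^3, wall thickness mm, tolerance grade 1-10)
case_base = {
    "mould_017": np.array([120.0, 2.0, 6]),
    "mould_042": np.array([450.0, 3.5, 4]),
    "mould_063": np.array([100.0, 1.8, 8]),
}
weights = np.array([0.5, 0.3, 0.2])   # importance of each feature (assumed)
scale = np.array([500.0, 5.0, 10.0])  # normalization ranges (assumed)

def similarity(a, b):
    """Weighted similarity in [0, 1]: 1 means identical features."""
    return 1 - np.sum(weights * np.abs(a - b) / scale)

new_part = np.array([110.0, 2.1, 7])
ranked = sorted(case_base, key=lambda k: similarity(case_base[k], new_part),
                reverse=True)
print("retrieved cases, best first:", ranked)
```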

17.
Concurrent engineering incorporates the overlapping of processes in order to reduce time-to-market and thereby sustain the existence of organizations in increasingly competitive times. Although faster product design, development, and delivery are the intended outcomes of concurrent engineering, one of the undesirable by-products is an increase in risks as a consequence of uncertainties between interdependent processes. Hence, the risks need to be identified, assessed, and mitigated together with concurrent engineering considerations for the elimination of the ‘domino effect’ within risk management. This paper concentrates primarily on the knowledge elicitation techniques that were used to provide information to the Intelligent Risk Mapping and Assessment System (IRMAS™) to identify, prioritise, and analyse perceived sources of CE risks and assist project managers in managing them. Techniques such as expert interviews, brainstorming, the Delphi technique, and the analogy process are discussed in relation to compiling the knowledge used for this expert system. A total of 589 risk items were identified for different project types, and information on 4372 items and 136 lessons learned was collected from experts at HdH. The core of the research is a reasoning methodology used for knowledge elicitation for a risk mapping and assessment system that not only supports the decision-making process of the user but also aids the knowledge retrieval, storage, sharing, and updating processes of manufacturing organizations. This research provides a systematic engineering approach to risk management of concurrent product and process development.

18.
The explicit representation of domain knowledge, its separation from the processes that manipulate it, and the representation formalisms particular to artificial intelligence allow expert systems to solve problems that are characterized by high combinatorial complexity or that are sufficiently ill-defined as to have no reasonable software engineering solutions. The expert system approach to problem-solving differs radically from its conventional system development counterpart. This paper defines the expert system and introduces the production system architecture. The relative strengths and weaknesses of expert system and software engineering approaches to problem solving are discussed. Also addressed are criteria for identifying problems amenable to expert system solution and some justifications for system development.

19.
Asset managers in electricity distribution companies generally recognize the need for, and the challenge of, adding structure and a higher degree of formal analysis to increasingly complex asset management decisions. This implies improving present asset management practice by making the best use of the available data and expert knowledge, by adopting new methods for risk analysis and decision support, and by finding better ways to document the decisions made. This paper discusses methods for integrating risk analysis and multi-criteria decision support under uncertainty in electricity distribution system asset management. The focus is on how to include the different company objectives and risk analyses in a structured decision framework when deciding how to handle the physical assets of the electricity distribution network. The paper presents an illustrative example of decision support for maintenance and reinvestment strategies based on expert knowledge, simplified risk analyses, and multi-criteria decision analysis under uncertainty.
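A sketch of multi-criteria scoring under uncertainty: two hypothetical strategies are scored by Monte Carlo sampling of expert-judged ranges and combined with a weighted sum. Criteria, weights, and ranges are all invented:

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy comparison of two strategies for an aging feeder. Criteria scores
# are uncertain, so we sample expert-judged (low, high) ranges.
weights = {"cost": 0.4, "reliability": 0.4, "safety": 0.2}
strategies = {
    "maintain": {"cost": (7, 9), "reliability": (4, 6), "safety": (5, 7)},
    "reinvest": {"cost": (3, 5), "reliability": (7, 9), "safety": (7, 9)},
}

n = 20_000
for name, crit in strategies.items():
    # Weighted sum of sampled criterion scores on a 0-10 scale.
    total = sum(w * rng.uniform(*crit[c], n) for c, w in weights.items())
    print(f"{name}: mean score {total.mean():.2f}, "
          f"P(score > 6) = {(total > 6).mean():.2f}")
```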

20.
Recently, a novel nonparametric probabilistic method for modeling and quantifying model-form uncertainties in nonlinear computational mechanics was proposed. Its potential was demonstrated through several uncertainty quantification (UQ) applications in vibration analysis and nonlinear computational structural dynamics. This method, which relies on projection-based model order reduction to achieve computational feasibility, exhibits a vector-valued hyperparameter in the probability model of the random reduced-order basis and the associated stochastic projection-based reduced-order model. It identifies this hyperparameter by formulating a statistical inverse problem, grounded in target quantities of interest, and solving the corresponding nonconvex optimization problem. For many practical applications, however, this identification approach is computationally intensive. For this reason, this paper presents a faster predictor-corrector approach, based on probabilistic learning on manifolds, for determining an appropriate value of the vector-valued hyperparameter. It also demonstrates the computational advantages of this alternative identification approach through the UQ of two three-dimensional nonlinear structural dynamics problems associated with two different configurations of a microelectromechanical systems device.
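The sketch below shows only the mechanics of a random reduced-order basis: a POD basis is perturbed by a Gaussian matrix whose amplitude s stands in for the method's hyperparameter, and the dispersion of a reduced quantity of interest grows with s. The probability model here is a crude stand-in, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(7)

# Snapshot matrix from a toy model; POD basis via SVD.
n_dof, n_snap = 100, 30
t = np.linspace(0, 1, n_snap)
snapshots = np.array([np.sin(np.pi * k * t) for k in range(1, n_dof + 1)])
V = np.linalg.svd(snapshots, full_matrices=False)[0][:, :5]  # (n_dof, 5)

def random_basis(V, s):
    """Randomized basis: perturb V by a Gaussian matrix of amplitude s
    (the stand-in hyperparameter) and re-orthonormalize via QR."""
    Q, _ = np.linalg.qr(V + s * rng.normal(size=V.shape))
    return Q

# Quantity of interest: norm of the reduced representation of one state.
x = snapshots[:, 10]
for s in (0.0, 0.05, 0.2):
    qois = [np.linalg.norm(random_basis(V, s).T @ x) for _ in range(500)]
    print(f"s={s}: QoI mean {np.mean(qois):.3f}, std {np.std(qois):.4f}")
```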
