Similar Documents (20 results)
1.
This paper addresses the concept of model uncertainty within the context of risk analysis. Though model uncertainty is widely discussed in the risk analysis literature, no consensus seems to exist on its meaning, how it should be measured, or its impact on the application of analysis results in decision processes. The purpose of this paper is to contribute to clarification. The first parts of the paper examine the content of the two terms ‘model’ and ‘uncertainty’. On this basis it is discussed how a focus on model uncertainty merely muddles the message of the analysis if risk is interpreted as a true, inherent property of the system, to be estimated in the risk analysis. An alternative approach is to see the models as means for expressing uncertainty about system performance. In this case, it is argued, the term ‘model uncertainty’ loses its meaning.

2.
In this article, the problem of choosing from a set of design alternatives based upon multiple, conflicting, and uncertain criteria is investigated. The problem of selection over multiple attributes becomes harder when risky alternatives exist. The overlap measure method developed in this article models two sources of uncertainty: imprecise or risky attribute values provided to the decision-maker, and the decision-maker's inability to specify an exact desirable attribute value. The effects of these uncertainties are mitigated using the overlap measure metric. A subroutine to this method, called the robust alternative selection method, ensures that the winning alternative is insensitive to changes in the relative importance of the different design attributes. The overlap measure method can model and handle various sources of uncertainty and can be applied to any number of multiattribute decision-making methods. In this article, it is applied to the hypothetical equivalents and inequivalents method, a multiattribute selection method under certainty.

3.
A parametric sensitivity analysis is carried out on GASCON, a radiological impact code describing the transfer of radionuclides to humans following a chronic gaseous release from a nuclear facility. An effective dose received by each age group can thus be calculated for a specific radionuclide and release duration. In this study, we consider 18 output variables, each depending on approximately 50 uncertain input parameters. First, 1000 Monte Carlo simulations are generated and used to calculate correlation coefficients between input parameters and output variables, which give a first overview of the important factors. Response surfaces are then constructed in polynomial form and used to predict system responses at reduced computational cost; these response surfaces are very useful for global sensitivity analysis, where thousands of runs are required. Using the response surfaces, we calculate Sobol' total sensitivity indices by the Monte Carlo method. We demonstrate the application of this method to one study site and one reference group near the Cadarache nuclear research centre (France), for two radionuclides: iodine-129 and uranium-238. It is shown that the most influential parameters are all related to the goat's-milk food chain, in decreasing order of importance: the "effective ingestion" dose coefficient, the goat's-milk ration of the individuals in the reference group, the grass ration of the goat, the dry deposition velocity, and the transfer factor to goat's milk.
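The Sobol' total-index computation described above can be sketched in a few lines. This is a minimal illustration using an invented three-parameter polynomial surrogate and Jansen's Monte Carlo estimator, not the actual GASCON model or its parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical polynomial surrogate standing in for a GASCON response
# surface; coefficients are invented, not taken from the paper.
def surrogate(x):
    return 3.0 * x[:, 0] + 1.5 * x[:, 1] + 0.5 * x[:, 0] * x[:, 2]

n, d = 100_000, 3
A = rng.uniform(size=(n, d))   # two independent Monte Carlo matrices
B = rng.uniform(size=(n, d))

f_A = surrogate(A)
var_y = f_A.var()

# Jansen's estimator of the total Sobol' index of input i:
#   ST_i = E[(f(A) - f(A with column i taken from B))^2] / (2 Var[f])
ST = []
for i in range(d):
    AB_i = A.copy()
    AB_i[:, i] = B[:, i]       # resample only input i
    ST.append(np.mean((f_A - surrogate(AB_i)) ** 2) / (2 * var_y))
```

Because the surrogate is cheap, the many thousands of evaluations needed here cost almost nothing, which is precisely the motivation for replacing the full code with a response surface.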

4.
The literature in economics, finance, operations research, engineering and mathematics in general is first reviewed on the subject of defining uncertainty and risk, going back to 1901. Different perspectives on uncertainty and risk are examined, and a new paradigm for modelling uncertainty and risk is proposed using relevant ideas from this review. This new paradigm is used to represent, aggregate and propagate uncertainty and to interpret the resulting variability in the challenge problem developed by Oberkampf et al. [Challenge problems: uncertainty in system response given uncertain parameters. Reliab Eng Syst Saf 2004;85(1):11–19]. The challenge problem is further extended into a decision problem that is treated within a multicriteria decision-making framework to illustrate how the new paradigm yields optimal decisions under uncertainty. The accompanying risk is defined as the probability of an unsatisfactory system response, quantified by a random function of the uncertainty.

5.
Computer simulation of the dynamic evolution of complex systems has become a fundamental tool for many modern engineering activities. In particular, risk-informed design projects and safety analyses require that system behavior be analyzed under several diverse conditions in the presence of substantial model and parameter uncertainty, which must be accounted for. In this paper we investigate the capability of artificial neural networks to provide both a first-order sensitivity measure of the importance of the various parameters of a model and a fast, efficient tool for dynamic simulation, to be used in uncertainty analyses. The dynamic simulation of a steam generator is considered as a test-bed to show the potential of these tools and to point out the difficulties and crucial issues that typically arise when attempting to establish an efficient neural network structure for sensitivity and uncertainty analyses.

6.
The phenomenon of aerodynamic instability caused by wind is usually a major design criterion for long-span cable-supported bridges. If the wind speed exceeds the critical flutter speed of the bridge, this constitutes an Ultimate Limit State. The prediction of the flutter boundary therefore requires accurate and robust models. The state-of-the-art theory for determining the flutter stability limit is presented. Bridge decks are usually bluff, so the aeroelastic forces under wind action have to be evaluated experimentally in wind tunnels or computed numerically through Computational Fluid Dynamics (CFD) simulations. The self-excited forces are modelled using aerodynamic derivatives obtained through CFD forced-vibration simulations on a section model. The two-degree-of-freedom flutter limit is computed by solving the eigenvalue problem. A probabilistic flutter analysis utilizing a meta-modelling technique is used to evaluate the effect of parameter uncertainty. A bridge section is numerically modelled in the CFD simulations, and the flutter derivatives are considered as random variables. A methodology for carrying out sensitivity analysis of the flutter phenomenon is developed. The sensitivity with respect to the uncertainty of flutter derivatives and structural parameters is considered by taking into account the probability distribution of the flutter limit. A significant influence on the flutter limit is found when uncertainties of the flutter derivatives due to different interpretations of scatter in the CFD simulations are included. The results indicate that the proposed probabilistic flutter analysis provides extended information concerning the accuracy of flutter-limit predictions. The final aim is to set up a method for estimating the flutter limit with probabilistic input parameters; such a tool could be useful for bridge engineers at early design stages. This study shows the difficulties that have to be overcome in this regard, but also highlights some interesting and promising results.

7.
The following techniques for uncertainty and sensitivity analysis are briefly summarized: Monte Carlo analysis, differential analysis, response surface methodology, Fourier amplitude sensitivity test, Sobol' variance decomposition, and fast probability integration. Desirable features of Monte Carlo analysis in conjunction with Latin hypercube sampling are described in discussions of the following topics: (i) properties of random, stratified and Latin hypercube sampling, (ii) comparisons of random and Latin hypercube sampling, (iii) operations involving Latin hypercube sampling (i.e. correlation control, reweighting of samples to incorporate changed distributions, replicated sampling to test reproducibility of results), (iv) uncertainty analysis (i.e. cumulative distribution functions, complementary cumulative distribution functions, box plots), (v) sensitivity analysis (i.e. scatterplots, regression analysis, correlation analysis, rank transformations, searches for nonrandom patterns), and (vi) analyses involving stochastic (i.e. aleatory) and subjective (i.e. epistemic) uncertainty.
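The stratification property of Latin hypercube sampling, and its variance-reduction benefit over simple random sampling, can be illustrated with a short sketch. The response function and sample sizes here are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def latin_hypercube(n, d, rng):
    """n samples in d dimensions: each margin places exactly one point
    in each of n equal-probability strata, in random order."""
    x = np.empty((n, d))
    for j in range(d):
        strata = (np.arange(n) + rng.uniform(size=n)) / n  # jitter within strata
        x[:, j] = rng.permutation(strata)
    return x

def f(x):                       # illustrative additive response
    return np.sum(x ** 2, axis=1)

# Spread of the mean estimate over 200 replicated designs of size 50
mc_means = [f(rng.uniform(size=(50, 3))).mean() for _ in range(200)]
lhs_means = [f(latin_hypercube(50, 3, rng)).mean() for _ in range(200)]
# For responses dominated by main effects, the LHS estimates cluster
# much more tightly around the true mean than simple random sampling.
```

The replicated designs here correspond to item (iii) in the abstract: rerunning the same analysis with independent samples is the basic check of reproducibility.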

8.
Variable screening and ranking using sampling-based sensitivity measures
This paper presents a methodology for screening out insignificant random variables and ranking the significant ones using sensitivity measures, including two cumulative distribution function (CDF)-based and two mean-response-based measures. The methodology features (1) using random samples to compute sensitivities and (2) using acceptance limits, derived from hypothesis testing, to classify random variables as significant or insignificant. Because no approximation is needed in either the form of the performance functions or the type of continuous distribution functions representing input variables, the sampling-based approach can handle highly nonlinear functions with non-normal variables. The main characteristics and effectiveness of the sampling-based sensitivity measures are investigated using both simple and complex examples. Because the number of samples needed does not depend on the number of variables, the methodology appears particularly suitable for problems with large, complex models that have many random variables but relatively few significant ones.
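A minimal sketch of the sampling-based screening idea, using sample correlations as the sensitivity measure and a normal-approximation acceptance limit from the test of the hypothesis of zero correlation. The performance function and its coefficients are invented for illustration, and this is not the paper's specific set of measures:

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 500, 8

X = rng.normal(size=(n, d))
# Hypothetical performance function: only the first three inputs matter.
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.3, size=n)

# Sampling-based sensitivity measure: correlation of each input with y
r = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(d)])

# Acceptance limit from the test of the hypothesis rho = 0
# (normal approximation, ~5% significance level)
limit = 1.96 / np.sqrt(n)
significant = np.abs(r) > limit

# Rank the surviving variables by the magnitude of their sensitivity
ranking = np.argsort(-np.abs(r))
```

Note that the sample size n, and hence the acceptance limit, is independent of the number of inputs d, which is the property the abstract highlights.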

9.
The measurement uncertainty in the gas-chromatographic determination of HCH (hexachlorocyclohexane) and DDT residues in meat products is analysed. The factors influencing the uncertainty, and the relationships among them, are identified, and a simple model for uncertainty evaluation is established so that the measurement uncertainty of this type of analysis can be evaluated in a practical, effective and rapid way. The individual uncertainty components are analysed and calculated, and the combined and expanded uncertainties are then computed. The uncertainty analysis model is simple, easy to understand and effective, and provides a useful reference for similar work.
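The combined and expanded uncertainty computation mentioned above follows the usual root-sum-of-squares rule for independent components. A minimal sketch with invented component values, not the paper's actual budget:

```python
import math

# Invented relative standard-uncertainty components for a GC residue
# analysis; the real budget in the paper will differ.
components = {
    "standard_solution":   0.012,
    "calibration_curve":   0.020,
    "sample_weighing":     0.004,
    "extraction_recovery": 0.025,
    "repeatability":       0.015,
}

# Combined standard uncertainty: root sum of squares of independent components
u_c = math.sqrt(sum(u ** 2 for u in components.values()))

# Expanded uncertainty with coverage factor k = 2 (~95 % coverage)
U = 2 * u_c
```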

10.
The analysis of many physical and engineering problems involves running complex computational models (simulation models, computer codes). With problems of this type, it is important to understand the relationships between the input variables (whose values are often imprecisely known) and the output. The goal of sensitivity analysis (SA) is to study this relationship and identify the most significant factors or variables affecting the results of the model. In this presentation, an improvement on existing methods for SA of complex computer models is described for use when the model is too computationally expensive for a standard Monte Carlo analysis. In these situations, a meta-model or surrogate model can be used to estimate the necessary sensitivity index for each input. A sensitivity index is a measure of the variance in the response that is due to the uncertainty in an input. Most existing approaches to this problem either do not work well with a large number of input variables and/or they ignore the error involved in estimating a sensitivity index. Here, a new approach to sensitivity index estimation using meta-models and bootstrap confidence intervals is described that provides solutions to these drawbacks. Further, an efficient yet effective approach to incorporate this methodology into an actual SA is presented. Several simulated and real examples illustrate the utility of this approach. This framework can be extended to uncertainty analysis as well.
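The bootstrap confidence interval idea can be sketched as follows, using a squared correlation obtained from a linear fit as a cheap stand-in for a meta-model-based first-order sensitivity index. The model and noise level are invented for illustration and do not reproduce the paper's estimator:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400
X = rng.uniform(size=(n, 3))
# Invented stand-in for an expensive simulator; input 0 dominates.
y = 4.0 * X[:, 0] + X[:, 1] + rng.normal(scale=0.2, size=n)

def first_order_index(x, y):
    """Squared standardized regression coefficient from a linear fit,
    a cheap stand-in for a meta-model-based sensitivity index."""
    b = np.polyfit(x, y, 1)[0]
    return (b * x.std() / y.std()) ** 2

# Percentile bootstrap confidence interval for the index of input 0
boot = []
for _ in range(1000):
    idx = rng.integers(0, n, size=n)
    boot.append(first_order_index(X[idx, 0], y[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
```

The width of [lo, hi] is exactly the estimation error that the abstract says most existing approaches ignore.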

11.
The potential danger posed to human health by pesticides and herbicides has been a growing national concern due to the increased frequency of agrochemical residues found in food and water. It is becoming critical to determine the concentration in all environmental media for a complete picture of potential human exposure. A multimedia transport model is used to determine the concentration of atrazine in surface water, ground water, surface soil, root-zone soil, plants, and air at a typical Midwestern location. A range of values is used for each model input, resulting in a distribution of possible concentrations in each medium. A sensitivity analysis determines the influence each parameter has on the outcome variance for each environmental medium concentration. The concentrations determined for ground and surface water are then compared to measured concentrations in the region to validate the model. A companion paper translates these concentrations into human exposure and risk.

12.
Estimates of failure rates for nuclear power plant piping systems are important inputs to Probabilistic Risk Assessments (PRA) and risk-informed applications of PRA. Such estimates are needed for initiating-event frequencies for Loss of Coolant Accidents and internal flooding events, and for risk-informed evaluations of piping system in-service inspection programs. A critical issue in the estimation of these parameters is the treatment of uncertainties, which can exceed an order of magnitude of deviation from failure-rate point estimates. Sources of uncertainty include failure-data reporting issues, scarcity of data, poorly characterized component populations, and uncertainties about the physical characteristics of the failure mechanisms and root causes. A methodology for quantifying these uncertainties using a Bayesian uncertainty analysis method was developed for the EPRI risk-informed in-service inspection program and significantly enhanced in subsequent applications. In parallel with these efforts, progress has been made in the development of pipe failure databases that contain the quantity and quality of information needed to support piping system reliability evaluations. Examples are used in this paper to identify technical issues with previously published estimates of pipe failure rates, and the numerical impacts of these issues on the pipe failure rates and rupture frequencies are quantified.
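A conjugate gamma-Poisson update is the simplest instance of the kind of Bayesian failure-rate estimation described above. The prior parameters and service data below are invented for illustration, not taken from the EPRI program:

```python
# Conjugate gamma-Poisson update for a pipe failure rate, illustrating
# how sparse data leaves a large residual spread around the estimate.
a0, b0 = 0.5, 1000.0                 # gamma prior: mean a0/b0 = 5.0e-4 per weld-year

failures, exposure = 2, 12000.0      # sparse observed service data (invented)

a1, b1 = a0 + failures, b0 + exposure  # posterior is gamma(a1, b1)

posterior_mean = a1 / b1             # updated failure-rate point estimate
prior_cv = a0 ** -0.5                # coefficient of variation of the prior
posterior_cv = a1 ** -0.5            # data narrows, but does not remove, the spread
```

Even after the update, the posterior coefficient of variation remains large, consistent with the abstract's observation that uncertainties can exceed an order of magnitude around point estimates.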

13.
Uncertainty and sensitivity analysis results obtained with random and Latin hypercube sampling are compared. The comparison uses results from a model for two-phase fluid flow obtained with three independent random samples of size 100 each and three independent Latin hypercube samples (LHSs) of size 100 each. Uncertainty and sensitivity analysis results with the two sampling procedures are similar and stable across the three replicated samples. Poor performance of regression-based sensitivity analysis procedures for some analysis outcomes results more from the inappropriateness of the procedure for the nonlinear relationships between model input and model results than from an inadequate sample size. Kendall's coefficient of concordance (KCC) and the top down coefficient of concordance (TDCC) are used to assess the stability of sensitivity analysis results across replicated samples, with the TDCC providing a more informative measure of analysis stability than KCC. A new sensitivity analysis procedure based on replicated samples and the TDCC is introduced.
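The top down coefficient of concordance can be sketched directly from its definition: convert each replicated ranking to Savage scores, then compute a concordance statistic across replicates. This is an illustrative implementation of the Iman-Conover form, not the paper's code:

```python
import numpy as np

def savage_scores(ranks):
    """Savage score of rank k (1 = most important) among n items:
    S_k = sum_{j=k}^{n} 1/j, which weights agreement at the top."""
    n = len(ranks)
    tail = np.cumsum(1.0 / np.arange(n, 0, -1))[::-1]  # tail[k-1] = S_k
    return tail[np.asarray(ranks) - 1]

def tdcc(rankings):
    """Top down coefficient of concordance across r replicated
    rankings of the same n variables."""
    R = np.asarray(rankings)
    r, n = R.shape
    S = np.array([savage_scores(row) for row in R]).sum(axis=0)
    harmonic = np.sum(1.0 / np.arange(1, n + 1))
    return (np.sum(S ** 2) - r ** 2 * n) / (r ** 2 * (n - harmonic))
```

Identical rankings give a value of 1; because Savage scores decay like a harmonic tail, disagreement about the top-ranked variables lowers the coefficient far more than disagreement about the unimportant tail, which is why the TDCC is more informative than KCC for sensitivity rankings.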

14.
While the secrecy of real water distribution system data is crucial, it poses difficulty for research, as results cannot be publicized. This data includes topological layouts of pipe networks, pump operation schedules, and water demands. A library of virtual water distribution systems can therefore be an important research tool for the comparative development of analytical methods. A virtual city, "Micropolis", has been developed, including a comprehensive water distribution system, as a first entry into such a library. This virtual city of 5000 residents is fully described in both geographic information systems (GIS) and EPANet hydraulic model frameworks. A risk classification scheme and Monte Carlo analysis are employed to study an attempted water-supply contamination attack. Model inputs considered include uncertainties in daily water demand, seasonal demand, initial storage tank levels, the time of day a contamination event is initiated, the duration of the contamination event, and the contaminant quantity. Findings show that reasonable uncertainties in model inputs produce high variability in exposure levels. It is also shown that exposure-level distributions are noticeably sensitive to population clusters within the contaminant spread area. High uncertainties in exposure patterns imply that greater resources are needed for effective mitigation strategies.

15.
In 2001, the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy (DOE), in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories), initiated development of a process designated quantification of margins and uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, "Quantification of Margins and Uncertainties: Conceptual and Computational Basis," describes the basic ideas that underlie QMU and illustrates them with two notional examples. The basic ideas and challenges that underlie NNSA's mandate for QMU are present, and have been successfully addressed, in a number of past analyses of complex systems. To provide perspective on the implementation of a requirement for QMU in the analysis of a complex system, three past analyses are presented as examples: (i) the probabilistic risk assessment carried out for the Surry Nuclear Power Station as part of the U.S. Nuclear Regulatory Commission's (NRC's) reassessment of the risk from commercial nuclear power in the United States (i.e., the NUREG-1150 study), (ii) the performance assessment for the Waste Isolation Pilot Plant carried out by the DOE in support of a successful compliance certification application to the U.S. Environmental Protection Agency, and (iii) the performance assessment for the proposed high-level radioactive waste repository at Yucca Mountain, Nevada, carried out by the DOE in support of a license application to the NRC. Each of these analyses involved a detailed treatment of uncertainty and produced results used to establish compliance with specific numerical requirements on the performance of the system under study. As a result, these studies illustrate the determination of both margins and the uncertainty in margins in real analyses.

16.
Asset managers in electricity distribution companies generally recognize the need for, and the challenge of, adding structure and a higher degree of formal analysis to increasingly complex asset management decisions. This implies improving present asset management practice by making the best use of available data and expert knowledge, by adopting new methods for risk analysis and decision support, and by finding better ways to document the decisions made. This paper discusses methods for integrating risk analysis and multi-criteria decision support under uncertainty in electricity distribution system asset management. The focus is on how to include the different company objectives and risk analyses in a structured decision framework when deciding how to handle the physical assets of the electricity distribution network. An illustrative example of decision support for maintenance and reinvestment strategies is presented, based on expert knowledge, simplified risk analyses and multi-criteria decision analysis under uncertainty.

17.
To address the problem that the measurement error of input resistance is often large, the underlying physical mechanism and practical causes are analysed. The main factors affecting the measurement uncertainty of input resistance are discussed, including the standard voltage, the measured voltage, and the value and error of the standard resistor. The principal measures for reducing the measurement uncertainty are given: the value of the standard resistor should be as close as possible to the input resistance (the two should be of the same order of magnitude), and the difference between the two excitation voltages should be as large as possible. A worked example of evaluating the measurement uncertainty of an input resistance measurement demonstrates that the procedure is correct and practicable.
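The series standard-resistor method is a common way to measure input resistance, and illustrates the kind of uncertainty budget discussed above. The circuit values below are invented, first-order (GUM) propagation is assumed, and this is a stand-in for, not a reproduction of, the paper's setup:

```python
import math

# Series standard-resistor method:  R_in = R_s * V_m / (V_s - V_m),
# where V_s is the source voltage and V_m the voltage at the input.
# All values and standard uncertainties below are invented.
R_s, u_Rs = 1.0e6, 1.0e3      # standard resistor and its uncertainty (ohm)
V_s, u_Vs = 10.0, 0.01        # source (excitation) voltage (V)
V_m, u_Vm = 5.0, 0.01         # voltage measured across the input (V)

R_in = R_s * V_m / (V_s - V_m)

# First-order (GUM) propagation through the partial derivatives
dR_dRs = V_m / (V_s - V_m)
dR_dVs = -R_s * V_m / (V_s - V_m) ** 2
dR_dVm = R_s * V_s / (V_s - V_m) ** 2

u_R = math.sqrt((dR_dRs * u_Rs) ** 2
                + (dR_dVs * u_Vs) ** 2
                + (dR_dVm * u_Vm) ** 2)
```

With R_s of the same order as R_in, the voltage difference V_s - V_m stays large and the sensitivity coefficients stay moderate, which is consistent with the abstract's recommendations for reducing the uncertainty.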

18.
Since 1998 bluetongue virus (BTV), which causes bluetongue, a non-contagious, insect-borne infectious disease of ruminants, has expanded northwards in Europe in an unprecedented series of incursions, suggesting that there is a risk to the large and valuable British livestock industry. The basic reproduction number, R(0), provides a powerful tool with which to assess the level of risk posed by a disease. In this paper, we compute R(0) for BTV in a population comprising two host species, cattle and sheep. Estimates for each parameter which influences R(0) were obtained from the published literature, using those applicable to the UK situation wherever possible. Moreover, explicit temperature dependence was included for those parameters for which it had been quantified. Uncertainty and sensitivity analyses based on Latin hypercube sampling and partial rank correlation coefficients identified temperature, the probability of transmission from host to vector and the vector to host ratio as being most important in determining the magnitude of R(0). The importance of temperature reflects the fact that it influences many processes involved in the transmission of BTV and, in particular, the biting rate, the extrinsic incubation period and the vector mortality rate.
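The partial rank correlation coefficient (PRCC) computation used in such analyses can be sketched as follows, with a toy stand-in for the R(0) expression. Plain random sampling is used instead of Latin hypercube sampling for brevity, and the parameter names, ranges and formula are invented for illustration, not the paper's BTV model:

```python
import numpy as np

rng = np.random.default_rng(4)

def prcc(X, y):
    """Partial rank correlation of each column of X with y: rank-transform,
    regress the other columns out of both x_i and y, correlate residuals."""
    Xr = np.argsort(np.argsort(X, axis=0), axis=0).astype(float)
    yr = np.argsort(np.argsort(y)).astype(float)
    n, d = Xr.shape
    out = []
    for i in range(d):
        Z = np.column_stack([np.ones(n), np.delete(Xr, i, axis=1)])
        rx = Xr[:, i] - Z @ np.linalg.lstsq(Z, Xr[:, i], rcond=None)[0]
        ry = yr - Z @ np.linalg.lstsq(Z, yr, rcond=None)[0]
        out.append(np.corrcoef(rx, ry)[0, 1])
    return np.array(out)

# Toy stand-in for R(0) of a vector-borne disease: R0 = m * b^2 / (mu * r)
n = 1000
m = rng.uniform(1, 10, n)       # vector-to-host ratio
b = rng.uniform(0.1, 0.9, n)    # transmission probability per bite
mu = rng.uniform(0.05, 0.2, n)  # vector mortality rate
r = rng.uniform(0.05, 0.2, n)   # host recovery rate
R0 = m * b ** 2 / (mu * r)

vals = prcc(np.column_stack([m, b, mu, r]), R0)
# m and b push R0 up (positive PRCC); mu and r push it down (negative).
```

Because PRCC works on ranks and removes the linear effect of the other inputs, it remains informative for monotone but nonlinear relationships such as the b-squared term here.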

19.
Over the last two decades, growing interest in risk analysis has been noted in industry. The ARAMIS project has defined a methodology for risk assessment, built to help industrial operators demonstrate that they have sufficient risk control on their sites.

Risk analysis consists first in identifying all the major accidents, assuming that the safety functions in place are ineffective. This identification step uses bow-tie diagrams. Secondly, the safety barriers actually implemented on the site are taken into account. The barriers are identified on the bow-ties, and an evaluation of their performance (response time, efficiency, and level of confidence) is carried out to validate that they are adequate for the expected safety function. Finally, evaluating their probability of failure makes it possible to assess the frequency of occurrence of each accident. Risk control can thus be demonstrated for every accident scenario on the basis of a severity/frequency pair.

During the risk analysis, a practical tool called a risk graph is used to assess whether the number and reliability of the safety functions for a given cause are sufficient to achieve good risk control.


20.
Currently, comparison between countries in terms of their road safety performance is widely conducted in order to better understand one's own safety situation and to learn from the best-performing countries by setting practical targets and formulating action programmes. In this respect, crash data such as the number of road fatalities and casualties are mostly investigated. However, the absolute numbers are not directly comparable between countries. Therefore, the concept of risk, defined as the ratio of road safety outcomes to some measure of exposure (e.g., the population size, the number of registered vehicles, or distance travelled), is often used in the context of benchmarking. Nevertheless, these risk indicators are not consistent in most cases. In other words, countries may have different evaluation results or ranking positions depending on the exposure measure used. In this study, data envelopment analysis (DEA) is investigated as a performance measurement technique to provide an overall perspective on a country's road safety situation, and further to assess whether the road safety outcomes registered in a country correspond to the numbers that can be expected based on its level of exposure. In doing so, three model extensions are considered: the DEA-based road safety model (DEA-RS), the cross-efficiency method, and the categorical DEA model. Using the measures of exposure to risk as the model's input and the number of road fatalities as output, an overall road safety efficiency score is computed for the 27 European Union (EU) countries based on the DEA-RS model, and the ranking of countries in accordance with their cross-efficiency scores is evaluated. Furthermore, after applying clustering analysis to group countries with inherent similarity in their practices, the categorical DEA-RS model is adopted to identify best-performing and underperforming countries in each cluster, as well as the reference sets or benchmarks for the underperforming ones. More importantly, the extent to which each reference set could be learned from is specified, and practical yet challenging targets are given for each underperforming country, which enables policymakers to recognize the gap with the best-performing countries and further develop their own road safety policy.
