Similar documents
Found 20 similar documents (search time: 29 ms)
1.
It is often said that the aim of risk assessment is to faithfully represent and report the knowledge of some defined experts in the field studied. The analysts' job is to elicit this knowledge, synthesise it and report the results as integrated uncertainty assessments, for example, expressed through a set of probability distributions. Analysts' judgements beyond these tasks should not be incorporated in the uncertainty assessments (distributions). The purpose of the present paper is to discuss the rationale of this perspective. To conduct a risk assessment in practice the analysts need to make a number of judgements related to, for example, the choice of methods and models that to a large extent influence the results. And often the analysts are the real experts on many of the issues addressed in the assessments, in particular, when it comes to understanding how various phenomena and processes interact. Would it then not be more appropriate to fully acknowledge the role of the analysts as uncertainty assessors and probability assigners, and see the results of the risk assessments as their judgements based on input from the experts? The discussion is illustrated by two examples.

2.
3.
Failure mode and effects analysis (FMEA) is an effective tool for assessing the risk of a system or process in an uncertain environment. However, how to handle the uncertainty in subjective assessments is an open issue. In this paper, a novel method to deal with the uncertainty arising from the subjective assessments of FMEA experts is proposed in the framework of Dempster–Shafer evidence theory. First, the uncertain degree of each assessment is measured by the ambiguity measure. Then, the uncertainty is transformed into the reliability of each FMEA expert and the relative importance of each risk factor. After that, the assessments from the FMEA team are fused with a discounting-based combination rule to address potential conflict. Moreover, to avoid the situation in which failure modes with different risk priorities receive the same ranking under the classical risk priority number (RPN) method, the gray relational projection method (GRPM) is adopted for ranking the risk priorities of failure modes. Finally, an application of the improved FMEA model to a sheet steel production process verifies the reliability and validity of the proposed method.
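The ranking problem this paper addresses can be made concrete with the classical risk priority number, which is the product of three 1-10 ratings. A minimal sketch (the failure modes and ratings below are hypothetical, not from the paper's case study) shows how very different failure modes collapse onto the same RPN:

```python
# Classical RPN used in FMEA: product of severity, occurrence and detection
# ratings, each on a 1-10 scale. Illustrates why ties arise.

def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Classical risk priority number: product of the three 1-10 ratings."""
    for r in (severity, occurrence, detection):
        if not 1 <= r <= 10:
            raise ValueError("ratings must be on the 1-10 scale")
    return severity * occurrence * detection

# Two hypothetical failure modes with very different risk profiles:
fm_a = rpn(severity=9, occurrence=2, detection=4)  # safety-critical but rare
fm_b = rpn(severity=2, occurrence=9, detection=4)  # minor but frequent

print(fm_a, fm_b)  # both 72: identical RPN, hence the ties GRPM is meant to break
```

Because multiplication discards which factor drove the score, a safety-critical rare failure and a minor frequent one can be indistinguishable, which is the motivation the abstract gives for switching to GRPM.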

4.
Equivalent and effective dose are protection quantities defined by the International Commission on Radiological Protection (ICRP). They are frequently referred to simply as dose and may be misused. They provide a method for the summation of doses received from external sources and from intakes of radionuclides for comparison with dose limits and constraints, set to limit the risk of cancer and hereditary effects. For the assessment of internal doses, ICRP provides dose coefficients (Sv Bq⁻¹) for the ingestion or inhalation of radionuclides by workers and members of the public, including children. Dose coefficients have also been calculated for in utero exposures following maternal intakes and for the transfer of radionuclides in breast milk. In each case, values are given of committed equivalent doses to organs and tissues and committed effective dose. Their calculation involves the use of defined biokinetic and dosimetric models, including the use of reference phantoms representing the human body. Radiation weighting factors are used as a simple representation of the different effectiveness of different radiations in causing stochastic effects at low doses. A single set of tissue weighting factors is used to take account of the contribution of individual organs and tissues to overall detriment from cancer and hereditary effects, despite age- and gender-related differences in estimates of risk and contributions to risk. The results are quantities that are not individual specific but are reference values for protection purposes, relating to doses to phantoms. The ICRP protection quantities are not intended for detailed assessments of dose and risk to individuals. They should not be used in epidemiological analyses or the assessment of the possibility of occurrence and severity of tissue reactions (deterministic effects) at higher doses. Dose coefficients are published as reference values and as such have no associated uncertainty. Assessments of uncertainties may be appropriate in specific analyses of doses and risks and in epidemiological studies.
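The arithmetic behind applying dose coefficients is a simple sum of intake times coefficient over radionuclides and pathways. A back-of-the-envelope sketch (the intakes and coefficient values below are illustrative placeholders, not reference ICRP values):

```python
# Committed effective dose = intake (Bq) x dose coefficient (Sv/Bq),
# summed over all radionuclide/pathway entries.

def committed_effective_dose(intakes_bq, dose_coefficients_sv_per_bq):
    """Sum intake * coefficient over all radionuclide/pathway entries."""
    return sum(intakes_bq[n] * dose_coefficients_sv_per_bq[n] for n in intakes_bq)

intakes = {"Cs-137 ingestion": 1.0e3, "I-131 inhalation": 5.0e2}   # Bq (hypothetical)
coeffs = {"Cs-137 ingestion": 1.3e-8, "I-131 inhalation": 7.4e-9}  # Sv/Bq (illustrative)

dose_sv = committed_effective_dose(intakes, coeffs)
print(f"{dose_sv:.2e} Sv")
```

The coefficients already fold in the biokinetic models, radiation weighting and tissue weighting the abstract describes, which is why the user-facing calculation reduces to this one product-and-sum.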

5.
New requirements and regulations have increased the pressure on companies to provide information on their products. This is challenging for small- and medium-sized enterprises (SMEs) since they lack both expertise and resources. In this paper, the possibilities for developing environmental product declarations (EPDs) with the use of data-assistant tools are explored. A case study of furniture production in Norway is used as an example. A database with specific environmental data for materials used in furniture has been developed. The database is used to conduct life cycle assessments (LCAs) for selected products and is the backbone of a data-assistance tool used to design and present the EPDs. Five key performance indicators (KPIs) are selected. The database and these KPIs ensure standardised assessment of products, enabling both comparison of existing products and assessment of the environmental performance of redesigned products and potential new products. This paper shows how this enables SMEs both to provide environmental performance information to stakeholders and to identify possible improvements with limited resources and competence in environmental performance and LCA.

6.
Risk analysis is a tool for investigating and reducing uncertainty related to outcomes of future activities. Probabilities are key elements in risk analysis, but confusion about interpretation and use of probabilities often weakens the message from the analyses. Under the predictive, epistemic approach to risk analysis, probabilities are used to express uncertainty related to future values of observable quantities like the number of fatalities or monetary loss in a period of time. The procedure for quantifying this uncertainty in terms of probabilities is, however, not obvious. Examples of topics from the literature relevant in this discussion are use of expert judgement, the effect of so-called heuristics and biases, application of historical data, dependency and updating of probabilities. The purpose of this paper is to discuss and give guidelines on how to quantify uncertainty in the perspective of these topics. Emphasis is on the use of models and assessment of uncertainties of similar quantities.
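One common way to make "updating of probabilities" with historical data concrete is conjugate Bayesian updating: a Beta prior on an event frequency, revised by observed counts. The prior parameters and data below are hypothetical, and this is only one of several updating schemes consistent with the epistemic approach described:

```python
# Beta-binomial updating: expert judgement as a Beta prior, revised by
# historical event counts into a posterior.

def beta_update(alpha: float, beta: float, events: int, trials: int):
    """Posterior Beta parameters after observing `events` in `trials`."""
    return alpha + events, beta + (trials - events)

# Expert judgement encoded as a Beta(1, 99) prior: mean 0.01 per period.
alpha0, beta0 = 1.0, 99.0
# Historical data: 2 events observed in 50 periods.
alpha1, beta1 = beta_update(alpha0, beta0, events=2, trials=50)

posterior_mean = alpha1 / (alpha1 + beta1)
print(round(posterior_mean, 4))  # prior mean 0.01 pulled toward the data rate 2/50
```

The posterior mean (0.02) sits between the expert's prior (0.01) and the observed frequency (0.04), which is the kind of expert-judgement/data compromise the guidelines in the paper concern.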

7.
The quality evaluation and assessment of radiological data is the final step in the overall environmental data decision process. This quality evaluation and assessment process is performed outside of the laboratory, and generally the radiochemist is not involved. However, with the laboratory quality management systems in place today, the data packages of radiochemical analyses are frequently much more complex than the project/program manager can effectively handle, and with little involvement from radiochemists in this process, the potential for misinterpretation of radiological data is increasing. The quality evaluation and assessment of radiochemistry data consists of making three decisions for each sample and result, remembering that the laboratory reports all the data for each analysis as well as the uncertainty in each of these analyses. Therefore, at the data evaluation and assessment stage, the decisions are: (1) is the radionuclide of concern detected (each data point always has a number associated with it)? (2) is the uncertainty associated with the result greater than would normally be expected? and (3) if the laboratory rejected the analysis, are there serious consequences for other samples in the same group? The need for the radiochemist's expertise in this process is clear. Quality evaluation and assessment requires the input of the radiochemist, particularly in radiochemistry, because of the lack of redundancy in the analytical data. This paper describes the role of the radiochemist in the quality assessment of radiochemical data for environmental decision making.
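The first two review decisions can be sketched as simple rules applied to a reported result and its uncertainty. The rule forms below (compare the result with k times its counting uncertainty, and flag an unusually large relative uncertainty) follow common radiochemistry practice, but the factor k = 1.645 and the 10% relative-uncertainty threshold are illustrative assumptions, not values from the paper:

```python
# Hedged sketch of decisions (1) and (2) for a single reported result with
# its 1-sigma uncertainty. Thresholds are illustrative assumptions.

def review_result(activity, sigma, expected_rel_sigma=0.10, k=1.645):
    detected = activity > k * sigma                        # decision 1: detected?
    rel_sigma = sigma / abs(activity) if activity else float("inf")
    unusual_uncertainty = rel_sigma > expected_rel_sigma   # decision 2: too uncertain?
    return {"detected": detected, "unusual_uncertainty": unusual_uncertainty}

print(review_result(activity=5.2, sigma=0.4))  # well above k*sigma, modest uncertainty
print(review_result(activity=0.3, sigma=0.4))  # below the detection threshold
```

Decision (3), on the consequences of a rejected analysis for the rest of the sample group, is a judgement call that resists automation, which is exactly where the paper argues the radiochemist's expertise is needed.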

8.
Modeling uncertainty during risk assessment is a vital component of effective decision making. Unfortunately, most risk assessment studies lack an adequate uncertainty analysis. The development of tools and techniques for capturing uncertainty in risk assessment is ongoing, and there has been substantial growth in this respect in health risk assessment. In this study, cross-disciplinary approaches to uncertainty analysis are identified, and a modified approach suitable for industrial safety risk assessment is proposed using fuzzy set theory and Monte Carlo simulation. The proposed method is applied to a benzene extraction unit (BEU) of a chemical plant. The case study results show that the proposed method provides a better measure of uncertainty than existing methods: unlike traditional risk analysis, it takes both the variability and the uncertainty of information into account in the risk calculation, and instead of a single risk value it provides an interval of risk values for a given percentile of risk. The implications of these results in terms of risk control and regulatory compliance are also discussed.
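The idea of reporting an interval for a given percentile, rather than a point value, can be sketched with a two-loop Monte Carlo: an outer loop samples the epistemically uncertain parameter, an inner loop samples the aleatory variability. All distributions and parameters below are hypothetical stand-ins, not values from the BEU case study:

```python
# Two-loop Monte Carlo: the outer loop draws an uncertain event frequency
# (epistemic), the inner loop draws variable consequences (aleatory), and the
# result is an interval of 95th-percentile risk values rather than one number.

import random

random.seed(1)

def simulate_risk(freq_low, freq_high, n_outer=200, n_inner=1000):
    p95_values = []
    for _ in range(n_outer):
        freq = random.uniform(freq_low, freq_high)         # epistemic uncertainty
        losses = sorted(random.expovariate(1.0 / 10.0) * freq
                        for _ in range(n_inner))           # aleatory variability
        p95_values.append(losses[int(0.95 * n_inner)])     # 95th-percentile risk
    return min(p95_values), max(p95_values)                # interval, not a point

low, high = simulate_risk(0.01, 0.05)
print(low < high)  # the 95th-percentile risk is itself a range
```

Separating the two loops is what lets the analysis distinguish variability from lack of knowledge, the distinction the abstract credits for the improved uncertainty measure.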

9.
The use of risk assessment in the nuclear industry began in the 1970s as a complementary approach to the deterministic methods used to assess the safety of nuclear facilities. As experience with the theory and application of probabilistic methods has grown, so too has its application. In the last decade, the use of probabilistic safety assessment has become commonplace for all phases of the life of a plant, including siting, design, construction, operation and decommissioning. In the particular case of operation of a plant, the use of a ‘living’ safety case or probabilistic safety assessment, building upon operational experience, is becoming more widespread, both as an operational tool and as a basis for communication with the regulator. In the case of deciding upon a site for a proposed reactor, use is also being made of probabilistic methods in defining the effect of design parameters. Going hand in hand with this increased use of risk-based methods has been the development of assessment criteria against which to judge the results being obtained from the risk analyses. This paper reviews the use of risk assessment in the light of the need for acceptability criteria and shows how these tools are applied in the Australian nuclear industry, with specific reference to the probabilistic safety assessment (PSA) performed for HIFAR.

10.
Integrating human health and ecological concerns in risk assessments
The interconnections between ecosystems, human health and welfare have been increasingly recognized by the US government, academia, and the public. This paper continues this theme by addressing the use of risk assessment to integrate human health and ecological concerns into a single assessment. In a broad overview of the risk assessment process, we stress the need to build a conceptual model of the whole system, including multiple species (humans and other ecological entities), stressors, and cumulative effects. We also propose converging landscape ecology and the evaluation of ecosystem services with risk assessment to address these cumulative responses. We first look at how this integration can occur within the problem formulation step of risk assessment, where the system is defined, a conceptual model is created, a subset of components and functions is selected, and the analytical framework is decided in a context that includes the management decisions. A variety of examples of problem formulations (salmon, wild insects, hyporheic ecosystems, ultraviolet (UV) radiation, nitrogen fertilization, toxic chemicals, and oil spills) are presented to illustrate how treating humans as components of the landscape can add value to risk assessments. We conclude that the risk assessment process should help address the urgent needs of society in proportion to their importance, provide a format to communicate knowledge and understanding, and inform policy and management decisions.

11.
Transboundary impact assessment (TIA) has become an important environmental management tool, particularly where a project may have transboundary impacts. With the growing practice of TIA, it becomes important to consider the accuracy of the transboundary impact assessments that are being conducted. If TIA is a planning tool designed to provide a basis for making an informed decision, does it actually provide the necessary information? This paper summarizes lessons learned in pilot-testing a methodology to assess the accuracy of TIAs.

12.
This paper reviews the use of offsets in the Western Australian (WA) environmental impact assessment (EIA) process. First, an overview and analysis of offsets as a policy tool in EIA is provided, noting that it has its origins in the practice of ecological restoration and uses the principle of no net loss (NNL). Second, the implementation of WA offset policy is discussed, noting the emergence of two new types of offsets in response to the uncertainty associated with some major resource projects. The first type is a ‘residual risk’ offset, which is provided in recognition of the uncertain risks associated with the proposal. The second type is a ‘banked’ offset, which is called upon only in the event that negative environmental impacts occur. Finally, it is proposed that these ‘offsets for uncertainty’ could be used more broadly in EIA where there is significant uncertainty of impacts, provided the residual risk is considered acceptable.

13.
In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. This presentation discusses and illustrates the conceptual and computational basis of QMU in analyses that use computational models to predict the behavior of complex systems. The following topics are considered: (i) the role of aleatory and epistemic uncertainty in QMU, (ii) the representation of uncertainty with probability, (iii) the probabilistic representation of uncertainty in QMU analyses involving only epistemic uncertainty, and (iv) the probabilistic representation of uncertainty in QMU analyses involving aleatory and epistemic uncertainty.
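The basic QMU bookkeeping is often summarised as a margin-to-uncertainty comparison: the margin M is the distance between a performance requirement and the best-estimate response, and U aggregates the uncertainty in both. A minimal sketch with hypothetical numbers (the abstract does not give this formula explicitly, but the M/U ratio is the standard shorthand for the concept):

```python
# Margin-to-uncertainty ratio: M/U > 1 indicates the margin between the
# requirement and the predicted response exceeds its estimated uncertainty.

def confidence_ratio(requirement, best_estimate, uncertainty):
    """Return margin / uncertainty for a single performance requirement."""
    margin = requirement - best_estimate
    return margin / uncertainty

# Hypothetical: requirement 100 units, predicted response 70, uncertainty 20.
ratio = confidence_ratio(requirement=100.0, best_estimate=70.0, uncertainty=20.0)
print(ratio)  # 1.5
```

The subtlety the paper dwells on is what U should mean when it mixes aleatory and epistemic contributions, which is why topics (i) through (iv) separate the two kinds of uncertainty.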

14.
Growing concern about the risk of major chemical accidents in the USA has led both government and industry to find new ways to identify and evaluate potential hazards. Among the most promising (and misunderstood) approaches is a collection of techniques called quantitative risk assessment (QRA). Adapted primarily from probabilistic risk assessment approaches developed in other industries, the use of QRA is spreading rapidly through the US chemical industry. Of equal importance, legislators and regulatory agencies at the state and federal level are embracing QRA as part of their proposals for mandatory accident prevention measures.

The Chemical Manufacturers Association (CMA) and its member companies recognized the need to provide management personnel with a guide to QRA. Chemical process industry (CPI) managers need criteria for determining when risk assessment will provide information that will aid their decision making. Executives need help in understanding and evaluating QRA results that are often inscrutable to nonexperts, and CPI managers need advice concerning how detailed an analysis must be if it is to provide adequate information for a specific decision.

JBF Associates, Inc., assisted by the Process Safety Analysis Task Group of CMA, prepared A Manager's Guide to Quantitative Risk Assessment (Arendt, J.S. et al., CMA, 1989). This paper gives an overview of the Guide and discusses important implications concerning the increasing acceptance of QRA as a chemical regulatory tool.

15.
Collaborative robots are an emerging technology falling within the scope of Industry 4.0 and based on the concept of Human-Robot Collaboration (HRC). Unlike traditional industrial robots, collaborative robots are used in shared workspaces with no safety fences. Hence, prospective hazardous contacts need to be avoided or mitigated through a risk assessment. Normative standards such as ISO TS 15066 suggest a list of common hazards but do not guide the robot system user through the risk assessment process. To address this shortcoming, this paper proposes a practical eight-step risk assessment approach, resulting in a risk priority list. In order to provide an accurate, practical, quantitative and supportive tool for HRC environments, the Failure Mode and Effects Analysis (FMEA) and the Proportional Risk Assessment Technique (PRAT) are proposed for risk assessment. These two techniques are combined in the suggested new methodology, highlighting both their benefits and disadvantages. The proposed methodology is applied with positive results to a collaborative brick-lifter case study.

16.
In quantitative risk analysis (QRA), risk is quantified using probabilities and expected values, for example expressed through PLL values, FAR values, IR values and FN curves. The calculations are tedious and include a strong element of arbitrariness. The value added by the quantification can certainly be questioned. In this paper, we argue that such analyses are often better replaced by semi-quantitative analyses, highlighting assessments of hazards and barriers, risk influencing factors (RIFs) and safety improvement measures. The assessments will be based on supporting information produced by risk analysts, including hard data and analyses of failure causes and mechanisms, barrier performance, scenario development, etc. The approach acknowledges that risk cannot be adequately described and evaluated simply by reference to summarising probabilities and expected values. There is a need to see beyond the standard probabilistic risk results of a QRA. Key aspects to include relate to uncertainties in phenomena and processes, and to manageability factors. Such aspects are often ignored in standard QRAs.
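Two of the summary measures named above are straightforward to compute once a scenario set is fixed: PLL (Potential Loss of Life) is the expected number of fatalities per year, and FAR (Fatal Accident Rate) normalises this per 10⁸ exposed hours. A sketch with a hypothetical scenario set and workforce (not from the paper):

```python
# PLL and FAR from a hypothetical set of accident scenarios,
# each given as (frequency per year, number of fatalities).

scenarios = [
    (1e-3, 1),    # frequent, single fatality
    (1e-4, 10),   # rarer, larger consequence
    (1e-5, 100),  # rare, major accident
]

# Potential Loss of Life: expected fatalities per year.
pll = sum(freq * fatalities for freq, fatalities in scenarios)

# Fatal Accident Rate: fatalities per 1e8 exposed hours, here for a
# hypothetical workforce of 50 people exposed 8760 hours a year each.
exposed_hours = 50 * 8760
far = pll * 1e8 / exposed_hours

print(round(pll, 4), round(far, 2))
```

Note how the three very different scenarios contribute identically to PLL; collapsing them into one expected value is precisely the information loss the paper's semi-quantitative alternative is meant to avoid.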

17.
Safety assessment based on conventional tools (e.g. probabilistic risk assessment (PRA)) may not be well suited to dealing with systems having a high level of uncertainty, particularly in the feasibility and concept design stages of a maritime or offshore system. By contrast, a safety model using a fuzzy logic approach employing fuzzy IF–THEN rules can model the qualitative aspects of human knowledge and reasoning processes without employing precise quantitative analyses. A fuzzy-logic-based approach may therefore be more appropriate for carrying out risk analysis in the initial design stages. It provides a tool for working directly with the linguistic terms commonly used in carrying out safety assessment. This research focuses on the development and representation of linguistic variables to model risk levels subjectively. These variables are then quantified using fuzzy sets. In this paper, the development of a safety model using a fuzzy logic approach for modelling various design variables for maritime and offshore safety-based decision making in the concept design stage is presented. An example is used to illustrate the proposed approach.
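The linguistic-variable idea can be sketched in a few lines: a risk level expressed in words ("low", "medium", "high") is quantified by membership in triangular fuzzy sets. The membership functions and the 0-10 scale below are illustrative assumptions, not those of the paper:

```python
# Triangular fuzzy sets quantifying linguistic risk terms on a 0-10 scale.

def triangular(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x < a or x > c:
        return 0.0
    if x == b:
        return 1.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

risk_terms = {                 # hypothetical linguistic terms (feet, peak, feet)
    "low":    (-5.0, 0.0, 5.0),
    "medium": (0.0, 5.0, 10.0),
    "high":   (5.0, 10.0, 15.0),
}

x = 6.0  # a crisp risk estimate from a design variable
memberships = {term: round(triangular(x, *abc), 2) for term, abc in risk_terms.items()}
print(memberships)  # partial membership in "medium" and "high" simultaneously
```

A crisp value of 6 is simultaneously 0.8 "medium" and 0.2 "high", which is how IF-THEN rules phrased in expert language can fire to a degree rather than all-or-nothing.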

18.
Uncertainty quantification and risk assessment in the optimal design of structural systems has always been a critical consideration for engineers. When new technologies are developed or implemented and budgets are limited for full-scale testing, the result is insufficient datasets for construction of probability distributions. Making assumptions about these probability distributions can potentially introduce more uncertainty to the system than it quantifies. Evidence theory represents a method to handle epistemic uncertainty that represents a lack of knowledge or information in the numerical optimization process. Therefore, it is a natural tool to use for uncertainty quantification and risk assessment especially in the optimization design cycle for future aerospace structures where new technologies are being applied. For evidence theory to be recognized as a useful tool, it must be efficiently applied in a robust design optimization scheme. This article demonstrates a new method for projecting the reliability gradient, based on the measures of belief and plausibility, without gathering any excess information other than what is required to determine these measures. This represents a huge saving in computational time over other methods available in the current literature. The technique developed in this article is demonstrated with three optimization examples.
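The belief and plausibility measures underlying the reliability gradient are defined over a basic probability assignment: belief sums the masses of focal elements entirely inside the event, plausibility sums those that merely intersect it. A sketch with a hypothetical assignment over intervals of a design response (the masses and intervals are invented for illustration):

```python
# Belief and plausibility from a basic probability assignment (BPA) over
# intervals, for the event "response falls in (0, 2)".

bpa = {                 # focal interval -> mass (hypothetical)
    (0.0, 1.0): 0.4,
    (1.0, 2.0): 0.3,
    (1.5, 3.0): 0.2,
    (0.0, 3.0): 0.1,
}

event = (0.0, 2.0)

def subset(focal, ev):
    return focal[0] >= ev[0] and focal[1] <= ev[1]

def intersects(focal, ev):
    return focal[0] < ev[1] and focal[1] > ev[0]

belief = sum(m for f, m in bpa.items() if subset(f, event))            # lower bound
plausibility = sum(m for f, m in bpa.items() if intersects(f, event))  # upper bound

print(round(belief, 2), round(plausibility, 2))  # 0.7 1.0
```

The gap between belief and plausibility is the epistemic slack the article exploits: with sparse data the analysis commits only to the interval [Bel, Pl] rather than a single probability.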

19.
In 2001, the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy (DOE), in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories), initiated development of a process designated quantification of margins and uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, “Quantification of Margins and Uncertainties: Conceptual and Computational Basis,” describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples. The basic ideas and challenges that underlie NNSA's mandate for QMU are present, and have been successfully addressed, in a number of past analyses of complex systems. To provide perspective on the implementation of a requirement for QMU in the analysis of a complex system, three past analyses are presented as examples: (i) the probabilistic risk assessment carried out for the Surry Nuclear Power Station as part of the U.S. Nuclear Regulatory Commission's (NRC's) reassessment of the risk from commercial nuclear power in the United States (i.e., the NUREG-1150 study), (ii) the performance assessment for the Waste Isolation Pilot Plant carried out by the DOE in support of a successful compliance certification application to the U.S. Environmental Protection Agency, and (iii) the performance assessment for the proposed high-level radioactive waste repository at Yucca Mountain, Nevada, carried out by the DOE in support of a license application to the NRC. Each of the preceding analyses involved a detailed treatment of uncertainty and produced results used to establish compliance with specific numerical requirements on the performance of the system under study. As a result, these studies illustrate the determination of both margins and the uncertainty in margins in real analyses.

20.
Information about present and anticipated bridge reliabilities, in conjunction with decision models, provides a rational and powerful decision-making tool for the structural assessment of bridges. For assessment purposes, an updated reliability (after an inspection) may be used for comparative or relative risk purposes. This may include the prioritisation of risk management measures (risk ranking) for inspection, maintenance, repair or replacement. A life-cycle cost analysis may also be used to quantify the expected cost of a decision. The present paper presents a broad overview of the concepts, methodology and immediate applications of risk-based assessments of bridges. In particular, two practical applications of reliability-based bridge assessment are considered: risk ranking and life-cycle cost analysis.
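The two applications named above can be sketched with hypothetical numbers: risk ranking orders bridges by updated failure probability times consequence, and a (here deliberately simplified, undiscounted) expected life-cycle cost prices a maintenance decision. Bridge names, probabilities and costs are invented for illustration:

```python
# (1) Risk ranking and (2) expected life-cycle cost, with hypothetical data.

bridges = {  # name -> (annual failure probability after inspection, failure cost)
    "bridge A": (1e-4, 5e7),
    "bridge B": (5e-4, 2e7),
    "bridge C": (2e-5, 2e8),
}

# (1) Risk ranking: expected annual loss, highest first.
ranking = sorted(bridges, key=lambda b: bridges[b][0] * bridges[b][1], reverse=True)
print(ranking)

# (2) Expected life-cycle cost of a decision: upfront cost plus expected
# failure losses over the remaining life (discounting omitted for brevity).
def expected_lcc(upfront, p_fail_annual, failure_cost, years):
    return upfront + p_fail_annual * failure_cost * years

print(expected_lcc(upfront=1e6, p_fail_annual=1e-4, failure_cost=5e7, years=30))
```

The ranking shows why probability alone is a poor priority signal: the bridge with the highest failure probability need not carry the highest expected loss once consequences are included.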
