Similar articles
20 similar articles found (search time: 281 ms)
1.
This paper discusses the application and results of global sensitivity analysis techniques for probabilistic safety assessment (PSA) models, and their comparison to importance measures. This comparison allows one to understand whether PSA elements that are important to the risk, as revealed by importance measures, are also important contributors to the model uncertainty, as revealed by global sensitivity analysis. We show that, due to epistemic dependence, uncertainty and global sensitivity analysis of PSA models must be performed at the parameter level. A difficulty arises, since standard codes produce the calculations at the basic event level. We discuss both the indirect comparison through importance measures computed for basic events, and the direct comparison performed using the differential importance measure and the Fussell–Vesely importance at the parameter level. Results are discussed for the large LOCA (LLOCA) sequence of the Advanced Test Reactor PSA.

2.
Recent work [Epstein S, Rauzy A. Can we trust PRA? Reliab Eng Syst Safety 2005;88:195–205] has questioned the validity of the traditional fault tree/event tree (FTET) representation of probabilistic risk assessment problems. Regardless of whether the risk model is solved through FTET or binary decision diagrams (BDDs), importance measures need to be calculated to provide risk managers with information on the risk/safety significance of system structures and components (SSCs). In this work, we discuss the computation of the Fussell–Vesely (FV), criticality, Birnbaum, risk achievement worth (RAW) and differential importance measure (DIM) for individual basic events, basic event groups and components. For individual basic events, we show that these importance measures are linked by simple relations and that this enables basic event DIMs to be computed with both FTET and BDD codes without additional model runs. We then investigate whether and how importance measures can be extended to basic event groups and components. Findings show that the estimation of a group Birnbaum or criticality importance is not possible. On the other hand, we show that the DIM of a group or of a component is exactly equal to the sum of the DIMs of the corresponding basic events and can therefore be found with no additional model runs. The above findings hold for both the FTET and the BDD methods.
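The single-event relations mentioned above can be sketched numerically. Below is a minimal illustration on an invented three-event fault tree (TOP = (A AND B) OR C, independent basic events; the structure and probabilities are assumptions, not from the paper): because the risk is linear in each p_i, the Fussell–Vesely measure 1 − R(p_i = 0)/R coincides with the criticality B_i·p_i/R, and RAW can be written in terms of the Birnbaum importance.

```python
from itertools import product

# Hypothetical example (not from the paper): TOP = (A and B) or C.
def top(a, b, c):
    return (a and b) or c

p = {"A": 0.1, "B": 0.2, "C": 0.05}

def risk(probs):
    """Exact top-event probability by enumerating all basic-event states."""
    total = 0.0
    for states in product([0, 1], repeat=3):
        a, b, c = states
        if top(a, b, c):
            w = 1.0
            for name, s in zip(("A", "B", "C"), states):
                w *= probs[name] if s else 1.0 - probs[name]
            total += w
    return total

R = risk(p)

def with_value(name, value):
    q = dict(p)
    q[name] = value
    return risk(q)

# Birnbaum: B_i = R(p_i = 1) - R(p_i = 0)
birnbaum = {i: with_value(i, 1.0) - with_value(i, 0.0) for i in p}
# Fussell-Vesely: FV_i = 1 - R(p_i = 0) / R
fv = {i: 1.0 - with_value(i, 0.0) / R for i in p}
# Risk achievement worth: RAW_i = R(p_i = 1) / R
raw = {i: with_value(i, 1.0) / R for i in p}
# Criticality: CR_i = B_i * p_i / R
crit = {i: birnbaum[i] * p[i] / R for i in p}
```

Since R is multilinear in the p_i, FV_i equals CR_i exactly and RAW_i = 1 + B_i(1 − p_i)/R, which is one concrete instance of the "simple relations" the abstract refers to.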

3.
In the current quantification of fire probabilistic risk assessment (PRA), when components are damaged by a fire, the basic event values of the components become 'true' or one (1), which removes the basic events related to the components from the minimal cut sets and makes it difficult to calculate accurate component importance measures. A new method to accurately calculate an importance measure such as Fussell–Vesely in fire PRA is introduced in this paper. A new quantification algorithm for the fire PRA model is also proposed to support the new calculation method. The effectiveness of the new method is illustrated with an example evaluating the importance of cables.

4.
Existing measures of the risk significance of elements of risk models (such as the Fussell–Vesely, or 'F–V', importance of basic events) are based on the properties of cut sets containing the element. A measure of safety significance, termed prevention worth, is proposed, based on the properties of path sets containing the element. A high value of F–V means that cut sets containing the element contribute significantly to top event frequency; a high value of prevention worth means that path sets containing the element contribute significantly to top event prevention. The properties of prevention worth as a measure of basic event significance are illustrated first with a simple block diagram example, and then with an example based on nuclear power plant risk models. Prevention worth can also be understood as a property of a set of success scenarios and, as such, can be applied more broadly than just as a measure of element significance.
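The cut set / path set contrast can be illustrated with a toy calculation. This is a simplified, state-based reading of the idea (not the paper's exact definition of prevention worth), and the block diagram and failure probabilities are invented:

```python
from itertools import product

# Hypothetical block diagram (not the paper's example): the system works
# if (A and B) works or C works; components fail independently.
p_fail = {"A": 0.3, "B": 0.2, "C": 0.1}

def system_works(up):
    return (up["A"] and up["B"]) or up["C"]

# Enumerate all component states once, with their probabilities.
states = []
for bits in product([0, 1], repeat=3):
    up = dict(zip("ABC", bits))
    w = 1.0
    for name in "ABC":
        w *= (1 - p_fail[name]) if up[name] else p_fail[name]
    states.append((up, w))

P_fail = sum(w for up, w in states if not system_works(up))
P_success = 1.0 - P_fail

# F-V-style failure significance of X: share of failure probability coming
# from states in which X is failed (a proxy for "cut sets containing X").
def fv_like(x):
    return sum(w for up, w in states
               if not system_works(up) and not up[x]) / P_fail

# Prevention-worth-style success significance of X: share of success
# probability from states in which X works ("path sets containing X").
def pw_like(x):
    return sum(w for up, w in states
               if system_works(up) and up[x]) / P_success
```

In this toy system C alone is a path set, so every failure state has C failed and its F-V-style score is 1, while its prevention-worth-style score reflects only the success scenarios that actually rely on it.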

5.
In a fault tree analysis, an uncertainty importance measure is used to identify those basic events that significantly contribute to the uncertainty of the top-event probability. This paper defines an uncertainty importance measure of a basic event or of a group of basic events, and develops a two-stage procedure for experimentally evaluating the measure under the assumption that the probability of each basic event follows a lognormal distribution. The proposed method utilizes the Taguchi tolerance design technique with modifications. So-called contribution ratios, which evaluate the main and/or interaction effects of the uncertainties of the log-transformed basic-event probabilities on the uncertainty of the log-transformed top-event probability, are then calculated. The contribution ratios are used to estimate the defined uncertainty importance measure of a basic event or of a group of basic events. The method consists of two stages for computational efficiency. In the first stage, the basic events with negligible effects on the uncertainty of the log-transformed top-event probability are screened out; more detailed analyses are then conducted in the second stage with a substantially smaller number of basic events. In addition, this paper presents an analysis method to quantify the percentage reduction in the uncertainty of the log-transformed top-event probability when the uncertainty of each basic-event probability is reduced.
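The notion of contribution ratios can be made concrete in a deliberately simple special case. Assuming, purely for illustration (this is not the paper's Taguchi-based procedure), that the top event is a single minimal cut set with independent lognormal basic-event probabilities, the log-variances add, and each event's contribution ratio is simply its share of the total log-variance:

```python
# Illustrative special case: T = p1 * p2 * p3, each p_i lognormal, so
# log T = log p1 + log p2 + log p3 and Var(log T) is the sum of the
# individual log-variances. All numbers are invented.
sigma2 = {"p1": 0.25, "p2": 0.09, "p3": 0.16}  # Var(log p_i)

var_log_top = sum(sigma2.values())
# Contribution ratio of event i: its share of the top-event log-variance.
contribution_ratio = {i: v / var_log_top for i, v in sigma2.items()}
```

In the general case (multiple, overlapping cut sets) the ratios must be estimated experimentally, which is what the paper's two-stage design addresses.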

6.
A new importance measure for risk-informed decision making
In this paper, we introduce a new importance measure, the differential importance measure (DIM), for probabilistic safety assessment (PSA). DIM responds to the analyst's/decision maker's need for information about the importance of proposed changes that affect component properties and multiple basic events. DIM is directly applicable to both the basic events and the parameters of the PSA model. Unlike the Fussell–Vesely (FV), risk achievement worth (RAW), Birnbaum, and criticality importance measures, DIM is additive, i.e. the DIM of a group of basic events or parameters is the sum of the individual DIMs. We discuss the difference between DIM and other local sensitivity measures that are based on normalized partial derivatives. An example is used to demonstrate the evaluation of DIM at both the basic event and the parameter level. To compare the results obtained with DIM at the parameter level, an extension of the definitions of FV and RAW is necessary. We discuss possible extensions and compare the results of the three measures for a more realistic example.
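DIM's additivity can be sketched in a few lines. Under the uniform-change assumption (often called H1), DIM_i reduces to the Birnbaum importance B_i normalized by the sum over all basic events; the fault tree and numbers below are invented for illustration:

```python
# Hypothetical sketch of DIM under uniform parameter changes (H1), where
# DIM_i = B_i / sum_j B_j. TOP = (A and B) or C, independent basic events;
# all probabilities are invented.
p = {"A": 0.1, "B": 0.2, "C": 0.05}

def risk(q):
    # Exact top-event probability for TOP = (A and B) or C.
    return 1.0 - (1.0 - q["A"] * q["B"]) * (1.0 - q["C"])

def birnbaum(i):
    # B_i = R(p_i = 1) - R(p_i = 0)
    return risk(dict(p, **{i: 1.0})) - risk(dict(p, **{i: 0.0}))

B = {i: birnbaum(i) for i in p}
total = sum(B.values())
dim_h1 = {i: B[i] / total for i in p}

# Additivity: the DIM of a group is the sum of the individual DIMs.
dim_group_AB = dim_h1["A"] + dim_h1["B"]
```

By construction the DIMs of all basic events sum to one, so the group value and the remaining individual values always account for the whole model.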

7.
Work zones, especially long-term work zones, increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is important for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate the casualty risk, combining the frequency and consequence of all accident scenarios triggered by long-term work zone crashes. The casualty risk is measured by the individual risk and the societal risk. The individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, and the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of an estimate of work zone crash frequency, an event tree, and consequence estimation models. There are seven intermediate events – age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S) – in the event tree. Since the estimated probability of some intermediate events may have large uncertainty, that uncertainty is characterized by a random variable. The consequence estimation model takes into account the combined effects of speed and emergency medical service response time (ERT) on the consequence of a work zone crash. Finally, a numerical example based on Southeast Michigan work zone crash data is carried out. The numerical results show a 62% decrease in individual fatality risk and a 44% reduction in individual injury risk if the mean travel speed is slowed by 20%, and a 5% reduction in individual fatality risk and a 0.05% reduction in individual injury risk if ERT is reduced by 20%. In other words, slowing down speed is more effective than reducing ERT in mitigating casualty risk.
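The frequency-times-consequence structure of such a model can be sketched as a toy event-tree computation (all rates and probabilities below are invented, not the Michigan data):

```python
# Toy event-tree sketch of individual risk: crash frequency times the
# scenario-weighted probability of a fatal outcome. Numbers are invented.
crash_freq = 12.0  # assumed work-zone crashes per year

scenarios = [
    # (P(scenario | crash), P(fatality | scenario))
    (0.70, 0.001),   # low speed, fast EMS response
    (0.25, 0.010),   # high speed or delayed EMS response
    (0.05, 0.050),   # high speed and delayed EMS response
]

p_fatal_given_crash = sum(ps * pf for ps, pf in scenarios)
individual_fatality_risk = crash_freq * p_fatal_given_crash
```

Reducing mean speed or EMS response time shifts probability mass from the severe scenarios to the mild ones, which is how the paper's 20% interventions translate into the reported risk reductions.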

8.
This paper presents a vital area identification method based on current probabilistic safety assessment (PSA) techniques. The vital area identification method in this paper is focused on core melt rather than radioactive material release. Furthermore, it describes a conceptual framework with which the risk from sabotage-induced events could be assessed. Location minimal cut sets (MCSs) are evaluated after developing a core melt location fault tree (LFT). An LFT is a fault tree whose basic events are sabotage-induced damage to the locations in which various safety-related components are housed. The core melt LFT is constructed by combining all sequence LFTs of the various event trees with OR gates. Each sequence LFT is constructed by combining the initiating event LFT and the mitigating event LFTs with an AND gate. The vital areas can be identified by applying location importance measures to the core melt location MCSs. An application was made to a typical 1000 MWe pressurized water reactor plant located on the Korean seashore. The methodology suggested here is consistent and complete in identifying the vital areas of a nuclear power plant because it is based on well-proven PSA technology.

9.
Component importance measures play an important role in system reliability analysis. They are used to identify the weakest parts of the system for design improvement, failure diagnosis and maintenance. This paper deals with the problem of determining the importance measures of basic events in the unreliability analysis of binary coherent and non-coherent fault trees. This type of analysis is typical of catastrophic top events, characterised by unacceptable consequences. Since the unreliability of systems with repairable components cannot be exactly calculated via a fault tree, the Expected Number of Failures – obtained by integrating the unconditional failure frequency – is considered, as it represents a good upper bound. In these cases it is important to classify events as initiators or enablers, since their roles in the system are different, their sequences of occurrence are different, and consequently they must be treated differently. New equations based on the system failure frequency are described in this paper for determining the exact importance measures of initiating and enabling events. Simple examples are provided to clarify the application of the proposed calculation methods. Compared with the exact methods available in the literature, those proposed in this paper are easier to apply by hand and simpler to implement in a fault tree analyser.
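The Expected Number of Failures mentioned above is the time integral of the unconditional failure frequency. A minimal numeric sketch for a single repairable component with assumed constant failure and repair rates (this is a textbook special case, not an example from the paper):

```python
import math

# Illustrative rates (invented): for one repairable component with failure
# rate lam and repair rate mu, the unavailability is
#   q(t) = lam/(lam+mu) * (1 - exp(-(lam+mu) t))
# and the unconditional failure frequency is w(t) = lam * (1 - q(t)).
lam, mu, T = 1e-3, 1e-1, 1000.0

def q(t):
    return lam / (lam + mu) * (1.0 - math.exp(-(lam + mu) * t))

def w(t):
    return lam * (1.0 - q(t))

# Expected Number of Failures over [0, T]: ENF = integral of w(t) dt,
# evaluated here with the trapezoidal rule.
n = 10_000
h = T / n
enf = h * (0.5 * w(0.0) + sum(w(k * h) for k in range(1, n)) + 0.5 * w(T))
```

Since w(t) ≤ lam always, ENF ≤ lam·T, and ENF upper-bounds the probability of at least one failure in [0, T], which is the sense in which the abstract calls it a good upper bound on unreliability.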

10.
Steam generator tube ruptures (SGTRs) at pressurized water reactors are identified as one of the risk-significant events in probabilistic risk assessment studies. In addition, operating experience indicates that SGTRs can result in complex transients, some of which involved additional anomalies such as delayed recognition of the event, resulting in delayed isolation of the ruptured SG and/or failure to equalize the primary and secondary pressures in time. In order to identify the risk-significant anomalies and to obtain generic insights useful for examining alternative mitigation measures for SGTR, the present study systematically analyzed ten actual and one potential SGTR events using a consistent accident sequence precursor model. The analysis results show that an SGTR event involving delayed identification of the tube rupture or failure to depressurize the reactor in time could have a relatively high possibility of leading to core damage and would be a significant precursor, though the models and failure probabilities applied might be further examined. This implies the importance of improving both the capability to detect SGTRs and the operating procedures. It is also shown that some of the other anomalies observed would contribute largely to the possibility of core damage, which points to the need for examining alternative measures for recovering from such conditions.

11.
王浩, 何中其, 朱益. 《爆破器材》 (Explosive Materials), 2019, 48(4): 60–64
Vacuum drying is a key step in the production of ammonium nitrate explosives and one that is prone to deflagration accidents. To study the causes and mechanisms of deflagration accidents during the vacuum drying of ammonium nitrate explosives, the basic events leading to such accidents and their logical relationships were determined through accident case analysis and field investigation, and a fault tree was constructed with the deflagration accident as the top event. Simplifying the fault tree with Boolean algebra yielded 87 minimal cut sets and 9 minimal path sets; each minimal path set contains a relatively large number of basic events, indicating that the vacuum drying process has low inherent safety. By calculating and ranking the structural importance of each basic event, the events with the largest structural importance were identified, from which the main basic events leading to deflagration accidents were inferred. Targeted improvement measures and recommendations were then proposed as a reference for safe production.

12.
We construct a model for living probabilistic safety assessment (PSA) by applying the general framework of marked point processes. The framework provides a theoretically rigorous approach for considering risk follow-up of posterior hazards. In risk follow-up, the hazard of core damage is evaluated synthetically at time points in the past, by using some observed events as logged history and combining it with re-evaluated potential hazards. There are several alternatives for doing this, of which we consider three here, calling them the initiating event approach, the hazard rate approach, and the safety system approach. In addition, for comparison, we consider a core damage hazard arising in risk monitoring. Each of these four definitions draws attention to a particular aspect of risk assessment, and this is reflected in the behaviour of the consequent risk importance measures. Several alternative measures are again considered. The concepts and definitions are illustrated by a numerical example.

13.
Investigations have shown that the consequences of fires in nuclear power plants can be significant. Methodologies for considering fire in probabilistic safety analyses have evolved over the last few years. In order to provide a basis for further discussion of the benefits and limits of such an analysis in Germany, current methods are investigated. As a result, a qualitative screening process is proposed to identify critical fire zones, followed by a quantitative event tree analysis in which the fire-caused frequency of initiating events and different core damage states are determined. The models and data proposed for a probabilistic fire risk analysis have been successfully applied in complete and partial fire risk assessments of German nuclear power plants.

14.
A fault tree diagnosis methodology is presented which can locate the actual MCS (minimal cut set) in the system in a minimum number of inspections. An entropy function is defined to estimate the information uncertainty at a given stage of diagnosis and is chosen as the objective function to be minimized. The inspection that provides maximal information should be chosen, because it minimizes the information uncertainty and will, on average, lead to the discovery of the actual MCS in a minimum number of subsequent inspections. The result reveals that, contrary to what is suggested by traditional diagnosis methodology based on probabilistic importance, inspection of the basic event whose Fussell–Vesely importance is nearest to 0.5 best distinguishes the MCSs.
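The "Fussell–Vesely nearest 0.5" rule can be reproduced on a toy diagnosis problem. Assuming, for illustration only, that the posterior probabilities of each MCS being the actual cause are known, the information gained by inspecting a basic event is the binary entropy of its posterior failure probability, which peaks at 0.5:

```python
import math

# Invented example: given that the top event occurred, assume known
# posterior probabilities that each minimal cut set is the actual cause.
mcs_posterior = {("A", "B"): 0.45, ("A", "C"): 0.15, ("D",): 0.40}

def fv_given_top(event):
    """Posterior probability that `event` failed: sum over MCSs containing it."""
    return sum(p for cut, p in mcs_posterior.items() if event in cut)

def binary_entropy(p):
    # Information (in bits) revealed by a yes/no inspection with P(yes) = p.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

events = {e for cut in mcs_posterior for e in cut}
entropy = {e: binary_entropy(fv_given_top(e)) for e in events}
best = max(entropy, key=entropy.get)
```

Here event B has posterior failure probability 0.45, the value closest to 0.5 among all events, so inspecting B yields the most information, exactly as the abstract's rule predicts.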

15.
This paper outlines the quantitative risk assessment of the storage and purification sections of a titanium sponge production facility. Based on the qualitative HAZAN technique, which involves a detailed FETI and HAZOP study of the entire plant, the storage and purification sections were found to be the most hazardous. Titanium tetrachloride (TiCl4) is the major reactant used in this plant. TiCl4 is a toxic, corrosive, water-reactive chemical, and on spillage from containment it creates a liquid pool that can either boil or evaporate, leading to the evolution of toxic hydrogen chloride (HCl). The fault tree analysis technique has been used to identify the basic events responsible for the top event and to calculate their probabilities. Consequence analysis of the probable scenarios has been carried out, and the risk has been estimated in terms of fatalities and injuries. These results form the basic inputs for risk management decisions.

16.
Fault tree analysis (FTA) is based on constructing a hypothetical tree of base events (initiating events) branching into numerous sub-events, propagating the fault and eventually leading to the top event (accident). It has traditionally been a powerful technique for identifying hazards in nuclear installations and the power industry. As the systematic articulation of the fault tree involves assigning probabilities to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also cumbersome and costly, which limits its area of application. We have developed a new algorithm based on analytical simulation (named AS-II), which makes the application of FTA simpler, quicker, and cheaper, thus opening up the possibility of its wider use in risk assessment in the chemical process industries. Based on this methodology we have developed a computer-automated tool. The details are presented in this paper.

17.
A new uncertainty importance measure
Uncertainty in parameters is present in many risk assessment problems and leads to uncertainty in model predictions. In this work, we introduce a global sensitivity indicator that looks at the influence of input uncertainty on the entire output distribution without reference to a specific moment of the output (moment independence) and that can also be defined in the presence of correlations among the parameters. We discuss its mathematical properties and highlight the differences between the present indicator, variance-based uncertainty importance measures, and a moment-independent sensitivity indicator previously introduced in the literature. Numerical results are discussed with application to the probabilistic risk assessment model on which Iman [A matrix-based approach to uncertainty and sensitivity analysis for fault trees. Risk Anal 1987;7(1):22–33] first introduced uncertainty importance measures.

18.
A major difficulty in the fault tree analysis of the wheelhead system of a CNC grinding machine is determining the occurrence probabilities of the basic events: the occurrence of these events is inherently fuzzy, and time and cost constraints often make it impossible to obtain sufficient reliability data experimentally. To address this, fuzzy set theory is introduced, and trapezoidal fuzzy numbers are used to describe the occurrence probabilities of the basic events and the top event in the fault tree analysis. The structural hierarchy of the wheelhead system is first analyzed and its fault tree constructed. Taking abnormal spindle vibration and noise in the wheelhead system as an example, a fuzzy fault tree analysis is then performed to obtain the trapezoidal fuzzy number for the top-event probability; by analogy with the 'criticality importance' of conventional fault tree analysis, a 'fuzzy criticality importance' suitable for fuzzy fault tree analysis is defined. Finally, the basic events are ranked by their computed fuzzy criticality importance to identify those with the greatest potential harm; the results agree with the enterprise's actual experience. The results show that this method effectively solves the problem that basic events in CNC grinding machine fault tree analysis are difficult to assign accurate values, and it provides enterprises with a quantitative basis for improving the reliability of mechanical systems.

19.
This paper describes a practical method to accurately quantify the top event probability and importance measures from incomplete minimal cut sets (MCSs) of a large fault tree. The MCS-based fault tree method is extensively used in probabilistic safety assessments. Several sources of uncertainty exist in MCS-based fault tree analysis; this paper focuses on quantifying two of them: (1) the truncation that neglects low-probability cut sets, and (2) the approximation made in quantifying the MCSs. The method proposed in this paper is based on a Monte Carlo simulation technique to estimate the probability of the discarded MCSs, and on the sum of disjoint products (SDP) approach complemented by the correction factor approach (CFA). The method provides the capability to accurately quantify these two uncertainties and to estimate the top event probability and importance measures of large coherent fault trees. The proposed fault tree quantification method has been implemented in the CUTREE code package and is tested on two example fault trees.
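The quantification approximations discussed above can be compared on a tiny invented example: the rare-event approximation and the min-cut upper bound (MCUB) both overestimate the exact top event probability whenever cut sets share basic events. Here the exact value is obtained by brute-force enumeration, which is feasible only for very small trees (hence the need for methods like SDP on realistic models):

```python
from itertools import product

# Invented example: quantify a top event from its minimal cut sets.
p = {"A": 0.1, "B": 0.2, "C": 0.05, "D": 0.15}
cuts = [("A", "B"), ("A", "C"), ("D",)]

def cut_prob(cut):
    prob = 1.0
    for e in cut:
        prob *= p[e]
    return prob

# Rare-event approximation: sum of cut set probabilities.
rare_event = sum(cut_prob(c) for c in cuts)

# Min-cut upper bound: 1 - product of (1 - P(cut)).
mcub = 1.0
for c in cuts:
    mcub *= 1.0 - cut_prob(c)
mcub = 1.0 - mcub

# Exact top event probability by enumerating all basic-event states.
def exact():
    total = 0.0
    names = sorted(p)
    for bits in product([0, 1], repeat=len(names)):
        state = dict(zip(names, bits))
        if any(all(state[e] for e in c) for c in cuts):
            w = 1.0
            for name in names:
                w *= p[name] if state[name] else 1.0 - p[name]
            total += w
    return total

exact_prob = exact()
```

For coherent trees with independent basic events, exact ≤ MCUB ≤ rare-event, so both approximations are conservative; the gap they leave is one of the quantification uncertainties the paper's SDP-plus-CFA method is designed to close.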

20.
Fault tree analysis is a method widely used in probabilistic risk assessment. Uncertainties should be properly handled in fault tree analyses to support robust decision making. While many sources of uncertainty are considered, dependence uncertainties are much less explored. Such uncertainties can be labeled 'epistemic' because of the way dependence is modeled. In practice, besides probability theory, alternative mathematical structures for the representation of epistemic uncertainty can be used, including possibility theory and fuzzy set theory. In this article, a fuzzy β factor is used to represent the failure dependence uncertainties among basic events. The relationship between the β factor and the system failure probability is analyzed to support the use of a hybrid probabilistic–possibilistic approach. As a result, a complete hybrid probabilistic–possibilistic framework is constructed. A case study of a high integrity pressure protection system is discussed. The results show that the proposed method provides decision makers with a more accurate understanding of the system under analysis when failure dependencies are involved. Copyright © 2015 John Wiley & Sons, Ltd.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号