Similar documents
20 similar documents found.
1.
Recent works [Epstein S, Rauzy A. Can we trust PRA? Reliab Eng Syst Safety 2005; 88:195–205] have questioned the validity of the traditional fault tree/event tree (FTET) representation of probabilistic risk assessment problems. Regardless of whether the risk model is solved through FTET or binary decision diagrams (BDDs), importance measures need to be calculated to provide risk managers with information on the risk/safety significance of system structures and components (SSCs). In this work, we discuss the computation of the Fussell–Vesely (FV), criticality, Birnbaum, risk achievement worth (RAW) and differential importance measure (DIM) for individual basic events, basic event groups and components. For individual basic events, we show that these importance measures are linked by simple relations and that this enables computing basic event DIMs both for FTET and BDD codes without additional model runs. We then investigate whether and how importance measures can be extended to basic event groups and components. Findings show that the estimation of a group Birnbaum or criticality importance is not possible. On the other hand, we show that the DIM of a group or of a component is exactly equal to the sum of the DIMs of the corresponding basic events and can therefore be found with no additional model runs. The above findings hold for both the FTET and the BDD methods.
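The simple relations among these measures can be illustrated numerically. The sketch below uses a hypothetical three-component fault tree (top event = (A AND B) OR C), not any model from the cited works; because a coherent fault tree's top-event probability is multilinear in the basic-event probabilities, criticality coincides with FV, RAW follows from Birnbaum, and the DIMs (here under proportional parameter changes) sum to one.

```python
# Hypothetical 3-component fault tree: top = (A AND B) OR C.
# p[i] is the failure probability of basic event i.
def top(p):
    a, b, c = p
    return 1.0 - (1.0 - a * b) * (1.0 - c)

def conditioned(p, i, value):
    q = list(p)
    q[i] = value
    return top(q)

p = [0.1, 0.2, 0.05]
R = top(p)
n = len(p)

# Birnbaum: B_i = R(p_i = 1) - R(p_i = 0)
B = [conditioned(p, i, 1.0) - conditioned(p, i, 0.0) for i in range(n)]
# Fussell-Vesely: FV_i = (R - R(p_i = 0)) / R
FV = [(R - conditioned(p, i, 0.0)) / R for i in range(n)]
# Risk achievement worth: RAW_i = R(p_i = 1) / R
RAW = [conditioned(p, i, 1.0) / R for i in range(n)]
# Criticality: CR_i = B_i * p_i / R
CR = [B[i] * p[i] / R for i in range(n)]
# DIM under proportional parameter changes: DIM_i = B_i p_i / sum_j B_j p_j
s = sum(B[i] * p[i] for i in range(n))
DIM = [B[i] * p[i] / s for i in range(n)]
```

Since R = R(p_i = 0) + p_i B_i holds for multilinear risk functions, CR_i = FV_i and RAW_i = 1 + B_i(1 - p_i)/R exactly, so a code reporting any one of these measures can derive the others without extra model runs, which is the point made in the abstract.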

2.
This paper discusses application and results of global sensitivity analysis techniques to probabilistic safety assessment (PSA) models, and their comparison to importance measures. This comparison allows one to understand whether PSA elements that are important to the risk, as revealed by importance measures, are also important contributors to the model uncertainty, as revealed by global sensitivity analysis. We show that, due to epistemic dependence, uncertainty and global sensitivity analysis of PSA models must be performed at the parameter level. A difficulty arises, since standard codes produce the calculations at the basic event level. We discuss both the indirect comparison through importance measures computed for basic events, and the direct comparison performed using the differential importance measure and the Fussell–Vesely importance at the parameter level. Results are discussed for the large LLOCA sequence of the advanced test reactor PSA.

3.
A limitation of the importance measures (IMs) currently used in reliability and risk analyses is that they rank only individual components or basic events, and are not directly applicable to combinations or groups of components or basic events. To partially overcome this limitation, the differential importance measure (DIM) was recently introduced for use in risk-informed decision making. The DIM is a first-order sensitivity measure that ranks the parameters of the risk model according to the fraction of the total change in the risk that is due to a small change in the parameters' values, taken one at a time. However, it does not account for the effects of interactions among components. In this paper, a second-order extension of the DIM, named DIMII, is proposed to account for the interactions of pairs of components when evaluating the change in system performance due to changes in the reliability parameters of the components. A numerical application is presented in which the informative contents of DIM and DIMII are compared. The results confirm that, in certain cases, when second-order interactions among components are accounted for, the importance ranking of the components may differ from that produced by a first-order sensitivity measure.
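The role of the interaction term can be made concrete. In the sketch below (a hypothetical 2-out-of-3 system, not the paper's case study), the mixed partial ∂²R/∂p_i∂p_j is obtained exactly by double conditioning; for a multilinear risk function, the first-order term alone misses the interaction contribution, while first plus second order reproduces the exact risk change when two parameters vary. The perturbation size is an illustrative choice.

```python
# Hypothetical 2-out-of-3 system failure probability (multilinear in p).
def top(p):
    a, b, c = p
    return a * b + a * c + b * c - 2.0 * a * b * c

def fix(p, assignments):
    q = list(p)
    for i, v in assignments.items():
        q[i] = v
    return top(q)

p = [0.1, 0.1, 0.3]
R = top(p)

def birnbaum(i):
    # first-order sensitivity: dR/dp_i = R(1_i) - R(0_i)
    return fix(p, {i: 1.0}) - fix(p, {i: 0.0})

def joint(i, j):
    # mixed partial d2R/dp_i dp_j via double conditioning
    return (fix(p, {i: 1.0, j: 1.0}) - fix(p, {i: 1.0, j: 0.0})
            - fix(p, {i: 0.0, j: 1.0}) + fix(p, {i: 0.0, j: 0.0}))

# perturb components 0 and 2 by +50% each
da, dc = 0.5 * p[0], 0.5 * p[2]
first_order = birnbaum(0) * da + birnbaum(2) * dc
second_order = first_order + joint(0, 2) * da * dc
exact = fix(p, {0: p[0] + da, 2: p[2] + dc}) - R
```

Here `first_order` underestimates `exact` by the interaction term `joint(0, 2) * da * dc`, while `second_order` matches it; a DIMII-style measure normalises such pairwise contributions to rank pairs of components.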

4.
This paper deals with the use of importance measures (IMs) for the risk-informed optimization of system design and operation. It builds on previous work by the authors in which IMs are incorporated in the formulation of a genetic algorithm (GA) multi-objective optimization problem to drive the design towards a solution which is 'balanced' in the importance values of the components. This allows designing systems that are optimal from the point of view of economics and safety, without excessively low- or unnecessarily high-performing components. Different definitions of IMs quantify the risk- or safety-significance of components according to specific views of their role in the system: depending on the optimization problem at hand (e.g. system design optimization and/or maintenance strategy optimization), the use of one IM definition as a balancing criterion may be more appropriate than another. In this regard, a comparison of the Fussell-Vesely (FV), Birnbaum (B) and risk achievement worth (RAW) IMs is performed with respect to their appropriateness for the optimization of test/maintenance intervals. The RAW is found inappropriate for the purpose, since this measure relates to the defense of the system against the failure of components, which is independent of how often the component is tested. Instead, the use of the FV or B measures allows allocating test/maintenance activities according to the importance of the components they relate to, in agreement with the risk-informed philosophy of avoiding unnecessary regulatory burdens and defining more efficient inspection and maintenance activities.

5.
In the current quantification of fire probabilistic risk assessment (PRA), when components are damaged by a fire, the basic event values of the components become 'true' or one (1), which removes the basic events related to the components from the minimal cut sets, and which makes it difficult to calculate accurate component importance measures. A new method to accurately calculate an importance measure such as Fussell-Vesely in fire PRA is introduced in this paper. Also, a new quantification algorithm in the fire PRA model is proposed to support the new calculation method of the importance measures. The effectiveness of the new method in finding the importance measures is illustrated with an example of evaluating cables' importance.

6.
A truncation process aims to determine which of the minimal cut sets (MCS) produced by a probabilistic safety assessment (PSA) model are significant. Several truncation processes have been proposed for evaluating the probability of core damage while ensuring a fixed accuracy level. However, the evaluation of new risk indicators such as importance measures requires re-examining the truncation process to ensure that the produced estimates are accurate enough. In this paper a new truncation process is developed that permits estimating, from a single set of MCS, the importance measure of any basic event with the desired accuracy level. The main contribution of this new method is an MCS-wise truncation criterion involving two thresholds: an absolute threshold and a new relative threshold on the potential probability of the MCS of interest. The method has been tested on a complete level 1 PSA model of a 900 MWe NPP developed by Électricité de France (EDF), and the results presented in this paper indicate that, to reach the same accuracy level, the proposed method produces a significantly smaller set of MCS.
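The two-threshold idea can be sketched as follows: an MCS survives if its probability clears an absolute cutoff, or if its "potential" probability, taken here as its probability recomputed with any single basic event conditioned to 1 (the quantity that matters when that event's importance is later estimated), clears a relative cutoff. The MCS list, event names and threshold values below are invented for illustration and are not EDF's model or the paper's exact criterion.

```python
from math import prod

# Hypothetical minimal cut sets: {basic event: probability}
mcs_list = [
    {"pump_a": 1e-3, "valve_b": 2e-2},
    {"pump_a": 1e-3, "diesel_c": 5e-5},
    {"valve_b": 2e-2, "diesel_c": 5e-5, "bus_d": 1e-4},
    {"x1": 1e-6, "x2": 1e-6, "x3": 1e-6},
]

ABS_T = 1e-9   # absolute threshold on the MCS probability
REL_T = 1e-6   # threshold on the potential (conditioned) probability

def probability(mcs, set_to_one=None):
    # product of event probabilities, optionally with one event set to 1
    return prod(1.0 if e == set_to_one else q for e, q in mcs.items())

def keep(mcs):
    if probability(mcs) >= ABS_T:
        return True
    # potential probability: the largest value the MCS reaches when any
    # single basic event is conditioned to 1 (certain failure)
    return max(probability(mcs, set_to_one=e) for e in mcs) >= REL_T

kept = [m for m in mcs_list if keep(m)]
```

The third MCS falls below the absolute cutoff (1e-10) but is retained because conditioning `valve_b` to 1 lifts it to 2e-6; the fourth is discarded under both tests, which is how the criterion keeps the MCS set small while protecting importance estimates.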

7.
A risk-informed safety significance categorization (RISSC) categorizes the structures, systems, or components (SSCs) of a nuclear power plant (NPP) into two or more groups according to their safety significance, using both probabilistic and deterministic insights. In the conventional methods for RISSC, the SSCs are quantitatively categorized according to their importance measures in an initial categorization. The final decisions (categorizations) on the SSCs, however, are made qualitatively by an expert panel through discussion and adjustment of opinions, using the probabilistic insights compiled in the initial categorization process and combining them with the deterministic insights. Owing to this qualitative and linear decision-making process, the conventional methods have the following demerits: (1) they are very costly in terms of time and labor; (2) it is not easy to reach a final decision when the opinions of the experts are in conflict; and (3) they contain an overlapping process due to the linear paradigm (the categorization is performed twice: first by the engineers who propose the method, and second by the expert panel). In this work, a method for RISSC using the analytic hierarchy process (AHP) and Bayesian belief networks (BBN) is proposed to overcome these demerits and to arrive effectively at a final decision (or categorization). By using the AHP and BBN, the expert panel takes part in the early stage of the categorization (that is, the quantification process), and the safety significance based on both probabilistic and deterministic insights is quantified. According to that safety significance, the SSCs are quantitatively categorized into three categories: high safety significant (Hi), potentially safety significant (Po), or low safety significant (Lo).
The proposed method was applied to components such as CC-V073, CV-V530, and SI-V644 in Ulchin Unit 3 NPP in South Korea. The expert panel consisted of two probabilistic safety assessment (PSA) experts and one system design expert. Before categorizing the components, the design basis functions, simplified P&IDs, and the Fussell-Vesely (FV) importance and risk achievement worth (RAW) from the PSA were prepared for the experts' evaluations. By using this method, we could categorize the components quantitatively, on the basis of the experts' knowledge and experience, at an early stage.
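As an illustration of the quantification step, the sketch below derives criteria weights from an AHP pairwise comparison matrix using the common geometric-mean approximation. The 3×3 matrix (weighing, say, PSA importance against two deterministic criteria) is hypothetical and not taken from the cited study, which additionally propagates the expert judgments through a BBN.

```python
import math

# Hypothetical AHP pairwise comparison matrix on Saaty's 1-9 scale.
# Row/column order: [PSA importance (FV/RAW), design-basis role,
# operating experience]; A[i][j] = how much criterion i outweighs j.
A = [
    [1.0,     3.0,     5.0],
    [1 / 3.0, 1.0,     2.0],
    [1 / 5.0, 1 / 2.0, 1.0],
]

# geometric mean of each row, then normalise to obtain the weight vector
gm = [math.prod(row) ** (1.0 / len(row)) for row in A]
weights = [g / sum(gm) for g in gm]
```

A component's safety significance can then be scored as the weighted sum of its ratings on each criterion, with score thresholds defining the Hi/Po/Lo categories.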

8.
For the interpretation of the results of probabilistic risk assessments, it is important to have measures which identify the basic events that contribute most to the frequency of the top event, but also measures which identify the basic events that are the main contributors to the uncertainty in this frequency. Both types of measures, often called Importance Measures and Measures of Uncertainty Importance, respectively, have been of interest to many researchers in the reliability field. The most frequent mode of uncertainty analysis in connection with probabilistic risk assessment has been to propagate the uncertainty of all model parameters up to an uncertainty distribution for the top event frequency. Various uncertainty importance measures have been proposed in order to point out the parameters that, in some sense, are the main contributors to the top event distribution. The new measure of uncertainty importance suggested here goes a step further in that it has been developed within a decision theory framework, thereby indicating the basic event for which it would be most valuable, from the decision-making point of view, to procure more information.

9.
Existing measures of the risk significance of elements of risk models (such as the Fussell–Vesely, or 'F–V', importance of basic events) are based on the properties of cut sets containing the element. A measure of safety significance (prevention worth, or PW) is proposed, based on the properties of path sets containing the element. A high value of F–V means that cut sets containing the element contribute significantly to top event frequency; a high value of PW means that path sets containing the element contribute significantly to top event prevention. The properties of PW as a measure of basic event significance are illustrated first with a simple block diagram example, and then with an example based on nuclear power plant risk models. PW can also be understood as a property of a set of success scenarios, and as such, can be applied more broadly than just as a measure of element significance.
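The cut-set versus path-set viewpoint can be contrasted on a tiny series-parallel block diagram (the system works if A works and at least one of B, C works). "PW" below is shorthand for a prevention-worth-style share computed from path sets, mirroring how FV can be read as a share computed from cut sets; the numbers and the normalisations are illustrative, not the paper's exact definitions.

```python
# Hypothetical block diagram: A in series with the parallel pair (B, C).
q = {"A": 0.1, "B": 0.2, "C": 0.3}        # component failure probabilities
cut_sets = [{"A"}, {"B", "C"}]            # failures that break the system
path_sets = [{"A", "B"}, {"A", "C"}]      # successes that keep it running

def cut_prob(cs):
    # probability that every component in the cut set fails
    out = 1.0
    for e in cs:
        out *= q[e]
    return out

def path_prob(ps):
    # probability that every component in the path set works
    out = 1.0
    for e in ps:
        out *= 1.0 - q[e]
    return out

def fv(e):
    # share of cut-set probability from cut sets containing e
    tot = sum(cut_prob(c) for c in cut_sets)
    return sum(cut_prob(c) for c in cut_sets if e in c) / tot

def pw(e):
    # path-set analogue: share of success "mass" through sets containing e
    tot = sum(path_prob(s) for s in path_sets)
    return sum(path_prob(s) for s in path_sets if e in s) / tot
```

Component A sits in every path set, so its PW-style share is 1 (it is indispensable to prevention), while its FV share reflects only how often it appears in dominant cut sets; the two rankings answer different questions, as the abstract argues.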

10.
In this work, we use a mathematical model for dengue transmission with the aim of analysing and comparing two dengue epidemics that occurred in Salvador, Brazil, in 1995-1996 and 2002. Using real data, we obtain the force of infection, Λ, and the basic reproductive number, R0, for both epidemics. We also obtain the time evolution of the effective reproduction number, R(t), which proves a very suitable measure for comparing the patterns of the two epidemics. Based on the analysis of the behaviour of R0 and R(t) in relation to the adult mosquito control parameter of the model, we show that control applied only to the adult stage of the mosquito population is not sufficient to stop dengue transmission, emphasizing the importance of also applying control to the aquatic phase of the mosquito.
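A minimal way to see why R(t) is a convenient comparison measure: in a simple SIR model with homogeneous mixing (a deliberate simplification of the paper's vector-host dengue dynamics, with invented parameters), the effective reproduction number is R(t) = R0 · S(t)/N, and the epidemic turns over exactly when it drops through 1.

```python
# Forward-Euler integration of a plain SIR model (hypothetical parameters).
N = 1_000_000.0
beta, gamma = 0.4, 0.2        # transmission and recovery rates (1/day)
R0 = beta / gamma             # basic reproduction number = 2.0
S, I, Rec = N - 100.0, 100.0, 0.0
dt = 0.1
Rt = []                       # effective reproduction number over time
for _ in range(int(300 / dt)):       # simulate 300 days
    Rt.append(R0 * S / N)
    new_inf = beta * S * I / N * dt  # new infections this step
    new_rec = gamma * I * dt         # new recoveries this step
    S -= new_inf
    I += new_inf - new_rec
    Rec += new_rec
```

Since S(t) only decreases, R(t) decays monotonically from just under R0 to well below 1 once the epidemic has burned out; lowering beta (e.g. by adult mosquito control) shifts the whole curve down, but only drives R(t) below 1 from the start if the reduction is large enough, which echoes the paper's point about adult-stage control alone being insufficient.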

11.
This paper presents a vital area identification method based on current probabilistic safety assessment (PSA) techniques. The vital area identification method in this paper is focused on core melt rather than radioactive material release. Furthermore, it describes a conceptual framework with which the risk from sabotage-induced events could be assessed. Location minimal cut sets (MCSs) are evaluated after developing a core melt location fault tree (LFT). An LFT is a fault tree whose basic events are sabotage-induced damages to the locations within which various safety-related components reside. The core melt LFT is constructed by combining all sequence LFTs of the various event trees with OR gates. Each sequence LFT is constructed by combining the initiating event LFT and the mitigating event LFTs with an AND gate. The vital areas can then be identified by applying location importance measures to the core melt location MCSs. An application was made to a typical 1000 MWe pressurized water reactor power plant located on the Korean seashore. The methodology suggested in the present paper is believed to be consistent and among the most complete for identifying the vital areas in a nuclear power plant, because it is based on well-proven PSA technology.

12.
In a fault tree analysis, an uncertainty importance measure is used to identify those basic events that significantly contribute to the uncertainty of the top-event probability. This paper defines an uncertainty importance measure of a basic event or of a group of basic events, and develops a two-stage procedure for experimentally evaluating the measure under the assumption that the probability of each basic event follows a lognormal distribution. The proposed method utilizes the Taguchi tolerance design technique with modifications. Then, the so-called contribution ratios which evaluate the main and/or interaction effects of the uncertainties of log-transformed basic-event probabilities on the uncertainty of the log-transformed top-event probability are calculated. The contribution ratios are used to estimate the defined uncertainty importance measure of a basic event or of a group of basic events. The proposed method consists of two stages for computational efficiency. In the first stage, the basic events with negligible effects on the uncertainty of the log-transformed top-event probability are screened out, and more detailed analyses are conducted in the second stage with a substantially smaller number of basic events. In addition, this paper presents an analysis method to quantify the percentage reduction in the uncertainty of the log-transformed top-event probability when the uncertainty of each basic-event probability is reduced.
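The flavour of the calculation can be sketched with a Monte Carlo stand-in (not the paper's Taguchi-based two-stage design): basic-event probabilities are lognormal, the top event is approximated by a sum of MCS products, and the importance of an event is taken as the fractional reduction in Var(log Q_top) when that event's probability is pinned at its median. All medians and error factors are invented; common random numbers keep the comparison stable.

```python
import math
import random

random.seed(12345)

# Hypothetical fault tree with MCS {1,2} and {3}; rare-event approximation
# Q_top ~ q1*q2 + q3. Each q_i is lognormal with the given median and
# error factor EF, where EF = exp(1.645 * sigma) (95th/50th percentile ratio).
medians = [1e-2, 2e-2, 1e-3]
efs = [3.0, 3.0, 10.0]
sigmas = [math.log(ef) / 1.645 for ef in efs]

n = 20000
Z = [[random.gauss(0.0, 1.0) for _ in range(3)] for _ in range(n)]

def log_top(z_row, fixed=None):
    # sample the basic-event probabilities, optionally pinning one at its
    # median (i.e. removing its uncertainty)
    q = [medians[i] if i == fixed
         else medians[i] * math.exp(sigmas[i] * z_row[i])
         for i in range(3)]
    return math.log(q[0] * q[1] + q[2])

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

base = variance([log_top(z) for z in Z])
# fractional variance reduction when event i's uncertainty is removed
importance = [1.0 - variance([log_top(z, fixed=i) for z in Z]) / base
              for i in range(3)]
```

With these numbers, basic event 3 dominates both the top-event level and its uncertainty (largest median contribution and the widest distribution), so it would survive the first screening stage while events 1 and 2 might be screened out.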

13.
It is well known that material requirements planning (MRP) is a deterministic planning tool which relies heavily on accurate demand forecasts, and that its performance under a perturbed, constantly changing environment becomes questionable. Although many companies still use basic MRP as a planning tool, alternative tools are available. For example, Factory Physics Inc. has developed a tool called dynamic risk-based scheduling (DRS), which creates a set of policy parameters (e.g. work in process (WIP) level, lot sizes, re-order point, and re-order quantity) that work for a range of situations. The main objective of this paper is to compare the DRS and MRP scheduling systems for single-machine and multi-machine systems via simulation. Their performance measures are compared for various systems under three levels of demand uncertainty. The simulation results suggest that DRS outperforms MRP in terms of robustness, fill rate and inventory level.

14.
PSAs in the design of advanced reactors are applied mainly in level 1 PSA areas. However, even in level 1 PSA, there are certain areas where special care must be taken depending on the plant design concept. This paper identifies these areas for both passive and active safety reactor concepts. For example, 'long-term PSA' and shutdown PSA are very important for a passive safety reactor concept from the standpoint of the effectiveness of a grace period and of passive safety systems. External events are also important for an active safety reactor concept. These kinds of special PSAs are difficult to conduct precisely at the conceptual design stage. This paper shows how to conduct these special PSAs simply and conveniently, and how to use the acquired insights in the design of advanced reactors. This paper also clarifies the meaning or definition of a grace period from the standpoint of PSA.

15.
The present paper deals with the use of probabilistic safety assessment (PSA) importance measures to optimise the performance of a nuclear power plant. The article is intended to give PSA practitioners an overview of the subject. The most frequently used importance measures are briefly reviewed. It is shown that two importance measures are sufficient to describe the character of the core damage equation; the two most often used, in combination with each other, are the risk achievement worth and the Fussell–Vesely importance. In the field of nuclear power plant test and maintenance activities, the Birnbaum importance is advocated.

16.
We construct a model for living probabilistic safety assessment (PSA) by applying the general framework of marked point processes. The framework provides a theoretically rigorous approach for considering risk follow-up of posterior hazards. In risk follow-up, the hazard of core damage is evaluated synthetically at time points in the past, by using some observed events as logged history and combining it with re-evaluated potential hazards. There are several alternatives for doing this, of which we consider three here, calling them initiating event approach, hazard rate approach, and safety system approach. In addition, for a comparison, we consider a core damage hazard arising in risk monitoring. Each of these four definitions draws attention to a particular aspect in risk assessment, and this is reflected in the behaviour of the consequent risk importance measures. Several alternative measures are again considered. The concepts and definitions are illustrated by a numerical example.

17.
宋帅  钱永久  钱聪 《工程力学》2018,35(3):106-114
桥梁结构的地震需求不仅受地震动随机性的影响,而且受结构中随机参数的影响。为了分析各随机参数对结构地震需求的影响水平,提出采用重要性分析方法对各随机参数进行重要性排序。以常见的简支梁桥及连续梁桥为例,基于结构的非线性动力时程分析,分别采用Monte-Carlo数值模拟方法及核密度估计方法计算得到各随机参数基于方差的重要性测度指标及矩独立重要性测度指标。结果表明,对于中小跨径梁桥,针对桥墩、桥台及支座等不同构件的地震需求,随机参数的重要性排序并不完全相同,但是支座剪切模量、上部结构质量及阻尼比等参数对各构件地震需求的影响水平均排在前列;和Tornado图形法等局部敏感性分析方法相比,重要性分析方法在研究某个参数的重要性时能够考虑其他随机参数的影响,其应用更加合理。  相似文献   

18.
The use of importance measures to analyze PRA results is discussed. Commonly used importance measures are defined. Some issues that have been identified as potentially limiting their usefulness are addressed, namely: there is no simple relationship between importance measures evaluated at the single-component level and those evaluated at the level of a group of components, so that some of the commonly used importance measures are not realistic measures of the sensitivity of the overall risk to parameter value changes; and importance measures do not typically take parameter uncertainties into account, which raises the question of the robustness of conclusions drawn from importance analyses. The issues are explored in the context of both ranking and categorization of structures, systems, and components (SSCs) with respect to risk-significance and safety-significance for use in risk-informed regulatory analyses.

19.
Phased missions consist of consecutive operational phases where the system logic and failure parameters can change between phases. A component can have different roles in different phases and the reliability function may have discontinuities at phase boundaries. An earlier method required NOT-gates and negations of events when calculating importance measures for such missions with non-repairable components. This paper suggests an exact method that uses standard fault tree techniques and Boolean algebra without any NOT-gates or negations. The criticalities and other importance measures can be obtained for events and components relevant to a single phase or to a transition between phases or over the whole mission. The method and importance measures are extended to phased missions with repairable components. Quantification of the reliability, the availability, the failure intensity and the total number of failures are described. New importance indicators defined for repairable systems measure component contributions to the total integrated unavailability, to the mission failure intensity and to the total number of mission failures.

20.
Reverse engineering of gene regulatory networks (GRN) is an important and challenging task in systems biology. Existing parameter estimation approaches that estimate all model parameters with equal priority are usually computationally expensive or infeasible, especially for complex biological networks. To improve the efficiency of computational modelling, this paper applies a hierarchical estimation methodology to the computational modelling of GRN based on topological analysis. Nodes in a network are divided into priority levels using a graph-based measure and a genetic algorithm. The nodes in the first level, which correspond to the root strongly connected components (SCC) of the GRN digraph, are given top priority in parameter estimation. The estimated parameters of vertices in one priority level are then used to infer the parameters of nodes in the next priority level. The proposed hierarchical estimation methodology achieves lower error indexes while consuming fewer computational resources than a single-stage estimation methodology. Experimental results on in silico networks and a realistic network show that gene networks decompose into no more than four levels, which is consistent with the inherent modularity of GRN. In addition, the proposed hierarchical parameter estimation achieves a balance between computational efficiency and accuracy.
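The topological part of the scheme, condensing the regulatory digraph into strongly connected components and giving root SCCs top priority, can be sketched with standard graph machinery. The toy network below is hypothetical, and the per-level parameter fits (done in the paper with a genetic algorithm) are omitted.

```python
from collections import defaultdict

def tarjan_scc(graph):
    """Strongly connected components of {node: [successors]} (Tarjan)."""
    index, low = {}, {}
    stack, on_stack, sccs = [], set(), []
    counter = [0]

    def strongconnect(v):
        index[v] = low[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, []):
            if w not in index:
                strongconnect(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:          # v is the root of an SCC
            comp = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                comp.append(w)
                if w == v:
                    break
            sccs.append(frozenset(comp))

    for v in graph:
        if v not in index:
            strongconnect(v)
    return sccs

def priority_levels(graph):
    """Level 0 = root SCCs; each SCC sits one past its deepest predecessor."""
    sccs = tarjan_scc(graph)
    comp_of = {v: s for s in sccs for v in s}
    preds = defaultdict(set)
    for v, ws in graph.items():
        for w in ws:
            if comp_of[v] != comp_of[w]:
                preds[comp_of[w]].add(comp_of[v])
    level = {}

    def lvl(s):
        if s not in level:
            level[s] = 0 if not preds[s] else 1 + max(lvl(p) for p in preds[s])
        return level[s]

    return {v: lvl(comp_of[v]) for v in comp_of}

# toy GRN: genes a and b regulate each other (one SCC); a drives c, c drives d
toy = {"a": ["b", "c"], "b": ["a"], "c": ["d"], "d": []}
levels = priority_levels(toy)
```

Parameters for level-0 genes (the root SCC {a, b}) would be estimated first, with each deeper level conditioned on the levels above it.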
