Similar Articles
20 similar articles found (search time: 15 ms)
1.
A Probabilistic Safety Analysis (PSA) expresses uncertainty about the possible future damaging consequences of complex installations, such as chemical or nuclear plants, in terms of probabilities. Often these probabilities are interpreted as measures of physical properties of the installation and how it is operated, and a PSA is seen as a means to carry out these measurements. This view is useful for making comparative statements about the riskiness of installations, particularly when comparing them with standards. It is argued here, however, that this interpretation of probability is inconsistent with all the standard philosophical theories of probability, and so a different interpretation of PSA is necessary. We suggest, alternatively, that by using the standard subjective theory of probability, PSA may be seen as a tool for argument rather than an objective representation of truth. In this interpretation the problems of expert choice and model validation become less acute.

2.
This work aims to evaluate the potential risks of incidents in nuclear research reactors. For its development, two databases of the International Atomic Energy Agency (IAEA) were used: the Research Reactor Data Base (RRDB) and the Incident Report System for Research Reactors (IRSRR). For this study, probabilistic safety analysis (PSA) was used, with the probability calculations following the theory and equations of IAEA-TECDOC-636. A specific routine to analyse the probabilities was developed within the main program, Scilab 5.1.1, for two distributions, Fisher (F) and chi-square, both at the 90% confidence level. Using Sordi's equations, the maximum admissible doses were obtained for comparison with the risk limits established by the International Commission on Radiological Protection (ICRP). All results achieved with this probability analysis led to the conclusion that the incidents which occurred involved radiation doses within the stochastic-effects reference interval established by ICRP-64.
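The kind of rate bound used in such incident studies can be sketched with a stdlib-only example (not the paper's Scilab code): a one-sided upper confidence bound on a Poisson incident rate via the chi-square relation, with the quantile obtained from the Wilson-Hilferty normal approximation rather than an exact inverse. The function names and the event/exposure numbers below are ours, for illustration only.

```python
from statistics import NormalDist

def chi2_quantile(p: float, k: float) -> float:
    """Wilson-Hilferty approximation to the chi-square quantile with
    k degrees of freedom at cumulative probability p (good to ~1%)."""
    z = NormalDist().inv_cdf(p)
    return k * (1.0 - 2.0 / (9.0 * k) + z * (2.0 / (9.0 * k)) ** 0.5) ** 3

def rate_upper_bound(n_events: int, exposure_years: float, conf: float = 0.95) -> float:
    """One-sided upper confidence bound on a Poisson event rate
    (events per reactor-year): chi2_{conf}(2(n+1)) / (2T)."""
    dof = 2 * (n_events + 1)
    return chi2_quantile(conf, dof) / (2.0 * exposure_years)

# Illustrative numbers: 2 recorded incidents over 500 reactor-years.
lam_ub = rate_upper_bound(2, 500.0, conf=0.95)
print(f"95% upper bound on incident rate: {lam_ub:.4f} per reactor-year")
```

The Wilson-Hilferty step stands in for the exact chi-square inverse so the sketch needs no external libraries; any statistics package's exact quantile can be dropped in instead.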

3.
Tokyo and its outlying cities are home to one-quarter of Japan's 127 million people. Highly destructive earthquakes struck the capital in 1703, 1855 and 1923, the last of which took 105,000 lives. Fuelled by greater Tokyo's rich seismological record, but challenged by its magnificent complexity, our joint Japanese-US group carried out a new study of the capital's earthquake hazards. We used the prehistoric record of great earthquakes preserved by uplifted marine terraces and tsunami deposits (17 M ≈ 8 shocks in the past 7000 years), a newly digitized dataset of historical shaking (10,000 observations in the past 400 years), the dense modern seismic network (300,000 earthquakes in the past 30 years), and Japan's GeoNet array (150 GPS vectors in the past 10 years) to reinterpret the tectonic structure, identify active faults and their slip rates, and estimate their earthquake frequency. We propose that a dislodged fragment of the Pacific plate is jammed between the Pacific, Philippine Sea and Eurasian plates beneath the Kanto plain on which Tokyo sits. We suggest that the Kanto fragment controls much of Tokyo's seismic behaviour for large earthquakes, including the damaging 1855 M ≈ 7.3 Ansei-Edo shock. On the basis of the frequency of earthquakes beneath greater Tokyo, events with magnitude and location similar to the M ≈ 7.3 Ansei-Edo event have a ca 20% likelihood in an average 30-year period. In contrast, our renewal (time-dependent) probability for great M ≥ 7.9 plate-boundary shocks such as struck in 1923 and 1703 is 0.5% for the next 30 years, with a time-averaged 30-year probability of ca 10%. The resulting net likelihood of severe shaking (ca 0.9 g peak ground acceleration (PGA)) in Tokyo, Kawasaki and Yokohama over the next 30 years is ca 30%. The long historical record in Kanto also affords a rare opportunity to calculate the probability of shaking in an alternative manner, exclusively from intensity observations. This approach permits robust estimates of the spatial distribution of expected shaking, even for sites with few observations. The resulting probability of severe shaking is ca 35% in Tokyo, Kawasaki and Yokohama and ca 10% in Chiba for an average 30-year period, in good agreement with our independent estimate, thus bolstering our view that Tokyo's hazard looms large. Given estimates of US$1 trillion for the cost of an M ≈ 7.3 shock beneath Tokyo, our probability implies a US$13 billion annual probable loss.

4.
Traditional fault tree (FT) analysis is widely used for reliability and safety assessment of complex and critical engineering systems. The behavior of components of complex systems and their interactions, such as sequence- and functional-dependent failures, spares and dynamic redundancy management, and priority of failure events, cannot be adequately captured by traditional FTs. Dynamic fault trees (DFTs) extend traditional FTs by defining additional gates, called dynamic gates, to model these complex interactions. Markov models are used to solve dynamic gates. However, the state space becomes too large for calculation with Markov models when the number of gate inputs increases. In addition, the Markov model is applicable only to exponential failure and repair distributions, and modeling test and maintenance information on spare components is very difficult. To address these difficulties, a Monte Carlo simulation-based approach is used in this work to solve dynamic gates. The approach is first applied to a problem from the literature with non-repairable components, and the results obtained are in good agreement with those in the literature. The approach is then applied to a simplified scheme of the electrical power supply system of a nuclear power plant (NPP), a complex repairable system with tested and maintained spares; the results obtained using this approach are in good agreement with those obtained analytically. In addition to point estimates of reliability measures, failure-time and repair-time distributions are also obtained from simulation. Finally, a case study on the reactor regulation system (RRS) of an NPP is carried out to demonstrate the application of the simulation-based DFT approach to large-scale problems.
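A minimal sketch of what a Monte Carlo treatment of one dynamic gate might look like: a priority-AND (PAND) gate with exponential, non-repairable components, which fails only if component A fails before component B within the mission time, checked against its closed form. This is our illustrative reconstruction, not the paper's code; the rates and mission time are invented.

```python
import math
import random

def pand_mc(lam_a: float, lam_b: float, mission_time: float,
            n_trials: int = 200_000, seed: int = 1) -> float:
    """Monte Carlo estimate of a priority-AND (PAND) gate:
    the gate fails only if A fails before B, both within the mission time."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n_trials):
        t_a = rng.expovariate(lam_a)   # exponential failure time of A
        t_b = rng.expovariate(lam_b)   # exponential failure time of B
        if t_a < t_b <= mission_time:
            fails += 1
    return fails / n_trials

def pand_exact(lam_a: float, lam_b: float, t: float) -> float:
    """Closed form of the same gate for exponential failure times."""
    s = lam_a + lam_b
    return (1 - math.exp(-lam_b * t)) - (lam_b / s) * (1 - math.exp(-s * t))

est = pand_mc(1e-3, 2e-3, 1000.0)
ref = pand_exact(1e-3, 2e-3, 1000.0)
print(f"MC: {est:.4f}  exact: {ref:.4f}")
```

Replacing the exponential draws with samples from arbitrary failure and repair distributions is what frees the simulation approach from the Markov-model restriction mentioned above.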

5.
The role of PC-based programs is becoming important in the area of PSA. The PC-based program QUEST has been developed to perform level-1 PSA at PNC. This program is an effective tool for examining the effects of changes in plant design and/or operational procedures. Also, as part of the full-scope PSA activity for the prototype liquid-metal fast breeder reactor, a systems-analysis code network involving several PC-based programs was developed and has been used to perform level-1 PSA with less manpower and more consistency. Further, a living PSA tool is currently being developed for the purpose of maintaining or improving operational safety.

6.
A review of earthquake safety standardization   Total citations: 1 (self-citations: 0, others: 1)
1. Introduction. In 2005, the theme of the 36th World Standards Day was "Standards make the world safer", and this year's World Standards Day message named earthquakes as the leading natural hazard threatening human society. In recent years, powerful earthquakes have caused severe losses of life and property; their enormous social impact and destructiveness seriously threaten the economic development and social security of many countries. The strong earthquake of 26 December 2004 in the sea off Sumatra, Indonesia triggered a tsunami that swept numerous countries and regions along the Indian Ocean coast, killing more than 160,000 people across 7 countries, a huge disaster that drew worldwide attention. One reason the tsunami caused such heavy losses is that the Indian Ocean coastal countries had not yet established a tsunami warning system; although some international agencies issued timely tsunami…

7.
Analysis of truncation limit in probabilistic safety assessment   Total citations: 3 (self-citations: 4, others: 3)
A truncation limit defines the boundary between what is considered in a probabilistic safety assessment and what is neglected. The truncation limit in focus here is the cut-off on the size of a minimal cut set's contribution. A new method was developed that defines the truncation limit in probabilistic safety assessment. The method specifies truncation limits more stringently than existing documents dealing with truncation criteria in probabilistic safety assessment do. The results of this paper indicate that, if more accuracy is desired, the truncation limits for more complex probabilistic safety assessments, which consist of a larger number of basic events, should be more severe than presently recommended in existing documents. The truncation limits defined by the new method reduce the relative errors of importance measures and produce more accurate results for probabilistic safety assessment applications. The reduced relative errors of importance measures can prevent situations where the acceptability of a change to equipment under investigation according to RG 1.174 would shift from the region where changes can be accepted to the region where they cannot if the results were recalculated with a smaller truncation limit.
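How a truncation limit interacts with the rare-event approximation can be sketched on a toy cut-set list; every probability and cut set below is hypothetical, chosen only to show the relative error a too-coarse limit introduces.

```python
# Hypothetical basic-event probabilities (illustrative only).
p = {"a": 1e-2, "b": 5e-3, "c": 2e-3, "d": 1e-3, "e": 5e-4}

# Minimal cut sets of a toy fault tree (each is a set of basic events).
mcs = [{"a", "b"}, {"a", "c"}, {"b", "d"}, {"c", "d", "e"}, {"e"}]

def cutset_prob(cs):
    prob = 1.0
    for ev in cs:
        prob *= p[ev]
    return prob

def top_event_prob(cut_sets, truncation_limit=0.0):
    """Rare-event approximation of the top-event probability,
    discarding cut sets whose probability falls below the limit."""
    return sum(cutset_prob(cs) for cs in cut_sets
               if cutset_prob(cs) >= truncation_limit)

full = top_event_prob(mcs)
for limit in (0.0, 1e-8, 1e-6, 1e-4):
    q = top_event_prob(mcs, limit)
    print(f"limit={limit:.0e}  Q={q:.3e}  relative error={(full - q) / full:.2%}")
```

On this toy model a limit of 1e-8 discards only the negligible three-event cut set, while 1e-4 discards everything except the single-event cut set and visibly biases the result; the paper's point is that the acceptable limit tightens as the number of basic events grows.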

8.
The time-varying spectra of actual strong-motion records are estimated using the harmonic wavelet transform, and the time-varying spectral characteristics of ground motions on three site classes consistent with the seismic design code are statistically analysed. The time-varying spectra are simulated with a uniformly modulated non-stationary stochastic model and a time-varying modified Kanai-Tajimi non-stationary stochastic model; the parameter-identification problem for the nonlinear functions is recast as an unconstrained optimisation problem solved by quasi-Newton iteration, yielding specific parameter values and parameter-function expressions for the two models on the three site classes. A procedure is then developed for probabilistic seismic demand analysis of base-isolated structures under bidirectional ground motion, with the square root of the spectral intensity factor as the intensity measure, and the influence of the different stochastic ground-motion models on the analysis is compared. The study finds that under minor earthquakes the stochastic ground-motion model has little influence on the seismic response of isolated structures, but under major earthquakes it has a considerable influence on the structural response; moreover, the influence of the ground-motion model on the probabilistic seismic demand analysis of isolated structures also differs across site classes.

9.
The ω-factor approach is a method that explicitly incorporates organizational factors into probabilistic safety assessment of nuclear power plants. Bayesian networks (BNs) are the underlying formalism used in this approach. They have a structural part, formed by a graph whose nodes represent organizational variables, and a parametric part consisting of conditional probabilities, each quantifying the organizational influences between one variable and its parents in the graph. The aim of this paper is twofold. First, we discuss some important limitations of current procedures in the ω-factor approach for either assessing conditional probabilities from experts or estimating them from data. We illustrate the discussion with an example that uses data from Licensee Event Reports of nuclear power plants for the estimation task. Second, we introduce significant improvements in the way BNs for the ω-factor approach can be constructed, so that parameter acquisition becomes easier and more intuitive. The improvements are based on the use of noisy-OR gates as the model of multicausal interaction between each BN node and its parents.
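The appeal of a noisy-OR gate is that it needs only one link probability per parent instead of a full conditional table over all parent combinations. A minimal sketch, with invented parent factors and link strengths (not values from the paper):

```python
def noisy_or(link_probs, states, leak=0.0):
    """Noisy-OR conditional probability: each active parent i independently
    causes the child with probability link_probs[i]; `leak` accounts for
    causes not modelled explicitly."""
    q = 1.0 - leak                     # probability the child is NOT caused
    for prob, active in zip(link_probs, states):
        if active:
            q *= 1.0 - prob            # each active parent independently fails to cause
    return 1.0 - q

# Three hypothetical organizational parent factors with invented link strengths.
links = [0.8, 0.4, 0.1]
print(noisy_or(links, [1, 1, 0]))      # first two causes active
print(noisy_or(links, [0, 0, 0]))      # no causes active (and no leak)
```

With n parents the gate needs n parameters (plus an optional leak) instead of the 2^n entries of a full conditional probability table, which is what makes expert elicitation and data estimation more tractable.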

10.
An efficient point estimate method for probabilistic analysis   Total citations: 3 (self-citations: 0, others: 3)
A new and efficient point estimate method is developed to calculate the statistical moments of a random quantity, Z, that is a function of n random variables, X. The method is an extension of Rosenblueth's two-point concentration method. It uses m × n concentrations, matching up to the first m × n non-crossed moments of each random variable and the crossed second-order moments of the random variables. The kth moment of Z is calculated by weighting the value of Z to the power of k evaluated at n × m locations. Simple-to-use formulas are provided for two special cases of the method, the 2n-concentration scheme and the 2n + 1-concentration scheme. The 2n-concentration scheme accounts for the skewness of the probability density function; the 2n + 1-concentration scheme accounts for both the skewness and the kurtosis of the probability density function. Correlations between the random variables are handled by a rotational transformation based on the eigenvectors of the covariance matrix. Illustrative examples are presented.
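The underlying idea of Rosenblueth's two-point concentration can be sketched for a single zero-skew variable (the paper's full m × n scheme and rotational transformation are not reproduced here): concentrate the PDF at two points, then take weighted powers of the response.

```python
def two_point_moments(f, mu, sigma):
    """Rosenblueth-style two-point estimate for a symmetric (zero-skew)
    input: concentrate the PDF at mu - sigma and mu + sigma with equal
    weight 1/2, then weight powers of f to approximate moments of Z = f(X)."""
    z_minus, z_plus = f(mu - sigma), f(mu + sigma)
    mean_z = 0.5 * (z_minus + z_plus)          # first moment of Z
    mean_z2 = 0.5 * (z_minus**2 + z_plus**2)   # second raw moment of Z
    var_z = mean_z2 - mean_z**2
    return mean_z, var_z

# Z = 3X + 2 with X ~ (mu=1, sigma=0.5); the scheme is exact for linear Z.
mean_z, var_z = two_point_moments(lambda x: 3 * x + 2, 1.0, 0.5)
print(mean_z, var_z)   # 5.0 2.25
```

For nonlinear responses the estimate is approximate, and skewed inputs require the asymmetric concentration locations and weights that the paper's 2n and 2n + 1 schemes provide.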

11.
We construct a model for living probabilistic safety assessment (PSA) by applying the general framework of marked point processes. The framework provides a theoretically rigorous approach for considering risk follow-up of posterior hazards. In risk follow-up, the hazard of core damage is evaluated synthetically at time points in the past, by using some observed events as logged history and combining it with re-evaluated potential hazards. There are several alternatives for doing this, of which we consider three here, calling them initiating event approach, hazard rate approach, and safety system approach. In addition, for a comparison, we consider a core damage hazard arising in risk monitoring. Each of these four definitions draws attention to a particular aspect in risk assessment, and this is reflected in the behaviour of the consequent risk importance measures. Several alternative measures are again considered. The concepts and definitions are illustrated by a numerical example.

12.
Development of probabilistic sensitivities is frequently considered an essential component of a probabilistic analysis and is often critical to understanding the physical mechanisms underlying failure and to modifying the design to mitigate and manage risk. One useful sensitivity is the partial derivative of the probability-of-failure and/or the system response with respect to the parameters of the independent input random variables. Calculation of these partial derivatives has been established in terms of an expected value operation (sometimes called the score function or likelihood ratio method). The partial derivatives can be computed at typically insignificant additional computational cost given the failure samples and kernel functions, which are the partial derivatives of the log of the probability density function (PDF) with respect to the parameters of the distribution. The formulation is general, such that any sampling method can be used for the computation, e.g. Monte Carlo, importance sampling, or Latin hypercube sampling. In this paper, useful universal properties of the kernel functions that must be satisfied for all two-parameter independent distributions are derived. These properties are then used to develop distribution-free analytical expressions for the partial derivatives of the response moments (mean and standard deviation) with respect to the PDF parameters for linear and quadratic response functions. These universal properties can be used to facilitate development and verification of the required kernel functions and to develop an improved understanding of the model for design considerations.
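The score-function idea can be sketched for a single normal input, where the kernel for the mean parameter is ∂ln f/∂μ = (x − μ)/σ²: the same failure samples yield both the failure probability and its derivative. This is our illustrative example with an analytic check, not the paper's formulation for general response moments.

```python
import random
from statistics import NormalDist

def score_sensitivity(c, mu, sigma, n=400_000, seed=7):
    """Estimate P_f = P(X > c) and its derivative dP_f/dmu using the
    score-function (likelihood-ratio) method on the same sample set.
    Kernel for a normal X: d ln f / d mu = (x - mu) / sigma**2."""
    rng = random.Random(seed)
    pf = dpf = 0.0
    for _ in range(n):
        x = rng.gauss(mu, sigma)
        if x > c:                        # failure indicator
            pf += 1.0
            dpf += (x - mu) / sigma**2   # kernel-weighted failure sample
    return pf / n, dpf / n

pf, dpf = score_sensitivity(c=2.0, mu=0.0, sigma=1.0)
exact_dpf = NormalDist().pdf(2.0)        # analytic dP_f/dmu = phi((c-mu)/sigma)/sigma
print(f"P_f ~ {pf:.4f}, dP_f/dmu ~ {dpf:.4f} (exact {exact_dpf:.4f})")
```

Because the derivative is itself an expectation over the existing samples, no additional model evaluations are needed, which is the "insignificant additional computational cost" claimed above.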

13.
This paper is concerned with how management and organisational influences can be factored into risk assessments. A case study from the rail transportation sector illustrates how organisational factors can act as high level influences which are manifest as operational errors giving rise to major accidents. A model is proposed which describes the interrelationships between management influences, immediate causes and operational errors. This model can be used for organisational auditing, monitoring and system design. A strategy is described for collecting data from an existing organisation to develop a specific form of the generic model. The final issue addressed is the use of the model to quantify the effects of organisational influences on risk arising from human error. A numerical case study is provided to illustrate the approach.

14.
A truncation process aims to determine which of the minimal cut sets (MCS) produced by a probabilistic safety assessment (PSA) model are significant. Several truncation processes have been proposed for evaluating the probability of core damage while ensuring a fixed accuracy level. However, the evaluation of new risk indicators such as importance measures requires re-examining the truncation process to ensure that the produced estimates are accurate enough. In this paper a new truncation process is developed that permits estimating, from a single set of MCS, the importance measure of any basic event with the desired accuracy level. The main contribution of this new method is an MCS-wise truncation criterion involving two thresholds: an absolute threshold and a new relative threshold on the potential probability of the MCS of interest. The method has been tested on a complete level-1 PSA model of a 900 MWe NPP developed by Electricité de France (EDF), and the results presented in this paper indicate that, to reach the same accuracy level, the proposed method produces a set of MCS whose size is significantly reduced.

15.
The multi-element probabilistic collocation method (ME-PCM) is formulated as a tool for sensitivity analysis of differential-equation models, applied here to cellular signalling networks. The method uses a simple, efficient sampling algorithm to quantify local sensitivities throughout the parameter space. The application of the ME-PCM to a previously published ordinary differential equation model of the apoptosis signalling network is presented. The authors verify agreement with the previously identified regions of sensitivity and then analyse this region in greater detail with the ME-PCM. The authors demonstrate the generality of the ME-PCM by studying the sensitivity of the system using a variety of biologically relevant markers, such as variation in one (or many) chemical species as a function of time and the total exposure of a single chemical species.

16.
A seismic PSA was carried out for a typical liquid-metal-cooled fast breeder reactor (LMFBR) in order to study rationalized seismic design, maintaining and/or improving safety during a seismic event. The seismic sequence quantification identifies the structures, systems and components (SSCs) dominating the seismic core damage frequency (CDF). Sensitivity analyses that reduce or increase the seismic capacity of SSCs are used to examine the optimized seismic design from safety and economic viewpoints. The LMFBR-specific risk-significant SSCs are the reactor coolant boundary, the decay heat removal coolant path and the reactor control rods, which differ from those of light water reactors (LWRs). The electrical power supply system makes only a minor contribution to the seismic CDF. The sensitivity study shows that the passive safety features of LMFBRs are important for maintaining and/or enhancing seismic capacity. These passive features include decay heat removal via natural circulation and safety measures that do not depend on support systems such as alternating current (AC) electrical power. In the course of the seismic sequence quantification, a methodology to evaluate the probability of seismically induced multiple failures was developed and applied to the decay heat removal function. The results suggest that the multiplicity of the triply redundant system should be considered for significant components, such as the decay heat removal path, when the difference in seismic response is taken into account.

17.
Probabilistic Safety Assessments (PSAs) performed by utilities in the framework of Periodic Safety Reviews for German nuclear power plants are reviewed by TÜV Südwest. Insights gained, and recommendations concerning the necessity and focus of further developments and applications arising from practical requirements for the performance and assessment of PSAs within regulatory procedures, are presented in this paper. Furthermore, recommendations are made to ensure the validity of PSA results, which is necessary to achieve the goals of a PSA. Besides some general points of view, the emphasis of the paper is on methodological aspects concerning evaluation methods, the assessment of common cause failures, and human reliability assessment.

18.
A challenging problem in mathematically processing uncertain operands is that constraints inherent in the problem definition can require computations that are difficult to implement. Examples of possible constraints are that the sum of the probabilities of partitioned possible outcomes must be one, and repeated appearances of the same variable must all have the identical value. The latter, called the ‘repeated variable problem’, will be addressed in this paper in order to show how interval-based probabilistic evaluation of Boolean logic expressions, such as those describing the outcomes of fault trees and event trees, can be facilitated in a way that can be readily implemented in software. We will illustrate techniques that can be used to transform complex constrained problems into trivial problems in most tree logic expressions, and into tractable problems in most other cases.
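The repeated-variable problem is easy to see on a one-variable expression such as p(1 − p): naive interval arithmetic treats the two occurrences of p as independent intervals and so widens the bounds. A small self-contained sketch (the interval endpoints are invented, and the "tight" bound is obtained by a simple sweep rather than the paper's transformation techniques):

```python
def interval_mul(a, b):
    """Product of two intervals (lo, hi) under standard interval arithmetic."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

def interval_sub(a, b):
    """Difference of two intervals (lo, hi)."""
    return (a[0] - b[1], a[1] - b[0])

# p(A) known only as an interval.
a = (0.2, 0.3)
one = (1.0, 1.0)

# Naive evaluation of p*(1-p): the two occurrences of `a` are treated as
# independent intervals, so the resulting bounds are too wide.
naive = interval_mul(a, interval_sub(one, a))       # ~ (0.14, 0.24)

# Respecting the repeated variable: sweep a single value of p over [0.2, 0.3].
samples = [0.2 + 0.001 * i for i in range(101)]
tight = (min(p * (1 - p) for p in samples),
         max(p * (1 - p) for p in samples))          # ~ (0.16, 0.21)

print("naive:", naive)
print("tight:", tight)
```

Rewriting the expression so each variable appears once (when the tree logic allows it) recovers the tight bounds directly, which is the "trivial problem" transformation the abstract refers to.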

19.
A theoretical approach and an experimental test system are presented for introducing a set of parameters, based on Walsh functions, designed to characterize the transfer function of analog-to-digital converters. Building on previous work, the authors propose an enhanced system that provides better accuracy in evaluating the performance of conversion devices under dynamic conditions. The theory covers an introductory approach to generalizing conversion processes and employs a powerful purpose-oriented tool for understanding their operation in depth. The error parameters are defined by mathematical algorithms based on Walsh functions and the related transform, while their properties are correlated to a standard reference input, a triangular waveform provided by the system. This methodology opens the way towards the introduction of standard techniques for testing conversion devices.

20.
A general method is developed for conducting simple operations on random variables, avoiding difficult integrals and singularities, which must be overcome when obtaining exact solutions. For sum, difference and product operations, and combinations thereof, exact moments are first determined from the moments of the constituent variables. The method of orthogonal expansion, developed in the previous paper [Probabilistic Engineering Mechanics 2000;15:371–379], is then used to produce approximate probability density functions (PDFs). The quotient operation is also considered; it requires knowledge of the negative moments of the denominator variable. The quotient and difference operations are used in a first example to establish PDFs for the hazard quotient and excess wind loading on a concrete chimney. A second example demonstrates how the proposed method may be used as an alternative to Monte Carlo simulation for simple probabilistic risk calculations; a PDF for predicted contaminant concentration at a groundwater well compares favorably with a histogram obtained by simulation.
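The exact-moment step for sums and products of independent variables follows directly from the binomial theorem and E[(XY)^k] = E[X^k]E[Y^k]; a minimal sketch (the orthogonal-expansion step that recovers the PDF is not reproduced here, and the example distributions are ours):

```python
from math import comb

def sum_moments(mx, my):
    """Raw moments of X + Y for independent X, Y, given raw moments
    mx[k] = E[X^k] and my[k] = E[Y^k] (index 0..n), via the binomial theorem:
    E[(X+Y)^k] = sum_j C(k, j) E[X^j] E[Y^(k-j)]."""
    n = len(mx) - 1
    return [sum(comb(k, j) * mx[j] * my[k - j] for j in range(k + 1))
            for k in range(n + 1)]

def product_moments(mx, my):
    """Raw moments of X*Y for independent X, Y: E[(XY)^k] = E[X^k] E[Y^k]."""
    return [a * b for a, b in zip(mx, my)]

# Standard normal X: E[X^0..4] = 1, 0, 1, 0, 3.  Uniform(0,1) Y: E[Y^k] = 1/(k+1).
mx = [1.0, 0.0, 1.0, 0.0, 3.0]
my = [1.0, 1 / 2, 1 / 3, 1 / 4, 1 / 5]

ms = sum_moments(mx, my)
print("E[X+Y] =", ms[1], " Var[X+Y] =", ms[2] - ms[1] ** 2)
```

The variance of X + Y comes out as 1 + 1/12, the sum of the component variances, as independence requires; the difference operation follows by negating the odd moments of Y, and the quotient needs negative moments of the denominator as noted above.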
