相似文献 (Similar documents)
1.
By means of several examples from a recent comprehensive space nuclear risk analysis of the Cassini mission, a scenario and consequence representational framework is presented for risk analysis of space nuclear power systems in the context of epistemic and aleatory uncertainties. The framework invites the use of probabilistic models for the calculation of both event probabilities and scenario consequences. Each scenario is associated with a frequency that may include both aleatory and epistemic uncertainties. The outcome of each scenario is described in terms of an end state vector. The outcome of each scenario is also characterized by a source term. In this paper, the source term factors of interest are the number of failed clads in the space nuclear power system, the amount of fuel released and the amount of fuel that is potentially respirable. These are also subject to uncertainties. The 1990 work of Apostolakis is found to be a useful formalism from which to derive the relevant probabilistic models. However, an extension to the formalism was necessary to accommodate the situation in which aleatory uncertainty is represented by changes in the form of the probability function itself, not just its parameters. Event trees that show reasonable alternative accident scenarios are presented. A grouping of probabilities and consequences is proposed as a useful structure for thinking about uncertainties. An example of each category is provided. Concluding observations are made about the judgments involved in this analysis of uncertainties and the effect of distinguishing between aleatory and epistemic uncertainties.
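The nesting of epistemic and aleatory uncertainty described above can be sketched with a two-level Monte Carlo loop. All numbers below (clad count, Beta prior, release fractions) are illustrative assumptions, not values from the Cassini analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

N_CLADS = 54          # hypothetical number of fuel clads in the power system
N_EPISTEMIC = 2000    # outer samples over the uncertain failure probability
N_ALEATORY = 500      # inner samples of the accident outcome

# Epistemic: the clad failure probability is itself uncertain; a Beta
# distribution stands in for the analyst's state of knowledge.
p_fail = rng.beta(2.0, 50.0, size=N_EPISTEMIC)

# Aleatory: for each belief about p, the number of failed clads varies randomly.
failed = rng.binomial(N_CLADS, p_fail[:, None], size=(N_EPISTEMIC, N_ALEATORY))

# Source term: released fuel proportional to failed clads, with a random
# release fraction per event (aleatory, illustrative bounds).
release_frac = rng.uniform(0.1, 0.5, size=failed.shape)
fuel_released = failed * release_frac  # arbitrary units per clad

# Separating the two kinds of uncertainty: the inner mean averages over
# aleatory variability; the spread across outer samples is epistemic.
epistemic_means = fuel_released.mean(axis=1)
print(f"median: {np.median(epistemic_means):.2f}, "
      f"90th pct: {np.percentile(epistemic_means, 90):.2f}")
```

The outer/inner separation is what allows the epistemic spread of an aleatory-averaged source term to be reported, rather than a single point estimate.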

2.
Hazard and risk assessment in avalanche-prone areas involves estimation of runout distances of potential avalanches. Methods for determination of the runout may be divided into two categories: 1) methods based on statistical approaches such as the well-known α-β model, or 2) methods based on numerical avalanche models such as the PCM-model or VS-type models (to name the more traditional ones). Methods in the second group have the advantage that, besides the runout distance, velocity and impact pressure distributions along the avalanche track can also be obtained, this being a requisite for meaningful risk assessments. However, the predictive power of dynamical models depends on the use of appropriate rheological models and their parameters. In the statistical α-β model, the maximum runout distance is solely a function of topography. The runout distance equations were found by regression analysis, correlating the longest registered runout distance of several hundred avalanche paths with a selection of topographic parameters. In this paper, we re-evaluate Norwegian and Austrian avalanche data, which served as the basis for the α-β model in the respective countries, together with additional avalanche data, with respect to dynamical measures. As most of these avalanche data originate from extreme events (i.e. avalanches with return periods of the order of 100 years), the dynamical measures may give hints about an appropriate rheology for dynamical models suitable for extreme avalanche events. The analysis raises reasonable doubt as to whether the classical ansatz for the retarding acceleration of snow avalanches, with additive terms involving Coulomb friction and a velocity-squared dependency, which is used in many avalanche models, is adequate for a physically-based model. Back-calculations of runout distances using a simple block model show a discrepancy between commonly proposed parameter values (and the underlying rheological models) and the observations.
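The statistical α-β regression can be sketched in a few lines: the runout angle α is fitted as a linear function of the slope angle β over a set of registered extreme runouts. The data and coefficients below are synthetic stand-ins, not the Norwegian or Austrian datasets:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "registered extreme runout" data: beta is the mean slope angle
# (degrees) of the track, alpha the angle to the observed extreme runout.
beta = rng.uniform(25.0, 45.0, size=200)
alpha_obs = 0.92 * beta - 0.8 + rng.normal(0.0, 1.2, size=200)  # illustrative

# Least-squares fit of the alpha-beta relation: alpha = k1*beta + k0
A = np.column_stack([beta, np.ones_like(beta)])
(k1, k0), *_ = np.linalg.lstsq(A, alpha_obs, rcond=None)

residuals = alpha_obs - (k1 * beta + k0)
sd = residuals.std(ddof=2)

# A conservative design runout angle subtracts a multiple of the residual
# standard deviation from the mean prediction, e.g. for beta = 35 degrees:
alpha_design = k1 * 35.0 + k0 - sd
```

The residual standard deviation is what turns the regression into a probabilistic runout statement, which the paper contrasts with the dynamical-model route.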

3.
A methodology is described for probabilistic predictions of future climate. This is based on a set of ensemble simulations of equilibrium and time-dependent changes, carried out by perturbing poorly constrained parameters controlling key physical and biogeochemical processes in the HadCM3 coupled ocean-atmosphere global climate model. These (ongoing) experiments allow quantification of the effects of earth system modelling uncertainties and internal climate variability on feedbacks likely to exert a significant influence on twenty-first century climate at large regional scales. A further ensemble of regional climate simulations at 25 km resolution is being produced for Europe, allowing the specification of probabilistic predictions at spatial scales required for studies of climate impacts. The ensemble simulations are processed using a set of statistical procedures, the centrepiece of which is a Bayesian statistical framework designed for use with complex but imperfect models. This supports the generation of probabilities constrained by a wide range of observational metrics, and also by expert-specified prior distributions defining the model parameter space. The Bayesian framework also accounts for additional uncertainty introduced by structural modelling errors, which are estimated using our ensembles to predict the results of alternative climate models containing different structural assumptions. This facilitates the generation of probabilistic predictions combining information from perturbed physics and multi-model ensemble simulations. The methodology makes extensive use of emulation and scaling techniques trained on climate model results.
These are used to sample the equilibrium response to doubled carbon dioxide at any required point in the parameter space of surface and atmospheric processes, to sample time-dependent changes by combining this information with ensembles sampling uncertainties in the transient response of a wider set of earth system processes, and to sample changes at local scales. The methodology is necessarily dependent on a number of expert choices, which are highlighted throughout the paper.

4.
Experimental observations of creep life in structural systems exhibit significant scatter. As a result, probabilistic methods that incorporate the associated uncertainties in residual life assessment methodologies have been developed. Most studies in the literature adopt Gaussian models for the noise. However, Monte Carlo simulations with such models can produce physically unrealistic negative damage-growth increments in some sample damage growth realizations. This study investigates the use of alternative models for noise in damage growth analyses and compares these predictions with those obtained when Gaussian models are used. A continuum damage mechanics based approach is used for obtaining the thermal creep damage growth in a nuclear power plant component. Numerical results are presented to highlight the salient features arising from this study.
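The negative-increment problem with Gaussian noise, and one alternative the abstract alludes to, can be illustrated directly. The rates and noise levels below are illustrative, not from the study; lognormal multiplicative noise is one of several positivity-preserving choices:

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, n_steps, dt = 5000, 100, 1.0
base_rate = 1e-3  # illustrative mean damage growth rate per unit time

# Gaussian additive noise: increments can go negative, i.e. unphysical
# "healing" of creep damage in some realizations.
incr_gauss = base_rate * dt + rng.normal(0.0, 5e-4, size=(n_paths, n_steps))
frac_negative = (incr_gauss < 0).mean()

# Lognormal multiplicative noise: increments stay strictly positive, and
# the -sigma^2/2 shift keeps the mean increment equal to base_rate*dt.
sigma = 0.5
incr_logn = base_rate * dt * rng.lognormal(-0.5 * sigma**2, sigma,
                                           size=(n_paths, n_steps))
```

With the Gaussian model a few percent of increments are negative here; the lognormal model reproduces the same mean growth without any.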

5.
Earthquake loss estimation procedures exhibit aleatory and epistemic uncertainty embedded in their various components, i.e. seismic hazard, structural fragility, and inventory data. Since these uncertainties significantly affect decision-making, they have to be considered in loss estimation to inform decision- and policymakers and to ensure a balanced view of the various threats to which society may be subjected. This paper reviews the uncertainties that affect earthquake loss estimation and proposes a simple framework for probabilistic uncertainty assessment suitable for use after obtaining impact results from existing software, such as HAZUS-MH. To avoid the extensive calculations required for Monte Carlo simulation-based approaches, this study develops an approximate method for uncertainty propagation based on modifying the quantile arithmetic methodology, which allows for acceptable uncertainty estimates with limited computational effort. A verification example shows that the results of the approximation approach are in good agreement with the equivalent Monte Carlo simulation outcome. Finally, the paper demonstrates the proposed procedure for probabilistic loss assessment through a comparison with HAZUS-MH results. It is confirmed that the proposed procedure consistently gives reasonable estimates.

6.
Reliable and accurate predictions of infrastructure condition can save significant amounts of money for infrastructure management agencies through better planned maintenance and rehabilitation activities. Infrastructure deterioration is a complicated, dynamic and stochastic process affected by various factors such as design, environmental conditions, material properties, structural capacities and some unobserved variables. Previous researchers have explored different types of modelling techniques, ranging from simple deterministic models to sophisticated probabilistic models, to characterise the deterioration process of infrastructure systems; however, these models have limitations in various aspects. Traditional deterministic models are inadequate to capture the uncertainties associated with infrastructure deterioration processes. State-based probabilistic models can only predict conditions at fixed time points. Time-based probabilistic models require frequent observations that, in practice, are not easy to perform. The goal of this research is to develop a new probabilistic model that is capable of capturing the stochastic nature of infrastructure deterioration, while at the same time avoiding the limitations of previous modelling efforts. The proposed nested model is based on discrete choice model theory. It can be used to predict the probability of an infrastructure system staying at defined condition states by relating an index representing the performance of the infrastructure to a number of explanatory variables that characterise the structural adequacy, traffic loading and environmental conditions of the infrastructure. The proposed model includes different possible implementation paths (sequential versus multinomial) depending on the considered explanatory variables and the available data. In the case study, the proposed probabilistic model is implemented with pavement performance data collected in Texas, yielding promising preliminary results.
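The multinomial implementation path can be sketched as a softmax over condition-state utilities. The explanatory variables, states and coefficients below are hypothetical placeholders, not the Texas pavement model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Explanatory variables per pavement section, e.g. [age, traffic load,
# freeze index] (standardised, synthetic values).
X = rng.normal(size=(8, 3))
states = ["good", "fair", "poor"]

# Illustrative multinomial-logit coefficients; "good" is the base state.
W = np.array([[0.0, 0.0, 0.0],      # good (reference)
              [0.8, 0.5, 0.2],      # fair
              [1.5, 1.0, 0.6]])     # poor
b = np.array([0.0, -0.5, -1.5])

util = X @ W.T + b                        # systematic utility of each state
util -= util.max(axis=1, keepdims=True)   # stabilise the softmax numerically
P = np.exp(util)
P /= P.sum(axis=1, keepdims=True)         # P[i, k]: prob section i is in state k
predicted = [states[k] for k in P.argmax(axis=1)]
```

Each row of `P` is a full probability distribution over condition states, which is the advantage over a deterministic condition prediction.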

7.
A Bayesian seismic risk assessment method is proposed that jointly accounts for the uncertainties in the seismic hazard model, the input ground-motion records, the structural parameters and the demand model, and it is examined in detail using earthquake data for the Dali region of Yunnan, China, from 1970 to 2017. Building on conventional probabilistic seismic hazard analysis, a Bayesian hazard analysis method is developed in which Bayesian updating yields the posterior distributions of the unknown parameters of the seismic probability models. A probabilistic seismic demand model is likewise established through Bayesian theory, and the epistemic uncertainty of the demand model is carried into the fragility analysis. Taking a 42-storey steel-frame/RC core-tube building as an example, a seismic risk assessment is carried out. The results show that the Bayesian hazard analysis yields a more reasonable hazard model; that neglecting parameter uncertainty in the demand model misestimates the structural fragility; and that different loading cases significantly affect the seismic risk of tall buildings. The proposed probabilistic risk assessment method offers an effective way to account for both aleatory and epistemic uncertainty, and supports the development of seismic resilience evaluation and design theory for high-performance structures.
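The Bayesian updating of an earthquake-occurrence model can be sketched with the classic Poisson–Gamma conjugate pair. The prior and catalogue numbers are invented for illustration and are not the Dali data:

```python
import numpy as np

# Conjugate Bayesian update of the mean annual rate of events above a
# magnitude threshold (Poisson occurrence, Gamma prior) -- a minimal
# stand-in for the paper's Bayesian hazard updating.
a0, b0 = 2.0, 10.0          # Gamma prior: shape, rate (prior mean 0.2 events/yr)
n_events, years = 12, 48    # hypothetical catalogue: 12 events in 48 years

a_post, b_post = a0 + n_events, b0 + years
rate_mean = a_post / b_post                 # posterior mean annual rate
rate_sd = np.sqrt(a_post) / b_post          # posterior standard deviation

# Probability of at least one event in the next 50 years, integrating over
# the posterior: E[1 - exp(-50*lam)] has a closed form for a Gamma posterior.
p50 = 1.0 - (b_post / (b_post + 50.0)) ** a_post
```

Integrating over the posterior rather than plugging in a point estimate is what carries the epistemic uncertainty of the hazard model into the risk result.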

8.
This article examines the importance of determining residual risk and its impact on remedy selection at Superfund Sites. Within this examination, risks are assessed using probabilistic models that incorporate the uncertainty and variability of the input parameters, and utilize parameter distributions based on current and applicable site-specific data. Monte Carlo methods are used to propagate these uncertainties and variabilities through the risk calculations resulting in a distribution for the estimate of both risk and residual risk. Such an approach permits an informed decision based on a broad information base which involves considering the entire uncertainty distribution of risk rather than a point estimate for each exposure scenario. Using the probabilistic risk estimates, with current and applicable site-specific data, alternative decisions regarding cleanup are obtained for two Superfund Sites.
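Propagating input distributions through a risk equation by Monte Carlo can be sketched as follows. The exposure equation and all distributions below are generic textbook-style assumptions, not the site-specific data of the article:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Illustrative input distributions for a drinking-water exposure scenario:
conc = rng.lognormal(mean=np.log(2.0), sigma=0.6, size=n)    # mg/L in groundwater
intake = rng.lognormal(mean=np.log(1.4), sigma=0.3, size=n)  # L/day ingestion
ef = rng.uniform(200, 350, size=n)                           # days/yr exposure
bw = rng.normal(70, 10, size=n)                              # kg body weight
slope = 0.02                                                 # (mg/kg-day)^-1

# Lifetime average daily dose (30-yr exposure, 70-yr averaging time) and
# excess cancer risk for each Monte Carlo sample:
ladd = conc * intake * ef * 30 / (bw * 365 * 70)
risk = slope * ladd

# Report the full distribution, not a single point estimate:
p50, p95 = np.percentile(risk, [50, 95])
frac_above = (risk > 1e-4).mean()   # fraction of samples exceeding a target
```

Comparing `frac_above` before and after a remedy (re-running with post-cleanup concentrations) is the residual-risk comparison the article describes.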

9.
Fully probabilistic models are available for predicting the service life of new reinforced concrete structures and for condition assessment of existing structures. Frequently, the decisive mechanism limiting the service life of reinforced concrete structures is chloride-induced corrosion, for which these models predict probabilistically the time to corrosion initiation. Once the corrosion process is initiated, corroding areas can be detected nondestructively through potential mapping. The spatial information gained from potential mapping can then be used for updating the service-life prediction, taking into consideration the spatial variability of the corrosion process. This paper introduces the spatial updating of the probabilistic model with potential mapping and concrete cover measurements by means of Bayesian analysis. A case study is presented, where potential mapping is applied prior to a destructive assessment, which serves to verify the approach. It is found that the potential mapping can provide significant information on the condition state. With the presented methods, this information can be consistently included in the probabilistic service-life prediction.
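A heavily simplified version of the Bayesian updating step: a prior on the fraction of surface elements with initiated corrosion is updated with potential-mapping detections via a Beta–Binomial update. This ignores detection errors and spatial correlation, both of which the paper treats, and all numbers are invented:

```python
import numpy as np

# Prior on the fraction of surface elements with initiated corrosion,
# e.g. from a chloride-ingress service-life model (illustrative numbers).
a0, b0 = 1.5, 8.5            # Beta prior, prior mean 0.15
prior_mean = a0 / (a0 + b0)

# Potential mapping over a grid: elements flagged as actively corroding
# (perfect detection assumed for this sketch).
n_elements, n_detected = 200, 46

a_post = a0 + n_detected
b_post = b0 + (n_elements - n_detected)
post_mean = a_post / (a_post + b_post)
post_sd = np.sqrt(a_post * b_post /
                  ((a_post + b_post) ** 2 * (a_post + b_post + 1)))
```

The posterior is both shifted toward the observed corroding fraction and much narrower than the prior, which is the informational gain the case study quantifies.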

10.
Software plays an increasingly important role in modern safety-critical systems. Although research has been done to integrate software into the classical probabilistic risk assessment (PRA) framework, current PRA practice overwhelmingly neglects the contribution of software to system risk. Dynamic probabilistic risk assessment (DPRA) is considered to be the next generation of PRA techniques. DPRA is a set of methods and techniques in which simulation models that represent the behavior of the elements of a system are exercised in order to identify risks and vulnerabilities of the system. The fact remains, however, that modeling software for use in the DPRA framework is also quite complex, and very little has been done to address the question directly and comprehensively. This paper develops a methodology to integrate software contributions in the DPRA environment. The framework includes a software representation and an approach to incorporate the software representation into the DPRA environment SimPRA. The software representation is based on multi-level objects, and the paper also proposes a framework to simulate the multi-level objects in the simulation-based DPRA environment. This is a new methodology to address the state explosion problem in the DPRA environment. This study is the first systematic effort to integrate software risk contributions into DPRA environments.

11.
Quantitative risk assessment is recognised to be a sound basis for land use planning in avalanche-prone areas. An important feature of an effective risk calculation procedure is its adaptability to territory changes, particularly the construction of defence works, both in the release zone and in the run-out area. In fact, the calculation of residual risk after the realisation of defence works, and the possibility to compare different protection solutions in terms of risk reduction, provide substantial help in decision-making processes. We present an avalanche risk estimation procedure that combines statistical analysis of snowfall records, iterative simulations of avalanche dynamics and empirically-based vulnerability relations. Our method provides a risk estimate that is flexible with respect to changes in boundary and initial conditions. We discuss in detail the theoretical background of the proposed method and apply it to a real case study, using a 1D dynamical simulation model and a GIS interface to visualise risk levels in the 2D run-out zone. An analysis of different protective countermeasures in a "cost–benefit" framework is provided as well.
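The residual-risk comparison can be sketched as a scenario sum of probability × vulnerability × exposed value, evaluated with and without a defence work. The scenario probabilities, pressures, vulnerability curve and values below are illustrative, not from the case study:

```python
import numpy as np

# Minimal risk integral: annual risk = sum over scenarios of
# P(scenario) * vulnerability(impact pressure) * exposed value.
scenario_prob = np.array([1/30, 1/100, 1/300])   # annual scenario frequencies
impact_pressure = np.array([5.0, 15.0, 40.0])    # kPa at the building

def vulnerability(p_kpa):
    """Empirical-style vulnerability curve: damage fraction vs pressure."""
    return np.clip(p_kpa / 34.0, 0.0, 1.0)       # full loss near 34 kPa (assumed)

value = 500_000.0                                # exposed value (currency units)
risk_no_dam = float(np.sum(
    scenario_prob * vulnerability(impact_pressure) * value))

# A defence work assumed to halve impact pressures gives the residual risk:
risk_with_dam = float(np.sum(
    scenario_prob * vulnerability(0.5 * impact_pressure) * value))
risk_reduction = risk_no_dam - risk_with_dam
```

Comparing `risk_reduction` against the annualised cost of the defence work is the "cost–benefit" framing mentioned in the abstract.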

12.
Yu Bo, Chen Bing, Wu Ranli. Engineering Mechanics (《工程力学》), 2017, 34(7): 136-145
Most existing models for the shear capacity of reinforced concrete (RC) columns are deterministic. They cannot readily account for the uncertainties in geometry, material properties and external loads, so their predictions scatter widely and their accuracy and applicability are limited. This paper therefore combines the variable-angle truss-arch model with Bayesian theory to establish a probabilistic model for the shear capacity of shear-critical RC columns. First, a modified deterministic model is derived from variable-angle truss-arch theory, accounting for the effect of axial load on the inclination of the critical diagonal crack. Then, considering both subjective (epistemic) and objective (aleatory) sources of uncertainty, a probabilistic shear-capacity model for shear-critical RC columns is established using Bayesian theory and the Markov chain Monte Carlo (MCMC) method. Finally, comparisons with test data and with existing models verify the validity and practicality of the proposed model. The results show that the model not only reasonably describes the probability distribution of the shear capacity of shear-critical RC columns, but can also calibrate the confidence level of existing deterministic models and determine characteristic shear-capacity values at different confidence levels.
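A stripped-down sketch of the Bayesian/MCMC step: treat the log of the test-to-prediction ratio as Normal with unknown bias and scatter, and sample the posterior with a small Metropolis sampler. The ratio data are invented, the priors are flat, and this is far simpler than the paper's multi-parameter model:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical test-to-prediction ratios r_i = V_test / V_model for a
# deterministic truss-arch-style shear model (illustrative data).
r = np.array([1.10, 0.95, 1.22, 1.05, 0.88, 1.15, 1.02, 0.97, 1.08, 1.18])
logr = np.log(r)

# Model: ln r ~ Normal(theta, sigma); flat priors on (theta, log sigma).
def log_post(theta, log_sigma):
    s = np.exp(log_sigma)
    return -len(logr) * log_sigma - np.sum((logr - theta) ** 2) / (2 * s**2)

chain, x = [], np.array([0.0, np.log(0.1)])
lp = log_post(*x)
for _ in range(20_000):                      # random-walk Metropolis
    prop = x + rng.normal(0.0, 0.05, size=2)
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        x, lp = prop, lp_prop
    chain.append(x.copy())
chain = np.array(chain[5_000:])              # discard burn-in

theta_hat = chain[:, 0].mean()               # mean log model bias
# 5th-percentile characteristic bias factor, accounting for both scatter
# and posterior parameter uncertainty:
bias_05 = np.exp(np.percentile(chain[:, 0] - 1.645 * np.exp(chain[:, 1]), 5))
```

Multiplying a deterministic prediction by `bias_05` gives a characteristic capacity at a chosen confidence level, which mirrors the calibration use described in the abstract.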

13.
Model predictions for a rapid assessment and prognosis of possible radiological consequences after an accidental release of radionuclides play an important role in nuclear emergency management. Radiological observations, e.g. dose rate measurements, can be used to improve such model predictions. The process of combining model predictions and observations, usually referred to as data assimilation, is described in this article within the framework of the real time on-line decision support system (RODOS) for off-site nuclear emergency management in Europe. Data assimilation capabilities, based on Kalman filters, are under development for several modules of the RODOS system, including the atmospheric dispersion, deposition, food chain and hydrological models. The use of such a generic data assimilation methodology enables the propagation of uncertainties throughout the various modules of the system. This would in turn provide decision makers with uncertainty estimates taking into account both model and observation errors. This paper describes the methodology employed as well as results of some preliminary studies based on simulated data.
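The core Kalman update can be shown in scalar form: a model-predicted dose rate is corrected by one measurement, weighting each by its error variance. The numbers are illustrative, not RODOS output:

```python
# Scalar Kalman-filter update of a predicted dose rate with one observation
# (observation operator H = 1). All values are illustrative.
x_pred = 4.0       # model-predicted dose rate (uSv/h)
P_pred = 1.5**2    # variance of the model prediction
z = 6.2            # observed dose rate (uSv/h)
R = 0.5**2         # measurement-error variance

K = P_pred / (P_pred + R)           # Kalman gain: trust in the observation
x_upd = x_pred + K * (z - x_pred)   # analysis: prediction corrected by data
P_upd = (1.0 - K) * P_pred          # reduced uncertainty after assimilation
```

Because the measurement here is more precise than the model, the gain is high (0.9) and the analysis lands close to the observation; the updated variance is smaller than either input variance, which is the uncertainty reduction passed on to downstream modules.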

14.
This paper describes the implementation of topographic curvature effects within the RApid Mass MovementS (RAMMS) snow avalanche simulation toolbox. RAMMS is based on a model similar to the shallow water equations with a Coulomb friction relation and the velocity-dependent Voellmy drag. It is used for snow avalanche risk assessment in Switzerland. The snow avalanche simulation relies on back-calculation of observed avalanches. The calibration of the friction parameters depends on characteristics of the avalanche track. The topographic curvature terms are not yet included in the above-mentioned classical model. Here, we fundamentally improve this model by mathematically and physically including the topographic curvature effects. By decomposing the velocity-dependent friction into a topography-dependent term that accounts for a curvature enhancement in the Coulomb friction, and a topography-independent contribution similar to the classical Voellmy drag, we construct a general curvature-dependent frictional resistance, and thus propose new extended model equations. With three site-specific examples, we compare the apparent frictional resistance of the new approach, which includes topographic curvature effects, to the classical one. Our simulation results demonstrate substantial effects of the curvature on the flow dynamics, e.g. the dynamic pressure distribution along the slope. The comparison of resistance coefficients between the two models demonstrates that the physically based extension presents an improvement to the classical approach. Furthermore, a practical example highlights its influence on the pressure outline in the run-out zone of the avalanche. Keywords: snow avalanche dynamics modeling; natural terrain; curvature; centrifugal force; friction coefficients.
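The decomposition described above can be sketched for a single sliding block: the track curvature κ adds a centrifugal term κv² to the normal acceleration inside the Coulomb part, while the velocity-squared Voellmy drag is left curvature-independent. Parameter values are typical textbook magnitudes, not RAMMS calibrations:

```python
import numpy as np

# Voellmy-type retardation for a sliding block, with curvature kappa
# enhancing the normal pressure (centrifugal term) in the Coulomb part.
g, mu, xi = 9.81, 0.2, 1500.0   # gravity (m/s^2), Coulomb friction, drag (m/s^2)

def retardation(v, slope_deg, kappa):
    theta = np.radians(slope_deg)
    normal_acc = g * np.cos(theta) + kappa * v**2   # curvature-enhanced normal
    return mu * normal_acc + g * v**2 / xi          # Coulomb + Voellmy drag

v = 30.0                                  # flow speed (m/s)
flat = retardation(v, 30.0, 0.0)          # classical model (no curvature)
curved = retardation(v, 30.0, 0.02)       # concave section, kappa = 0.02 1/m
```

At 30 m/s the curvature term adds μκv² to the retardation, a substantial increase over the classical value on concave track sections, consistent with the "curvature enhancement in the Coulomb friction" the paper formalises.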

15.
Random uncertainties in finite element models in linear structural dynamics are usually modeled by using parametric models. This means that: (1) the uncertain local parameters occurring in the global mass, damping and stiffness matrices of the finite element model have to be identified; (2) appropriate probabilistic models of these uncertain parameters have to be constructed; and (3) functions mapping the domains of uncertain parameters into the global mass, damping and stiffness matrices have to be constructed. In the low-frequency range, a reduced matrix model can then be constructed using the generalized coordinates associated with the structural modes corresponding to the lowest eigenfrequencies. In this paper we propose an approach for constructing a random uncertainties model of the generalized mass, damping and stiffness matrices. This nonparametric model does not require identifying the uncertain local parameters and consequently, obviates construction of functions that map the domains of uncertain local parameters into the generalized mass, damping and stiffness matrices. This nonparametric model of random uncertainties is based on direct construction of a probabilistic model of the generalized mass, damping and stiffness matrices, which uses only the available information constituted of the mean value of the generalized mass, damping and stiffness matrices. This paper describes the explicit construction of the theory of such a nonparametric model.

16.
Fault tree analysis is a method widely used in probabilistic risk assessment. Uncertainties should be properly handled in fault tree analyses to support robust decision making. While many sources of uncertainty are considered, dependence uncertainties are not much explored. Such uncertainties can be labeled as 'epistemic' because of the way dependence is modeled. In practice, besides probability theory, alternative mathematical structures for the representation of epistemic uncertainty, including possibility theory and fuzzy set theory, can be used. In this article, a fuzzy β factor is considered to represent the failure dependence uncertainties among basic events. The relationship between the β factor and the system failure probability is analyzed to support the use of a hybrid probabilistic–possibilistic approach. As a result, a complete hybrid probabilistic–possibilistic framework is constructed. A case study of a high integrity pressure protection system is discussed. The results show that the proposed method provides decision makers with a more accurate understanding of the system under analysis when failure dependencies are involved. Copyright © 2015 John Wiley & Sons, Ltd.
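The possibilistic side of such a hybrid scheme can be sketched with alpha-cut interval propagation of a triangular fuzzy β through a simple common-cause formula. The formula for a 1-out-of-2 redundant pair and all numbers are illustrative, not the HIPPS case study:

```python
# Alpha-cut propagation of a triangular fuzzy beta factor through a simple
# common-cause model for a 1-out-of-2 redundant pair (illustrative formula):
#   Q_sys = beta*q + ((1 - beta)*q)**2
q = 1e-3                       # basic-event failure probability
beta_tri = (0.02, 0.05, 0.10)  # triangular fuzzy beta: (left, peak, right)

def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number at membership alpha."""
    lo, peak, hi = tri
    return lo + alpha * (peak - lo), hi - alpha * (hi - peak)

def q_sys(beta):
    return beta * q + ((1 - beta) * q) ** 2

for alpha in (0.0, 0.5, 1.0):
    b_lo, b_hi = alpha_cut(beta_tri, alpha)
    # Q_sys is monotone increasing in beta here, so endpoints suffice:
    print(f"alpha={alpha:.1f}: Q_sys in [{q_sys(b_lo):.3e}, {q_sys(b_hi):.3e}]")
```

The result is a fuzzy system failure probability: a nested family of intervals that narrows to a point at membership 1, which decision makers can read alongside the probabilistic estimates.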

17.
Quantifying uncertainty during risk analysis has become an important part of effective decision-making and health risk assessment. However, most risk assessment studies struggle with uncertainty analysis, even though uncertainty with respect to model parameter values is of primary importance. Capturing uncertainty in risk assessment is vital in order to perform a sound risk analysis. In this paper, an approach to uncertainty analysis based on fuzzy set theory and Monte Carlo simulation is proposed. The question then arises as to how these two modes of representing uncertainty can be combined for the purpose of estimating risk. The proposed method is applied to a propylene oxide polymerisation reactor. It takes into account both stochastic and epistemic uncertainties in the risk calculation. This study explores areas where random and fuzzy logic models may be applied to improve risk assessment in industrial plants with dynamic (time-varying) behaviour. It discusses the methodology and the process involved when using random and fuzzy logic systems for risk management.

18.
Climate change impacts and adaptation assessments have traditionally adopted a scenario-based approach, which precludes an assessment of the relative risks of particular adaptation options. Probabilistic impact assessments, especially if based on a thorough analysis of the uncertainty in an impact forecast system, enable adoption of a risk-based assessment framework. However, probabilistic impacts information is conditional and will change over time. We explore the implications of a probabilistic end-to-end risk-based framework for climate impacts assessment, using the example of water resources in the Thames River, UK. We show that a probabilistic approach provides more informative results that enable the potential risk of impacts to be quantified, but that details of the risks are dependent on the approach used in the analysis.

19.
There will be simplifying assumptions and idealizations in the availability models of complex processes and phenomena. These simplifications and idealizations generate uncertainties, which can be classified as aleatory (arising from randomness) and/or epistemic (due to lack of knowledge). Acknowledging and treating uncertainty is vital for the practical usability of reliability analysis results. Distinguishing the two kinds of uncertainty is useful for taking reliability/risk-informed decisions with confidence and for effective management of uncertainty. In level-1 probabilistic safety assessment (PSA) of nuclear power plants (NPP), the current practice is to carry out epistemic uncertainty analysis with a simple Monte Carlo simulation that samples the epistemic variables in the model. The aleatory uncertainty, however, is neglected: point estimates of the aleatory variables, viz. time to failure and time to repair, are used. Treating both types of uncertainty requires a two-phase Monte Carlo simulation, in which an outer loop samples the epistemic variables and an inner loop samples the aleatory variables. A methodology based on two-phase Monte Carlo simulation is presented for distinguishing both kinds of uncertainty in the context of availability/reliability evaluation in level-1 PSA studies of NPP.
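The two-phase (outer epistemic, inner aleatory) structure can be sketched for a single repairable component. The lognormal epistemic distributions of the failure and repair rates and all numbers are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(6)
N_EPI, N_ALE = 500, 200

# Outer (epistemic) loop: the failure and repair rates are themselves
# uncertain; lognormal distributions stand in for the state of knowledge.
lam = rng.lognormal(np.log(1e-3), 0.4, size=N_EPI)   # failures per hour
mu = rng.lognormal(np.log(1e-1), 0.3, size=N_EPI)    # repairs per hour

# Inner (aleatory) loop: sample times to failure and repair for each
# epistemic sample, and estimate unavailability from the cycle times.
ttf = rng.exponential(1.0 / lam[:, None], size=(N_EPI, N_ALE))
ttr = rng.exponential(1.0 / mu[:, None], size=(N_EPI, N_ALE))
unavail = ttr.sum(axis=1) / (ttf.sum(axis=1) + ttr.sum(axis=1))

# The epistemic distribution of the aleatory-averaged unavailability:
p05, p50, p95 = np.percentile(unavail, [5, 50, 95])
```

Collapsing the inner loop to point estimates of time to failure and time to repair, as in the current practice the abstract criticises, would discard exactly the aleatory variability the inner sampling captures.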

20.
Snow stability, or the probability of avalanche release, is one of the key factors defining avalanche danger. Most snow stability evaluations are based on field observations, which are time-consuming and sometimes dangerous. Through numerical modelling of the snow cover stratigraphy, the problem of having sparsely measured regional stability information can be overcome. In this study we compared numerical model output with observed stability. Overall, 775 snow profiles combined with Rutschblock scores and release types for the area surrounding five weather stations were rated into three stability classes. Snow stratigraphy data were then produced for the locations of these five weather stations using the snow cover model SNOWPACK. We observed that (i) an existing physically based stability interpretation implemented in SNOWPACK was applicable for regional stability evaluation; (ii) modelled variables equivalent to the manually observed variables found to be significantly discriminatory with regard to stability did not demonstrate equal classification strength; and (iii) additional modelled variables that cannot be measured in the field discriminated well between stability categories. Finally, with objective feature selection, a set of variables was chosen to establish an optimal link between the modelled snow stratigraphy data and the stability rating through the use of classification trees. Cross-validation was then used to assess the quality of the classification trees. A true skill statistic of 0.5 and 0.4 was achieved by two models that detected "rather stable" or "rather unstable" conditions, respectively. The interpretation derived could be further developed into a support tool for avalanche warning services for the prediction of regional avalanche danger.
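The true skill statistic used to score the classification trees is the hit rate minus the false alarm rate from a 2×2 contingency table. The counts below are invented to reproduce a TSS of 0.5 and are not the study's data:

```python
# True skill statistic (Hanssen-Kuipers skill score) for a binary
# stability forecast, from a hypothetical 2x2 contingency table.
hits, misses = 60, 20                  # unstable observed
false_alarms, correct_negatives = 30, 90   # stable observed

hit_rate = hits / (hits + misses)                          # 0.75
false_alarm_rate = false_alarms / (false_alarms + correct_negatives)  # 0.25
tss = hit_rate - false_alarm_rate
```

A TSS of 0 means no skill over random forecasting and 1 means perfect discrimination, so the reported values of 0.5 and 0.4 indicate moderate skill.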
