Similar Literature
20 similar documents found (search time: 15 ms)
1.
Epistemic uncertainty analysis is an essential feature of any model application subject to 'state of knowledge' uncertainties. Such analysis is usually carried out on the basis of a Monte Carlo simulation, sampling the epistemic variables and performing the corresponding model runs. In situations where aleatory uncertainties are also present in the model, however, an adequate treatment of both types of uncertainty would require a two-stage nested Monte Carlo simulation, i.e. sampling the epistemic variables ('outer loop') and nested sampling of the aleatory variables ('inner loop'). It is clear that for complex and long-running codes the computational effort to perform all the resulting model runs may be prohibitive. Therefore, an approximate approach to epistemic uncertainty analysis is suggested which is based solely on two simple Monte Carlo samples: (a) joint sampling of both epistemic and aleatory variables simultaneously, and (b) sampling of the aleatory variables alone with the epistemic variables held fixed at their reference values. The applications of this approach to dynamic reliability analyses presented in this paper look quite promising and suggest that performing such an approximate epistemic uncertainty analysis is preferable to the alternative of not performing any.
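As a contrast to the approximate scheme above, here is a minimal sketch of the full two-stage nested Monte Carlo loop; the model function, distributions, and sample sizes are illustrative assumptions, not taken from the paper.

```python
# Two-stage nested Monte Carlo: outer loop samples epistemic variables,
# inner loop samples aleatory variables. Toy model, distributions and
# sample sizes are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

def model(theta, x):
    # hypothetical response: epistemic parameter theta, aleatory input x
    return theta * x**2

N_EPI, N_ALE = 200, 1000                  # outer / inner sample sizes (assumed)
means = np.empty(N_EPI)
for i in range(N_EPI):
    theta = rng.uniform(0.8, 1.2)         # epistemic: 'state of knowledge'
    x = rng.normal(0.0, 1.0, N_ALE)       # aleatory: inherent randomness
    means[i] = model(theta, x).mean()     # inner-loop aleatory statistic

print(np.percentile(means, [5, 50, 95]))  # epistemic band on the aleatory mean

# The paper's approximation replaces the N_EPI * N_ALE runs above with two
# plain samples: (a) one joint sample of epistemic and aleatory variables,
# (b) one aleatory-only sample at the reference epistemic values.
```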

2.
Epistemic and aleatory uncertain variables always coexist in a multidisciplinary system and can be modeled by probability theory and evidence theory, respectively. The propagation of uncertainty through coupled subsystems and the strong nonlinearity of the multidisciplinary system make reliability analysis difficult and computationally expensive. In this paper, a novel reliability analysis procedure is proposed for multidisciplinary systems with epistemic and aleatory uncertain variables. First, the probability density function of the aleatory variables is assumed to be a piecewise uniform distribution based on the Bayes method, and an approximate most probable point is found by the equivalent normalization method. Then, the importance sampling method is used to calculate the failure probability together with its variance and coefficient of variation. The effectiveness of the procedure is demonstrated by two numerical examples. Copyright © 2013 John Wiley & Sons, Ltd.
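The core importance sampling step is generic enough to sketch. Below, the sampling density is centred at an assumed known most probable point for a toy linear limit state; the paper's piecewise uniform densities and equivalent normalization are not reproduced.

```python
# Importance sampling of a failure probability with the sampling density
# centred at the most probable point (MPP). The limit state, MPP and
# sample size are toy assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
beta = 3.0                                    # toy reliability index

def g(u):                                     # failure when g(u) <= 0
    return beta - u[:, 0]

u_star = np.array([beta, 0.0])                # MPP of this linear limit state
N = 10_000
u = rng.normal(loc=u_star, scale=1.0, size=(N, 2))   # IS density h ~ N(u*, I)

# weights w = f(u)/h(u) for standard-normal nominal density f
log_w = stats.norm.logpdf(u).sum(1) - stats.norm.logpdf(u, loc=u_star).sum(1)
w = (g(u) <= 0) * np.exp(log_w)

pf = w.mean()                                 # failure probability estimate
var = w.var(ddof=1) / N                       # variance of the estimate
print(pf, np.sqrt(var) / pf)                  # estimate, coefficient of variation
```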

3.
There will be simplifying assumptions and idealizations in the availability models of complex processes and phenomena. These simplifications and idealizations generate uncertainties, which can be classified as aleatory (arising from randomness) and/or epistemic (due to lack of knowledge). The problem of acknowledging and treating uncertainty is vital for the practical usability of reliability analysis results, and the distinction between the two kinds is useful for making reliability/risk-informed decisions with confidence and for effective management of uncertainty. In level-1 probabilistic safety assessment (PSA) of nuclear power plants (NPP), the current practice is to carry out epistemic uncertainty analysis with a simple Monte Carlo simulation that samples the epistemic variables in the model, while aleatory uncertainty is neglected and point estimates of the aleatory variables, viz. time to failure and time to repair, are used. Treating both types of uncertainty requires a two-phase Monte Carlo simulation in which the outer loop samples the epistemic variables and the inner loop samples the aleatory variables. A methodology based on such a two-phase Monte Carlo simulation is presented for distinguishing both kinds of uncertainty in the context of availability/reliability evaluation in level-1 PSA studies of NPP.
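A minimal sketch of the two-phase loop applied to steady-state availability, with epistemic uncertainty on failure and repair rates and aleatory sampling of times to failure and repair; all distributions and sample sizes are illustrative assumptions.

```python
# Two-phase Monte Carlo for steady-state availability: the epistemic loop
# samples uncertain failure/repair rates, the aleatory loop samples times
# to failure (TTF) and times to repair (TTR). All values are assumptions.
import numpy as np

rng = np.random.default_rng(2)
N_EPI, N_ALE = 100, 2000

avail = np.empty(N_EPI)
for i in range(N_EPI):
    lam = rng.lognormal(np.log(1e-3), 0.3)   # epistemic failure rate [1/h]
    mu = rng.lognormal(np.log(1e-1), 0.3)    # epistemic repair rate  [1/h]
    ttf = rng.exponential(1.0 / lam, N_ALE)  # aleatory times to failure
    ttr = rng.exponential(1.0 / mu, N_ALE)   # aleatory times to repair
    avail[i] = ttf.sum() / (ttf.sum() + ttr.sum())  # uptime fraction

print(np.percentile(avail, [5, 50, 95]))     # epistemic band on availability
```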

4.
Optimization leads to specialized structures which are not robust to disturbance events like unanticipated abnormal loading or human errors. Typical reliability-based and robust optimization mainly address objective aleatory uncertainties; to date, the impact of subjective epistemic uncertainties on optimal design has not been comprehensively investigated. In this paper, we use an independent parameter, the latent failure probability, to investigate the effects of epistemic uncertainties on optimal design. Reliability-based and risk-based truss topology optimization are addressed. It is shown that optimal risk-based designs can be divided into three groups: (A) when epistemic uncertainty is small (in comparison to aleatory uncertainty), the optimal design is indifferent to it and yields isostatic structures; (B) when aleatory and epistemic uncertainties are both relevant, the optimal design is controlled by epistemic uncertainty and yields hyperstatic but nonredundant structures, for which the expected costs of direct collapse are controlled; (C) when epistemic uncertainty becomes too large, the optimal design becomes redundant, as a way to control the increasing expected costs of collapse. The three regions are divided by hyperstatic and redundancy thresholds. The redundancy threshold is the point at which the structure needs to become redundant so that its reliability exceeds the latent reliability of the simplest isostatic system. Simple truss topology optimization is considered herein, but the conclusions have immediate relevance to the optimal design of realistic structures subject to aleatory and epistemic uncertainties.

5.
This paper focuses on sensitivity analysis of results from computer models in which both epistemic and aleatory uncertainties are present. Sensitivity is defined in the sense of “uncertainty importance” in order to identify and rank the principal sources of epistemic uncertainty. A natural and consistent way to arrive at sensitivity results in such cases would be a two-dimensional or double-loop nested Monte Carlo sampling strategy in which the epistemic parameters are sampled in the outer loop and the aleatory variables are sampled in the nested inner loop. However, the computational effort of this procedure may be prohibitive for complex and time-demanding codes. This paper therefore suggests an approximate method for sensitivity analysis based on particular one-dimensional or single-loop sampling procedures, which require substantially less computational effort. From the results of such sampling one can obtain approximate estimates of several standard uncertainty importance measures for the aleatory probability distributions and related probabilistic quantities of the model outcomes of interest. The reliability of the approximate sensitivity results depends on the effect of all epistemic uncertainties on the total joint epistemic and aleatory uncertainty of the outcome. The magnitude of this effect can be expressed quantitatively and estimated from the same single-loop samples: the higher it is, the more accurate the approximate sensitivity results will be. A case study, which shows that the results from the proposed approximate method are comparable to those obtained with the full two-dimensional approach, is provided.
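A toy illustration of the single-loop idea: one joint sample of epistemic and aleatory variables, with rank correlations between each epistemic parameter and the output as a crude uncertainty-importance proxy. The model and the specific measure are assumptions; the paper's estimators are more refined.

```python
# Single-loop approximation of 'uncertainty importance': one joint sample
# of epistemic parameters and aleatory variables, then a rank correlation
# between each epistemic parameter and the output. Toy model assumed.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
N = 5000
theta1 = rng.uniform(0.5, 1.5, N)        # epistemic, widely uncertain
theta2 = rng.uniform(0.9, 1.1, N)        # epistemic, less uncertain
x = rng.normal(0.0, 1.0, N)              # aleatory

y = theta1 * x**2 + theta2               # joint epistemic + aleatory output

for name, t in [("theta1", theta1), ("theta2", theta2)]:
    rho = stats.spearmanr(t, y).correlation
    print(name, round(rho, 3))           # rank as epistemic importance proxy
```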

6.
In 2001, the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy (DOE) in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories) initiated development of a process designated quantification of margins and uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, “Quantification of Margins and Uncertainties: Conceptual and Computational Basis,” describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples. The basic ideas and challenges that underlie NNSA's mandate for QMU are present, and have been successfully addressed, in a number of past analyses for complex systems. To provide perspective on the implementation of a requirement for QMU in the analysis of a complex system, three past analyses are presented as examples: (i) the probabilistic risk assessment carried out for the Surry Nuclear Power Station as part of the U.S. Nuclear Regulatory Commission's (NRC's) reassessment of the risk from commercial nuclear power in the United States (i.e., the NUREG-1150 study), (ii) the performance assessment for the Waste Isolation Pilot Plant carried out by the DOE in support of a successful compliance certification application to the U.S. Environmental Protection Agency, and (iii) the performance assessment for the proposed high-level radioactive waste repository at Yucca Mountain, Nevada, carried out by the DOE in support of a license application to the NRC. Each of the preceding analyses involved a detailed treatment of uncertainty and produced results used to establish compliance with specific numerical requirements on the performance of the system under study. As a result, these studies illustrate the determination of both margins and the uncertainty in margins in real analyses.

7.
This paper develops a novel computational framework to compute the Sobol indices that quantify the relative contributions of various uncertainty sources towards the system response prediction uncertainty. In the presence of both aleatory and epistemic uncertainty, two challenges are addressed in this paper for the model-based computation of the Sobol indices: due to data uncertainty, input distributions are not precisely known; and due to model uncertainty, the model output is uncertain even for a fixed realization of the input. An auxiliary variable method based on the probability integral transform is introduced to distinguish and represent each uncertainty source explicitly, whether aleatory or epistemic. The auxiliary variables facilitate building a deterministic relationship between the uncertainty sources and the output, which is needed in the Sobol indices computation. The proposed framework is developed for two types of model inputs: random variable input and time series input. A Bayesian autoregressive moving average (ARMA) approach is chosen to model the time series input due to its capability to represent both natural variability and epistemic uncertainty due to limited data. A novel controlled-seed computational technique based on pseudo-random number generation is proposed to efficiently represent the natural variability in the time series input. This controlled-seed method significantly accelerates the Sobol indices computation under time series input, and makes it computationally affordable.
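The variance decomposition underlying Sobol indices can be sketched with a standard pick-freeze estimator; the paper's auxiliary-variable, ARMA, and controlled-seed machinery are not reproduced here, and the model is a toy assumption.

```python
# First-order Sobol indices via a pick-freeze estimator: a sketch of the
# variance decomposition the paper's framework computes. Toy model only.
import numpy as np

rng = np.random.default_rng(4)
N = 100_000

def model(a, b):
    return a + 2.0 * b + a * b           # toy deterministic mapping

A = rng.normal(size=(N, 2))              # two independent input matrices
B = rng.normal(size=(N, 2))
y_a = model(A[:, 0], A[:, 1])
y_b = model(B[:, 0], B[:, 1])
var_y = y_a.var()

for j in range(2):                       # one first-order index per input
    AB = B.copy()
    AB[:, j] = A[:, j]                   # 'freeze' column j from sample A
    y_ab = model(AB[:, 0], AB[:, 1])
    S_j = np.mean(y_a * (y_ab - y_b)) / var_y
    print(f"S_{j + 1} = {S_j:.3f}")      # expect about 1/6 and 4/6 here
```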

8.
9.
The response of a structure with random parameters (e.g. stiffness or damping coefficients) subjected to a random load is investigated. The structural behaviour can be either linear or non-linear, but the forcing process is assumed to be second-order Gaussian. A modified series expansion is used to determine response statistics in the various cases of a Duffing oscillator with random stiffness and damping coefficients subjected to a deterministic sinusoidal force, a random-amplitude sinusoidal force, or a Kanai–Tajimi-type earthquake excitation. Copyright © 1999 John Wiley & Sons, Ltd.
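For comparison with the series-expansion approach, here is a brute-force Monte Carlo sketch of the Duffing case with random stiffness and damping under a deterministic sinusoidal force; the parameter distributions are illustrative assumptions.

```python
# Monte Carlo response statistics of a Duffing oscillator with random
# stiffness and damping under sinusoidal forcing. A brute-force stand-in
# for the paper's series expansion; parameter values are assumptions.
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(5)
N = 200
F0, omega, eps = 1.0, 1.2, 0.5             # forcing amplitude/frequency, cubic term

def duffing(t, z, c, k):
    x, v = z
    return [v, F0 * np.cos(omega * t) - c * v - k * x - eps * x**3]

peaks = np.empty(N)
for i in range(N):
    c = rng.normal(0.2, 0.02)              # random damping coefficient
    k = rng.normal(1.0, 0.10)              # random stiffness
    sol = solve_ivp(duffing, (0, 50), [0, 0], args=(c, k), max_step=0.05)
    peaks[i] = np.abs(sol.y[0]).max()      # peak displacement of this run

print(peaks.mean(), peaks.std())           # response statistics over the ensemble
```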

10.
When analysing the random vibration response of a structure, the Response Surface Method (RSM) can effectively reduce the computational cost of stochastic simulation. However, when the random variables have large coefficients of variation, the traditional response surface method cannot reach the required accuracy. The piecewise response surface method partitions the random variables according to their coefficients of variation, narrowing the approximation range of each response surface, and analyses each partitioned surface independently, thereby improving the approximation accuracy within each subdomain. First, the piecewise response surface method combined with Monte Carlo (MC) sampling is verified on the random vibration response of a single-mass oscillator model. The results show that when the coefficients of variation of the random variables are small, the accuracy of the piecewise method is no lower than that of the traditional RSM, while when the random variables have large coefficients of variation its approximation accuracy is far higher. In addition, the method is further extended and applied to the random vibration of a bridge pier under random seismic excitation.
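A one-dimensional sketch of the piecewise idea: the input range is partitioned, an independent quadratic surface is fitted on each piece, and the surfaces then stand in for the expensive model inside a Monte Carlo loop. The partition, design points, and "structural" response used here are illustrative assumptions.

```python
# Piecewise response surface: fit an independent quadratic surface on each
# partition of the input range, then use the surrogate in a Monte Carlo loop.
# Partitioning, design points and the toy response are assumptions.
import numpy as np

rng = np.random.default_rng(6)

def true_response(k):                        # expensive-model stand-in
    return np.sin(3.0 * k) / k

edges = np.array([0.5, 1.0, 1.5, 2.0])       # partition of the input range
coefs = []
for lo, hi in zip(edges[:-1], edges[1:]):
    kd = np.linspace(lo, hi, 7)              # design points on this piece
    coefs.append(np.polyfit(kd, true_response(kd), 2))

def surrogate(k):
    j = np.clip(np.searchsorted(edges, k) - 1, 0, len(coefs) - 1)
    return np.array([np.polyval(coefs[jj], kk) for jj, kk in zip(j, k)])

k_mc = rng.uniform(0.5, 2.0, 10_000)         # large-scatter random variable
print(surrogate(k_mc).mean(), true_response(k_mc).mean())  # surrogate vs truth
```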

11.
In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, “Quantification of Margins and Uncertainties: Conceptual and Computational Basis,” describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.
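Two of the alternative representations are simple enough to sketch directly: interval propagation of an epistemic parameter through a monotone response, and belief/plausibility from basic probability assignments on focal intervals. All numbers, the response model, and the focal elements are illustrative assumptions.

```python
# Interval propagation plus a tiny evidence-theory (Dempster-Shafer)
# calculation: two alternative epistemic-uncertainty representations.
def response(theta):
    return 10.0 - 2.0 * theta                 # monotone toy performance margin

lo, hi = 1.0, 3.0                             # epistemic interval on theta
print("margin interval:", sorted([response(hi), response(lo)]))

# Basic probability assignments on focal intervals of theta, and the
# belief/plausibility of the event 'theta > 2.5'.
focal = [((1.0, 2.0), 0.6), ((1.5, 3.0), 0.4)]    # (interval, mass)
bel = sum(m for (a, b), m in focal if a > 2.5)    # focal set inside the event
pl = sum(m for (a, b), m in focal if b > 2.5)     # focal set intersects it
print("Bel =", bel, "Pl =", pl)                   # prints Bel = 0, Pl = 0.4
```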

12.
The composite fuzzy random vibration response of truss structures with fuzzy random parameters under fuzzy random load excitation is studied. Considering simultaneously the fuzzy randomness of the structural physical parameters, geometric dimensions and load amplitudes, expressions for the fuzzy random variables of the structural dynamic response are derived from the Duhamel integral using the mode superposition method; the fuzzy numerical characteristics of the fuzzy random dynamic response are then obtained by the moment method for random functions. Finally, a numerical example examines the influence of the fuzzy randomness of the structural parameters and applied loads on the dynamic response, and a Monte Carlo simulation of the example verifies that the proposed model and analysis method are feasible and effective.

13.
A stochastic boundary element method (SBEM) is developed in this work for evaluating the dynamic response of underground openings excited by seismically induced, horizontally polarized shear waves under steady-state conditions. The surrounding geological medium is viewed as an elastic continuum exhibiting large randomness in its mechanical properties, which implies that the wave number of the propagating signal is a function of a random variable. Suitable Green's functions are proposed and used within the context of the SBEM formulation. More specifically, a series expansion for the Green's functions is employed, where the basis functions are orthogonal polynomials of a random argument (polynomial chaos). These are subsequently incorporated in the SBEM formulation, which employs the usual quadratic, isoparametric line elements for modeling the surfaces of the problem in question. Finally, this formulation is used for the solution of a few problems of engineering interest involving buried cavities (tunnels). We note that the present approach departs from earlier boundary element derivations based on perturbations, which are valid for ‘small’ amounts of randomness in the elastic continuum.
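The polynomial chaos ingredient can be sketched in scalar form: projecting a quantity that depends on a random wavenumber onto probabilists' Hermite polynomials by Gauss-Hermite quadrature. The wavenumber dependence below is an assumed toy form, not the paper's Green's functions.

```python
# Hermite polynomial chaos projection of a quantity with a random
# wavenumber k(xi) = k0 * exp(sigma * xi), xi ~ N(0,1). Toy form only.
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermegauss, hermeval

xi_q, w_q = hermegauss(30)                   # probabilists' Gauss-Hermite rule
w_q = w_q / np.sqrt(2.0 * np.pi)             # weights for expectations under N(0,1)

k0, sigma = 2.0, 0.2
g = np.exp(1j * k0 * np.exp(sigma * xi_q))   # toy response at the quadrature nodes

P = 5                                        # expansion order
coeffs = []
for n in range(P + 1):
    e_n = np.zeros(n + 1); e_n[n] = 1.0
    psi_n = hermeval(xi_q, e_n)              # He_n at the quadrature nodes
    coeffs.append(np.sum(w_q * g * psi_n) / factorial(n))  # E[g He_n]/E[He_n^2]

print(np.round(np.abs(coeffs), 4))           # chaos spectrum decays with order
```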

14.
Performance assessment of complex systems is ideally done through full system-level testing, which is seldom available for high-consequence systems. Moreover, a reality of engineering practice is that some features of system behavior are known not from experimental data but only from expert assessment, whereas data on the individual components that make up the full system are more readily available. The lack of system-level data and the complexity of the system lead to a need to build computational models in a hierarchical or building-block approach (from simple components to the full system). The models are then used for performance prediction in lieu of experiments, to estimate the confidence in the performance of these systems. Central to this is the need to quantify the uncertainties present in the system and to compare the system response to an expected performance measure. This is the basic idea behind Quantification of Margins and Uncertainties (QMU). QMU is applied in decision making: there are many uncertainties caused by inherent variability (aleatory) in materials, configurations, environments, etc., and by lack of information (epistemic) in models for the deterministic and random variables that influence system behavior and performance. This paper proposes a methodology to quantify margins and uncertainty in the presence of both aleatory and epistemic uncertainty. It presents a framework based on Bayes networks that uses available data at multiple levels of complexity (i.e. components, subsystems, etc.) and demonstrates a method to incorporate epistemic uncertainty given in terms of intervals on a model parameter.
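A minimal margin-over-uncertainty calculation of the kind QMU formalizes, with an epistemic interval on a model parameter scanned over its endpoints; the threshold, the response model, and the choice of U as the aleatory standard deviation are all illustrative assumptions.

```python
# Minimal QMU-style margin/uncertainty ratio: margin M between a performance
# threshold and the predicted response, uncertainty U as the aleatory spread,
# with an epistemic interval on a model parameter. All values are assumptions.
import numpy as np

rng = np.random.default_rng(7)
threshold = 10.0                             # required performance bound
N = 20_000

ratios = []
for theta in (0.9, 1.0, 1.1):                # epistemic interval: ends and midpoint
    y = theta * rng.normal(7.0, 0.8, N)      # aleatory response prediction
    margin = threshold - y.mean()            # M: distance to the threshold
    uncert = y.std(ddof=1)                   # U: aleatory spread (one choice)
    ratios.append(margin / uncert)

print(min(ratios))                           # conservative confidence ratio M/U
```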

15.
A mathematical model of damage evolution in heterogeneous materials, which takes into account the random nature of local failure, is developed on the basis of the theory of stochastic equations. A damage evolution law is obtained which allows for the energy dissipation due to new surface formation as well as the influence of local (thermal) fluctuations. The kinetic differential equation for the time-dependent probability distribution of a damage parameter is derived theoretically. Damage evolution and damage localization under dynamic loading are investigated numerically on the basis of the model developed.
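One way to realize such a model numerically is Euler-Maruyama integration of a stochastic damage-evolution law, whose ensemble histogram approximates the time-dependent damage distribution governed by the kinetic equation; the drift and diffusion forms below are illustrative assumptions, not the paper's.

```python
# Euler-Maruyama integration of a stochastic damage-evolution law
# dD = f(D) dt + g(D) dW; the ensemble histogram approximates the
# time-dependent damage distribution. Drift/diffusion are assumptions.
import numpy as np

rng = np.random.default_rng(8)
N, T, dt = 5000, 1.0, 1e-3
steps = int(T / dt)

D = np.zeros(N)                              # damage parameter in [0, 1]
for _ in range(steps):
    drift = 0.8 * D * (1.0 - D) + 0.05       # deterministic growth law
    noise = 0.1 * np.sqrt(D * (1.0 - D) + 1e-9)  # local-fluctuation strength
    D += drift * dt + noise * np.sqrt(dt) * rng.standard_normal(N)
    D = np.clip(D, 0.0, 1.0)                 # keep the damage physical

hist, _ = np.histogram(D, bins=20, range=(0, 1), density=True)
print(hist.round(2))                         # empirical damage distribution at T
```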

16.
There are inherent uncertainties in the biodiesel production process, arising from feedstock composition and from operating and design parameters, and they can have a significant impact on product quality and process economics. In this paper, these uncertainties are quantified in the form of probability distribution functions. A stochastic modeling capability is implemented in the ASPEN process simulator to take these uncertainties into consideration, and the output is evaluated to determine their impact on process efficiency and biodiesel quality.

17.
This article proposes a new method for hybrid reliability-based design optimization under random and interval uncertainties (HRBDO-RI). In this method, Monte Carlo simulation (MCS) is employed to estimate the upper bound of the failure probability, and stochastic sensitivity analysis (SSA) is extended to calculate the sensitivity information of the failure probability in HRBDO-RI. Because a large number of samples is involved in MCS and SSA, Kriging metamodels are constructed to substitute for the true constraints. To avoid unnecessary computational cost in Kriging metamodel construction, a new screening criterion based on the coefficient of variation of the failure probability is developed to identify the active constraints in HRBDO-RI. A projection-outline-based active-learning Kriging is then achieved by sequentially selecting update points around the projection outlines on the limit-state surfaces of the active constraints. Furthermore, the prediction uncertainty of the Kriging metamodel is quantified and considered in the termination criterion for the Kriging update. Several examples, including a piezoelectric energy harvester design, are presented to test the accuracy and efficiency of the proposed method for HRBDO-RI.
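The basic surrogate-plus-MCS step can be sketched with an off-the-shelf Gaussian process in place of the Kriging metamodel; the limit state, training design, kernel, and input law are assumptions, and the active-learning update and interval treatment are not reproduced.

```python
# Kriging-type metamodel substituted for the true constraint inside Monte
# Carlo simulation, with the coefficient of variation of the failure-
# probability estimate. Limit state, design and kernel are assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(9)

def g(x):                                    # true limit state; failure when g <= 0
    return 6.0 - x[:, 0] ** 2 - x[:, 1]

X_train = rng.uniform(-4, 4, (40, 2))        # small design of experiments
gp = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True)
gp.fit(X_train, g(X_train))                  # surrogate for the expensive g

X_mc = rng.normal(0, 1.5, (100_000, 2))      # MCS population (assumed input law)
pf_hat = np.mean(gp.predict(X_mc) <= 0)      # failure probability via surrogate
cov = np.sqrt((1 - pf_hat) / (pf_hat * len(X_mc)))  # CoV of the MCS estimate
print(pf_hat, cov)
```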

18.
Atmospheric corrosion of metals is the most common type of corrosion and has a significant impact on the environment and operational safety in many situations of everyday life; common examples can be observed in land, water and air transportation systems, electronic circuit boards, and urban and offshore infrastructure. Dew drops formed on a metal surface by condensation of atmospheric moisture act as the electrolyte that facilitates corrosion. The corrosion mechanisms under these droplets differ from classically known bulk-electrolyte corrosion. Because the droplet electrolyte is thin and geometrically non-uniform, atmospheric oxygen requires only a short diffusion path to reach the metal surface. Corrosion under a droplet is driven by the depletion of oxygen at the centre of the droplet relative to the edge, known as differential aeration. For a larger droplet, differential aeration leads to preferential cathodic activity at the edge and is controlled by the droplet geometry, whereas for a smaller droplet the oxygen concentration remains uniform and the cathodic activity is not controlled by geometry. The geometry of condensed droplets varies dynamically with changing environmental parameters, influencing the corrosion mechanisms as the droplets evolve in size. This review presents the various modelling approaches used to simulate corrosion under droplet electrolytes. In the effort to develop a comprehensive model for estimating corrosion rates, it is noted that the influence on corrosion mechanisms of the geometric evolution of the droplet through condensation/evaporation has yet to be modelled. Dynamically varying external factors such as environmental temperature, relative humidity, and the presence of hygroscopic salts and pollutants influence the evolution of the droplet electrolyte, making it a complex phenomenon to investigate. Therefore, an overview of available dropwise condensation and evaporation models describing the formation and evolution of droplet geometry is also presented from an atmospheric-corrosion viewpoint.

19.
Based on the nonlinear stiffness-softening characteristic of a single tower obtained from finite element analysis, combined with the dynamic characteristics of the conductors under mean wind, a simplified MDOF model of the coupled transmission tower-line system is established from the Lagrange equations. The model is then used with the statistical linearization method to compute an approximate frequency-domain solution of the nonlinear random dynamic response of the system under strong wind loads. Comparison with numerical simulation results shows that the method not only achieves reasonable accuracy but can also identify the critical wind load for dynamic instability of the system.
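A scalar sketch of the statistical linearization procedure: for a stiffness-nonlinear (Duffing-type) oscillator under white noise, the equivalent stiffness and the response variance are iterated to a fixed point. Parameter values are illustrative assumptions, not the tower-line model's.

```python
# Equivalent statistical linearization of a stiffness-nonlinear SDOF
# oscillator x'' + c x' + k (x + eps x^3) = w(t) under white noise:
# fixed-point iteration on the response variance. Values are assumptions.
import numpy as np

c, k, eps = 0.05, 1.0, 0.3          # damping, stiffness, cubic-term ratio
S0 = 0.01                           # two-sided white-noise spectral density

var = np.pi * S0 / (c * k)          # start from the linear-system variance
for _ in range(100):
    k_eq = k * (1.0 + 3.0 * eps * var)   # Gaussian closure: mean slope of x + eps*x^3
    var_new = np.pi * S0 / (c * k_eq)    # response variance of the linearized system
    if abs(var_new - var) < 1e-12:
        break
    var = var_new

print(k_eq, var)                    # equivalent stiffness, stationary response variance
```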

20.
Using poplar wood fibre (WF) as the reinforcing material, high-density polyethylene (HDPE) as the matrix and maleic anhydride grafted polyethylene (MAPE) as the coupling agent, WF/HDPE composites were prepared by melt extrusion. With the WF content, coupling-agent dosage and extrusion temperature as independent variables and the impact strength, flexural strength and tensile strength of the specimens as response values, experiments were designed by the Box-Behnken Design method and the response surface method was used to establish WF/H...
