Similar Literature
20 similar records retrieved.
1.
The traditional reliability analysis method based on the probabilistic approach requires the probability distributions of all uncertain parameters. In practical applications, however, the distributions of some parameters may not be precisely known owing to a lack of sufficient sample data. Probabilistic theory cannot directly measure the reliability of structures with epistemic uncertainty, i.e., subjective randomness and fuzziness. Hence, a hybrid reliability analysis (HRA) problem arises when aleatory and epistemic uncertainties coexist in a structure. In this paper, by combining probability theory and uncertainty theory into a chance theory, a probability-uncertainty hybrid model is established, and a new quantification method for structural reliability based on uncertain random variables is presented in order to simultaneously satisfy the duality of random variables and the subadditivity of uncertain variables; a reliability index is then defined from the chance expected value and variance. In addition, formulas for the chance-theory-based reliability and reliability index are derived to uniformly assess the reliability of structures under hybrid aleatory and epistemic uncertainties. Numerical experiments illustrate the validity of the proposed method, whose results provide a more accurate assessment of the structural system under mixed uncertainties than those obtained separately from probability theory or uncertainty theory.
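As a sketch of what a chance-based reliability index of this kind can look like, written by analogy with the classical second-moment index (the notation and the exact form are assumptions, not the paper's formula):

```latex
% Hedged sketch: a second-moment style index built from the chance expected
% value E_ch and chance variance V_ch of the margin g(\xi,\eta), where \xi
% denotes the random variables and \eta the uncertain variables (assumed).
\beta_{\mathrm{ch}}
  = \frac{E_{\mathrm{ch}}\!\left[g(\xi,\eta)\right]}
         {\sqrt{V_{\mathrm{ch}}\!\left[g(\xi,\eta)\right]}},
\qquad
P_f = \mathrm{Ch}\{g(\xi,\eta) < 0\}
```

Here the chance measure Ch plays the role that the probability measure plays in the classical index.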

2.
Uncertainty quantification (UQ) is the process of determining the effect of input uncertainties on response metrics of interest. These input uncertainties may be characterized as either aleatory uncertainties, which are irreducible variabilities inherent in nature, or epistemic uncertainties, which are reducible uncertainties resulting from a lack of knowledge. When both aleatory and epistemic uncertainties are mixed, it is desirable to maintain a segregation between aleatory and epistemic sources so that their contributions to the total uncertainty are easy to separate and identify. Current production analyses for mixed UQ employ nested sampling, where each sample taken from the epistemic distributions at the outer loop results in an inner-loop sampling over the aleatory probability distributions. This paper demonstrates new algorithmic capabilities for mixed UQ in which the analysis procedures are more closely tailored to the requirements of aleatory and epistemic propagation. Through the combination of stochastic expansions for computing statistics and interval optimization for computing bounds, interval-valued probability, second-order probability, and Dempster-Shafer evidence theory approaches to mixed UQ are shown to be more accurate and efficient than previously achievable.
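The nested outer-epistemic/inner-aleatory sampling that this work improves upon can be sketched as follows. The limit state g(x) = 3 - x, the interval-valued mean, and the sample sizes are illustrative assumptions, not the paper's problems:

```python
import random

def inner_failure_prob(mu, n_inner, rng):
    """Aleatory (inner) loop: estimate P(g < 0) for g(x) = 3 - x,
    where the aleatory variable x ~ N(mu, 1)."""
    fails = sum(1 for _ in range(n_inner) if rng.gauss(mu, 1.0) > 3.0)
    return fails / n_inner

def interval_failure_prob(mu_lo, mu_hi, n_outer=50, n_inner=2000, seed=0):
    """Epistemic (outer) loop: sweep the interval-valued mean [mu_lo, mu_hi]
    and keep the bounds of the resulting failure probability, yielding an
    interval-valued probability."""
    rng = random.Random(seed)
    probs = []
    for i in range(n_outer):
        mu = mu_lo + (mu_hi - mu_lo) * i / (n_outer - 1)
        probs.append(inner_failure_prob(mu, n_inner, rng))
    return min(probs), max(probs)

p_lo, p_hi = interval_failure_prob(0.0, 1.0)
```

Every outer sample triggers a full inner sampling run, which is exactly the cost the tailored stochastic-expansion and interval-optimization approaches aim to avoid.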

3.
This article proposes a new method for hybrid reliability-based design optimization under random and interval uncertainties (HRBDO-RI). In this method, Monte Carlo simulation (MCS) is employed to estimate the upper bound of the failure probability, and stochastic sensitivity analysis (SSA) is extended to calculate the sensitivity information of the failure probability in HRBDO-RI. Because a large number of samples is involved in MCS and SSA, Kriging metamodels are constructed to substitute for the true constraints. To avoid unnecessary computational cost in Kriging metamodel construction, a new screening criterion based on the coefficient of variation of the failure probability is developed to identify active constraints in HRBDO-RI. A projection-outline-based active learning Kriging is then achieved by sequentially selecting update points around the projection outlines on the limit-state surfaces of the active constraints. Furthermore, the prediction uncertainty of the Kriging metamodel is quantified and considered in the termination of the Kriging update. Several examples, including a piezoelectric energy harvester design, are presented to test the accuracy and efficiency of the proposed method for HRBDO-RI.

4.
Epistemic and aleatory uncertain variables commonly coexist in multidisciplinary systems and can be modeled by probability theory and evidence theory, respectively. The propagation of uncertainty through coupled subsystems and the strong nonlinearity of the multidisciplinary system make reliability analysis difficult and computationally expensive. In this paper, a novel reliability analysis procedure is proposed for multidisciplinary systems with epistemic and aleatory uncertain variables. First, the probability density function of the aleatory variables is approximated by a piecewise uniform distribution based on the Bayes method, and an approximate most probable point is found by an equivalent normalization method. Then, importance sampling is used to calculate the failure probability together with its variance and coefficient of variation. The effectiveness of the procedure is demonstrated by two numerical examples. Copyright © 2013 John Wiley & Sons, Ltd.

5.
There will be simplifying assumptions and idealizations in the availability models of complex processes and phenomena. These simplifications and idealizations generate uncertainties, which can be classified as aleatory (arising from randomness) and/or epistemic (due to lack of knowledge). The problem of acknowledging and treating uncertainty is vital for the practical usability of reliability analysis results. Distinguishing between the two types of uncertainty is useful for making reliability/risk-informed decisions with confidence and for effective management of uncertainty. In level-1 probabilistic safety assessment (PSA) of nuclear power plants (NPP), the current practice is to carry out epistemic uncertainty analysis on the basis of a simple Monte-Carlo simulation that samples the epistemic variables in the model. The aleatory uncertainty, however, is neglected, and point estimates of the aleatory variables, viz., time to failure and time to repair, are used. Treating both types of uncertainty requires a two-phase Monte-Carlo simulation in which the outer loop samples epistemic variables and the inner loop samples aleatory variables. A methodology based on two-phase Monte-Carlo simulation is presented for distinguishing both kinds of uncertainty in the context of availability/reliability evaluation in level-1 PSA studies of NPP.

6.
This paper focuses on sensitivity analysis of results from computer models in which both epistemic and aleatory uncertainties are present. Sensitivity is defined in the sense of "uncertainty importance" in order to identify and rank the principal sources of epistemic uncertainty. A natural and consistent way to arrive at sensitivity results in such cases would be a two-dimensional or double-loop nested Monte Carlo sampling strategy in which the epistemic parameters are sampled in the outer loop and the aleatory variables are sampled in the nested inner loop. However, the computational effort of this procedure may be prohibitive for complex and time-demanding codes. This paper therefore suggests an approximate method for sensitivity analysis based on particular one-dimensional or single-loop sampling procedures, which require substantially less computational effort. From the results of such sampling one can obtain approximate estimates of several standard uncertainty importance measures for the aleatory probability distributions and related probabilistic quantities of the model outcomes of interest. The reliability of the approximate sensitivity results depends on the effect of all epistemic uncertainties on the total joint epistemic and aleatory uncertainty of the outcome. The magnitude of this effect can be expressed quantitatively and estimated from the same single-loop samples: the larger it is, the more accurate the approximate sensitivity results will be. A case study, which shows that the results from the proposed approximate method are comparable to those obtained with the full two-dimensional approach, is provided.

7.
This study proposes a data-driven method for assessing reliability based on scarce input datasets with multidimensional correlation. Since treating the distribution parameters estimated from a scarce dataset as those of the population may introduce epistemic uncertainty, the bootstrap resampling algorithm is adopted to infer the distribution parameters as interval parameters. To account for variable dependence, vine copula theory is utilized to construct the joint probability density function (PDF) of the input variables, and maximum likelihood estimation (MLE) and Akaike information criterion (AIC) analysis are employed to select optimal copulas for the vine structure based on the samples. Subsequently, the failure probability bounds of a response function are calculated from the constructed joint PDF with interval distribution parameters by the active learning Kriging (AK) method combined with the sparse grid integration (SGI) method. Finally, several examples are provided to demonstrate the feasibility and efficiency of the proposed method.
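The bootstrap step that turns a scarce-data point estimate into an interval parameter can be sketched as follows. The dataset, the sample size, and the use of the sample mean as the inferred parameter are illustrative assumptions; the paper's vine-copula and Kriging machinery is not reproduced here:

```python
import random
import statistics

def bootstrap_param_interval(sample, n_boot=1000, alpha=0.05, seed=1):
    """Percentile-bootstrap interval for a distribution parameter (here the
    mean), treating the estimate from scarce data as an interval rather than
    a single point."""
    rng = random.Random(seed)
    means = sorted(
        statistics.fmean(rng.choices(sample, k=len(sample)))  # resample with replacement
        for _ in range(n_boot)
    )
    lo = means[int(n_boot * alpha / 2)]            # 2.5th percentile
    hi = means[int(n_boot * (1 - alpha / 2)) - 1]  # 97.5th percentile
    return lo, hi

data = [4.8, 5.1, 5.6, 4.9, 5.3, 5.0, 5.4, 4.7]  # scarce sample (assumed)
lo, hi = bootstrap_param_interval(data)
```

The resulting interval [lo, hi] would then feed into a reliability analysis as an interval-valued distribution parameter, producing failure probability bounds rather than a single value.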

8.
Epistemic uncertainty analysis is an essential feature of any model application subject to 'state of knowledge' uncertainties. Such analysis is usually carried out on the basis of a Monte Carlo simulation sampling the epistemic variables and performing the corresponding model runs. In situations, however, where aleatory uncertainties are also present in the model, an adequate treatment of both types of uncertainties would require a two-stage nested Monte Carlo simulation, i.e. sampling the epistemic variables ('outer loop') and nested sampling of the aleatory variables ('inner loop'). It is clear that for complex and long-running codes the computational effort to perform all the resulting model runs may be prohibitive. Therefore, an approximate epistemic uncertainty analysis is suggested which is based solely on two simple Monte Carlo samples: (a) joint sampling of the epistemic and aleatory variables simultaneously, and (b) sampling of the aleatory variables alone with the epistemic variables held fixed at their reference values. The applications of this approach to dynamic reliability analyses presented in this paper look quite promising and suggest that performing such an approximate epistemic uncertainty analysis is preferable to the alternative of not performing any.

9.
Optimization leads to specialized structures which are not robust to disturbance events like unanticipated abnormal loading or human errors. Typical reliability-based and robust optimization mainly address objective aleatory uncertainties. To date, the impact of subjective epistemic uncertainties in optimal design has not been comprehensively investigated. In this paper, we use an independent parameter to investigate the effects of epistemic uncertainties in optimal design: the latent failure probability. Reliability-based and risk-based truss topology optimization are addressed. It is shown that optimal risk-based designs can be divided in three groups: (A) when epistemic uncertainty is small (in comparison to aleatory uncertainty), the optimal design is indifferent to it and yields isostatic structures; (B) when aleatory and epistemic uncertainties are relevant, optimal design is controlled by epistemic uncertainty and yields hyperstatic but nonredundant structures, for which expected costs of direct collapse are controlled; (C) when epistemic uncertainty becomes too large, the optimal design becomes redundant, as a way to control increasing expected costs of collapse. The three regions above are divided by hyperstatic and redundancy thresholds. The redundancy threshold is the point where the structure needs to become redundant so that its reliability becomes larger than the latent reliability of the simplest isostatic system. Simple truss topology optimization is considered herein, but the conclusions have immediate relevance to the optimal design of realistic structures subject to aleatory and epistemic uncertainties.

10.
Assessing the failure probability of a thermal–hydraulic (T–H) passive system amounts to evaluating the uncertainties in its performance. Two different sources of uncertainty are usually considered: randomness due to inherent variability in the system behavior (aleatory uncertainty) and imprecision due to lack of knowledge and information on the system (epistemic uncertainty). In this paper, we are concerned with the epistemic uncertainties affecting the model of a T–H passive system and the numerical values of its parameters. Due to these uncertainties, the system may find itself in working conditions that do not allow it to accomplish its functions as required. The probability of these functional failures can be estimated by Monte Carlo (MC) sampling of the epistemic uncertainties affecting the model and its parameters, followed by computation of the system response with a mechanistic T–H code. Efficient sampling methods are needed to achieve accurate estimates with reasonable computational effort. In this respect, the recently developed Line Sampling (LS) method is considered here for improving the MC sampling efficiency. The method, originally developed to solve high-dimensional structural reliability problems, employs lines instead of random points in order to probe the failure domain of interest. An "important direction" pointing towards the failure domain is determined, and the high-dimensional reliability problem is then reduced to a number of conditional one-dimensional problems solved along that direction. This makes it possible to significantly reduce the variance of the failure probability estimator with respect to standard random sampling. The efficiency of the method is demonstrated by comparison with the commonly adopted Latin Hypercube Sampling (LHS) and the first-order reliability method (FORM) in a functional failure analysis of a passive decay heat removal system in a gas-cooled fast reactor (GFR) from the literature.
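The Line Sampling idea of reducing each sampled line to a conditional one-dimensional problem along the important direction can be illustrated on a toy case. The limit state g(u1, u2) = beta - u1 - 0.1*u2**2, the important direction e1, and the sample size are assumptions for illustration, not the passive system studied in the paper:

```python
import math
import random

def phi(z):
    """Standard normal CDF computed via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def line_sampling_pf(beta=2.5, n_lines=500, seed=2):
    """Line Sampling for g(u1, u2) = beta - u1 - 0.1*u2**2 in standard
    normal space, with important direction alpha = e1. Each line through a
    random point reduces to a 1-D problem: the root of g along alpha is
    c = beta - 0.1*u2**2, and the line contributes Phi(-c) to the estimate."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_lines):
        u2 = rng.gauss(0.0, 1.0)    # component orthogonal to alpha
        c = beta - 0.1 * u2 * u2    # 1-D root along the important direction
        total += phi(-c)            # conditional (per-line) failure probability
    return total / n_lines

pf = line_sampling_pf()
```

Because each line contributes an exact conditional probability instead of a 0/1 indicator, the estimator's variance is far smaller than that of standard Monte Carlo at the same number of model evaluations, which is the variance reduction the abstract refers to.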

11.
This paper develops a novel computational framework to compute the Sobol indices that quantify the relative contributions of various uncertainty sources towards the system response prediction uncertainty. In the presence of both aleatory and epistemic uncertainty, two challenges are addressed in this paper for the model-based computation of the Sobol indices: due to data uncertainty, input distributions are not precisely known; and due to model uncertainty, the model output is uncertain even for a fixed realization of the input. An auxiliary variable method based on the probability integral transform is introduced to distinguish and represent each uncertainty source explicitly, whether aleatory or epistemic. The auxiliary variables facilitate building a deterministic relationship between the uncertainty sources and the output, which is needed in the Sobol indices computation. The proposed framework is developed for two types of model inputs: random variable input and time series input. A Bayesian autoregressive moving average (ARMA) approach is chosen to model the time series input due to its capability to represent both natural variability and epistemic uncertainty due to limited data. A novel controlled-seed computational technique based on pseudo-random number generation is proposed to efficiently represent the natural variability in the time series input. This controlled-seed method significantly accelerates the Sobol indices computation under time series input, and makes it computationally affordable.
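The first-order Sobol index computation that such a framework builds on can be illustrated with a generic pick-freeze (Saltelli-type) estimator. The additive test function, uniform inputs, and sample size below are assumptions for illustration; the paper's auxiliary-variable and ARMA machinery is not reproduced:

```python
import random

def sobol_first_order(f, dim, n=20000, seed=3):
    """Pick-freeze (Saltelli-type) estimator of first-order Sobol indices
    for a function f of dim i.i.d. U(0,1) inputs."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    yA = [f(a) for a in A]
    yB = [f(b) for b in B]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n
    S = []
    for i in range(dim):
        # A_B^i: matrix A with column i replaced by column i of B
        yABi = [f(a[:i] + [b[i]] + a[i + 1:]) for a, b in zip(A, B)]
        v_i = sum(yb * (yab - ya) for yb, yab, ya in zip(yB, yABi, yA)) / n
        S.append(v_i / var)
    return S

# additive test model Y = X1 + 2*X2: analytic indices are S1 = 0.2, S2 = 0.8
S = sobol_first_order(lambda x: x[0] + 2.0 * x[1], dim=2)
```

Each index estimates the fraction of output variance attributable to one input alone; the deterministic input-output mapping this estimator requires is exactly what the paper's auxiliary variables provide under epistemic uncertainty.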

12.
This article reports a new methodology based on an active learning Kriging model for hybrid reliability analysis (HRA) with both random and interval variables. Unlike in probabilistic reliability analysis, the limit state surface (LSS) of HRA is projected into a banded region in the domain of the random variables, and approximating only the bounds of this banded region is sufficient to meet the accuracy requirement of HRA. In the proposed methodology, the HRA problem is transformed into a traditional system reliability analysis (SRA) problem with numerous failure modes. A basic idea from the field of SRA is then borrowed for HRA, and the so-called truncated candidate region (TCR) for HRA is proposed. In each iteration, the negligible region that probably does not influence the bounds estimation of the failure probability is truncated from the original candidate region, and the optimal training point is chosen from the TCR. After several iterations, the TCR converges to the true ideal candidate region, that is, the candidate region without the inner part of the LSS, and the added training points are driven to the region around the bounds of the LSS. The performance of the proposed method is compared with relevant methods in five case studies.

13.
To address the low efficiency of structural reliability analysis under random-interval mixed uncertainties (RIMU), this paper develops a line sampling (LS) method for RIMU. The proposed LS divides the reliability analysis under RIMU into two stages: Markov chain simulation is used to efficiently search for the design point in the first stage, and the upper and lower bounds of the failure probability are then estimated by LS in the second stage. To improve computational efficiency, a Kriging model is employed to reduce the number of model evaluations in both stages. For efficiently searching for the design point, the Kriging model is constructed and adaptively updated in the first stage to accurately recognize the Markov chain candidate state; it is then sequentially updated by an improved U learning function in the second stage to accurately estimate the failure probability bounds. The proposed LS under RIMU with a Kriging model not only reduces the number of model evaluations but also decreases the size of the candidate sample pool for constructing the Kriging model in both stages. The presented examples demonstrate the superior computational efficiency and accuracy of the proposed method in comparison with several existing methods.

14.
The problem of accounting for epistemic uncertainty in risk management decisions is conceptually straightforward, but is riddled with practical difficulties. Simple approximations are often used whereby future variations in epistemic uncertainty are ignored or worst-case scenarios are postulated. These strategies tend to produce sub-optimal decisions. We develop a general framework based on Bayesian decision theory and exemplify it for the case of seismic design of buildings. When temporal fluctuations of the epistemic uncertainties and regulatory safety constraints are included, the optimal level of seismic protection exceeds the normative level at the time of construction. Optimal Bayesian decisions do not depend on the aleatory or epistemic nature of the uncertainties, but only on the total (epistemic plus aleatory) uncertainty and how that total uncertainty varies randomly during the lifetime of the project.

15.
By means of several examples from a recent comprehensive space nuclear risk analysis of the Cassini mission, a scenario and consequence representational framework is presented for risk analysis of space nuclear power systems in the context of epistemic and aleatory uncertainties. The framework invites the use of probabilistic models for the calculation of both event probabilities and scenario consequences. Each scenario is associated with a frequency that may include both aleatory and epistemic uncertainties. The outcome of each scenario is described in terms of an end state vector. The outcome of each scenario is also characterized by a source term. In this paper, the source term factors of interest are number of failed clads in the space nuclear power system, amount of fuel released and amount of fuel that is potentially respirable. These are also subject to uncertainties. The 1990 work of Apostolakis is found to be a useful formalism from which to derive the relevant probabilistic models. However, an extension to the formalism was necessary to accommodate the situation in which aleatory uncertainty is represented by changes in the form of the probability function itself, not just its parameters. Event trees that show reasonable alternative accident scenarios are presented. A grouping of probabilities and consequences is proposed as a useful structure for thinking about uncertainties. An example of each category is provided. Concluding observations are made about the judgments involved in this analysis of uncertainties and the effect of distinguishing between aleatory and epistemic uncertainties.

16.
Ma Junming, Li Hui, Lan Chengming, Liu Caiping. Engineering Mechanics (《工程力学》), 2022, 39(3): 11-22, 63
This paper focuses on a structural-system reliability updating model based on observed information and its rejection sampling algorithm. A failure-probability updating model incorporating observed information is established on the basis of Bayesian theory. Likelihood functions of the random variables under inequality-type and equality-type observation events are formulated according to the event type, and the corresponding posterior probability density functions are derived. A rejection sampling strategy for drawing posterior samples of the random variables is determined from the observation-information domain, the sampling efficiency of the rejection sampling algorithm is investigated, and formulas for the updated failure-probability estimate of the structural system and its standard deviation are derived. The method is applied to system reliability updating of a rigid frame structure undergoing plastic failure. The results show that the conditional failure-probability updating model incorporating observed information can be transformed into an integral of the posterior probability density of the random variables over the failure domain, and that constructing prior samples satisfying the observation-information domain as posterior samples is a feasible sampling strategy; this strategy can handle structural-system reliability updating with multiple random variables and multiple observations. Larger measured values of resistance-related random variables and higher proof-load levels both reduce the updated failure probability of the structural system; the standard deviation of the measurement error of resistance-related random variables should also be controlled to reduce the uncertainty of the observed information.

17.
This paper develops a methodology to assess the validity of computational models when some quantities may be affected by epistemic uncertainty. Three types of epistemic uncertainty regarding input random variables are considered: interval data, sparse point data, and probability distributions with parameter uncertainty. When the model inputs are described using sparse point data and/or interval data, a likelihood-based methodology is used to represent these variables as probability distributions. Two approaches, a parametric approach and a non-parametric approach, are pursued for this purpose. While the parametric approach leads to a family of distributions due to distribution parameter uncertainty, the principles of conditional probability and total probability can be used to integrate the family of distributions into a single distribution. The non-parametric approach directly yields a single probability distribution. The probabilistic model predictions are compared against experimental observations, which may again be point data or interval data. A generalized likelihood function is constructed for Bayesian updating, and the posterior distribution of the model output is estimated. The Bayes factor metric is extended to assess the validity of the model under both aleatory and epistemic uncertainty and to estimate the confidence in the model prediction. The proposed method is illustrated using a numerical example.

18.
Interval reliability analysis of the "strong shear, weak flexure" requirement for reinforced concrete columns
Yi Weijian, Li Hao. Engineering Mechanics (《工程力学》), 2007, 24(9): 72-79
In the seismic design of reinforced concrete structures, "strong shear, weak flexure" is an important design concept for ensuring structural ductility. Interval variables are introduced to represent epistemic uncertainty, and an interval failure-probability analysis of reinforced concrete frame columns is performed. Uncertainty is described mathematically by combining interval variables, which represent epistemic uncertainty, with random variables, which represent aleatory uncertainty. On this basis, an interval reliability model for "strong shear, weak flexure" is established from the inclusion relations among the basic events, and it is shown from evidence theory that the upper and lower bounds of the failure-probability interval are essentially equivalent to the belief and plausibility functions of evidence theory. For load-capacity calculations involving interval-valued uncertain parameters, the Berz-Taylor model is introduced into the computation to reduce errors caused by interval expansion. In the numerical simulations, a simulated annealing genetic algorithm (SAGA) is used to determine the approximate design interval for "strong shear, weak flexure"; a special sampling function is then constructed from this design interval for importance sampling, yielding the failure-probability interval. Error analysis shows that the method has good accuracy. Finally, worked examples are used to analyze the influence of the various design factors on "strong shear, weak flexure" reliability, and corresponding design recommendations are given.

19.
A new generalized probabilistic approach of uncertainties is proposed for computational models in structural linear dynamics, and it can be extended without difficulty to computational linear vibroacoustics and to computational nonlinear structural dynamics. This method allows the prior probability model of each type of uncertainty (model-parameter uncertainties and modeling errors) to be separately constructed and identified. The modeling errors are not taken into account with the usual output-prediction-error method, but with the nonparametric probabilistic approach of modeling errors recently introduced and based on the use of random matrix theory. The theory, an identification procedure, and a numerical validation are presented. A chaos decomposition with random coefficients is then proposed to represent the prior probabilistic model of the random responses. The random germ is related to the prior probability model of the model-parameter uncertainties. The random coefficients are related to the prior probability model of the modeling errors and thus depend on the random matrices introduced by the nonparametric probabilistic approach. A validation is presented. Finally, a future perspective is introduced for the case in which experimental data are available: the prior probability model of the random coefficients can be improved by constructing a posterior probability model using the Bayesian approach. Copyright © 2009 John Wiley & Sons, Ltd.

20.
This paper deals with the stochastic post-buckling response of functionally graded material (FGM) beams with surface-bonded piezoelectric layers subjected to thermo-electro-mechanical loadings. A C0 nonlinear finite element method using higher-order shear deformation theory with von Karman nonlinearity is used for the basic formulation. Random system parameters, such as the material properties of the FGM and piezoelectric layers and the thermo-electro-mechanical loadings, are modeled as uncorrelated random input variables. First- and second-order perturbation methods and Monte Carlo sampling (MCS) are employed to examine the mean, coefficient of variation, probability distribution function, and probability of failure of the critical post-buckling load. Typical numerical results are presented for various volume fraction indices, slenderness ratios, boundary conditions, piezoelectric layers, and thermo-electro-mechanical loadings with random system properties. The outlined approach is validated against results available in the literature and by employing MCS.
