Similar Documents: 20 results found
1.
In the reliability-based design of engineering systems, it is often required to evaluate the failure probability for different values of the distribution parameters involved in the specification of the design configuration. The failure probability as a function of the distribution parameters is referred to as the 'failure probability function (FPF)' in this work. From first principles, this problem requires repeated reliability analyses to estimate the failure probability for different distribution parameter values, which is a computationally expensive task. A 'weighted approach' is proposed to evaluate the FPF locally and efficiently by means of a single simulation. The basic idea is to rewrite the failure probability estimate for a given set of random samples as a function of the distribution parameters. It is shown that the FPF can be written as a weighted sum of sample values. The latter must be evaluated by system analysis (the most time-consuming task) but do not depend on the distribution parameters. Direct Monte Carlo simulation, importance sampling, and Subset Simulation are incorporated under the proposed approach. Examples are given to illustrate their application.
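The reweighting idea behind such a single-simulation estimate can be sketched as follows. This is a minimal direct Monte Carlo illustration with a hypothetical scalar limit state (failure when X > 3, X normal with unknown mean), not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical limit state: failure when X > 3, with X ~ Normal(mu, 1).
def normal_pdf(x, mu):
    return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2.0 * np.pi)

# Run the expensive system analyses ONCE, under a nominal parameter mu0.
mu0 = 0.0
x = rng.normal(mu0, 1.0, size=1_000_000)
fail = (x > 3.0).astype(float)      # failure indicator per sample

# The same sample set then yields p_F(mu) for any nearby mu: each sample
# is reweighted by the likelihood ratio of the two sampling densities.
def p_fail(mu):
    w = normal_pdf(x, mu) / normal_pdf(x, mu0)
    return float(np.mean(fail * w))

# Exact values for comparison: p_F(0) = Phi(-3), p_F(1) = Phi(-2).
```

The failure indicators (the costly part) are computed once; only the cheap weights change with the distribution parameter mu.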

2.
An efficient strategy to approximate the failure probability function in structural reliability problems is proposed. The failure probability function (FPF) is defined as the failure probability of the structure expressed as a function of the design parameters, which in this study are taken to be distribution parameters of random variables representing uncertain model quantities. Determining the FPF is usually numerically demanding, since repeated reliability analyses are required. The proposed strategy is based on the concept of augmented reliability analysis, which requires only a single run of a simulation-based reliability method. This paper introduces a new sample regeneration algorithm that generates the required failure samples of the design parameters without any additional evaluation of the structural response. In this way, efficiency is further improved while high accuracy in the estimation of the FPF is ensured. To illustrate the efficiency and effectiveness of the method, case studies involving a turbine disk and an aircraft inner flap are included.

3.
Image enhancement based on the maximum entropy principle and gray-level transformation
章秀华, 杨坤涛. 《光电工程》 2007, 34(2): 84-87, 119
An image contrast enhancement method based on the maximum entropy principle and gray-level transformation is proposed. On the basis of the maximum entropy principle, an iterative conditional algorithm is used to optimally classify the image gray levels; a corresponding gray-level transformation is then applied to each class, with the transformation parameters chosen according to need, so that image contrast is enhanced while the uniformity of each region is also greatly improved. The multi-threshold method based on iterative conditional maximization of entropy is compared with a least-mean-square-error (LMSE) multi-threshold method. Experimental results show that the proposed method requires far fewer iterations than the LMSE-based algorithm, saving image processing time, and also yields better equalization.
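The single-threshold version of maximum-entropy gray-level classification (Kapur's criterion) can be sketched as below; the iterative conditional multi-threshold algorithm and the subsequent gray-level transforms of the paper are not reproduced here, and the test histogram is made up:

```python
import numpy as np

def max_entropy_threshold(hist):
    """Kapur's criterion: choose the threshold t that maximizes the sum
    of the entropies of the two gray-level classes [0, t] and (t, 255]."""
    p = hist / hist.sum()
    c = np.cumsum(p)
    best_t, best_h = 0, -np.inf
    for t in range(1, len(p) - 1):
        w0, w1 = c[t], 1.0 - c[t]
        if w0 <= 0 or w1 <= 0:
            continue
        p0 = p[: t + 1] / w0        # within-class distributions
        p1 = p[t + 1 :] / w1
        h = -np.sum(p0[p0 > 0] * np.log(p0[p0 > 0])) \
            - np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# Bimodal test histogram: two well-separated gray-level clusters.
hist = np.zeros(256)
hist[40:60] = 100
hist[180:200] = 100
t = max_entropy_threshold(hist)     # threshold falls between the clusters
```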

4.
The non-Gaussian simulation method using Hermite polynomial expansion presented in a previous article is improved. The aim is to simulate the paths of a strictly stationary non-Gaussian process given the first N moments of its one-dimensional marginal distribution and its autocorrelation function. The new model uses the maximum entropy principle to determine the marginal distribution, which makes it possible to obtain new convergence results. The convergence properties of the new model are then examined, and the method is illustrated by some examples.

5.
The problem of estimating the crystallite orientation distribution function (codf) from the leading texture coefficients is considered. Problems of this type are called moment problems and are well known in statistical mechanics and other areas of science. It is shown how the maximum entropy method can be applied to estimate the codf. Special emphasis is given to a coordinate-free formulation of the problem. The codf is represented by a tensorial Fourier series, and the equations that have to be solved for the estimate of the distribution function are derived for all tensor ranks of the Fourier coefficients. As a numerical example, a model codf is estimated from a set of discrete crystal orientations given by a fully constrained Taylor-type texture simulation.

6.
In this study, a Reliability-Based Optimization (RBO) methodology that uses Monte Carlo Simulation techniques is presented. Typically, the First Order Reliability Method (FORM) is used in RBO for failure probability calculation, and this is accurate enough for most practical cases. However, for highly nonlinear problems it can give extremely inaccurate results and may lead to unreliable designs. Monte Carlo Simulation (MCS) is usually more accurate than FORM but very computationally intensive. In the RBO methodology presented in this paper, limit state approximations are used in conjunction with MCS techniques in an approximate MCS-based RBO that facilitates the efficient calculation of the probabilities of failure. A FORM-based RBO is first performed to obtain the initial limit state approximations. A Symmetric Rank-1 (SR1) variable metric algorithm is used to construct and update the quadratic limit state approximations. The approximate MCS-based RBO uses a conditional-expectation-based MCS, which was chosen over indicator-based MCS because of the smoothness of the probability-of-failure estimates and the availability of analytic sensitivities. The RBO methodology was implemented for an analytic test problem and a higher-dimensional, control-augmented-structure test problem. The results indicate that the SR1 algorithm provides accurate limit state approximations (and therefore accurate estimates of the probabilities of failure) for these test problems. It was also observed that, for both test problems, the RBO methodology required two orders of magnitude fewer analysis calls than an approach using exact limit state evaluations.

7.
This paper presents several approximate algorithms for the confidence probability and confidence interval of the normal distribution commonly used in uncertainty analysis. Computational programs were designed based on the mathematical models of these algorithms, and the actual calculation results were verified.

8.
The maximum entropy principle and the extended maximum entropy principle are introduced, from which the normal, rectangular, and curvilinear trapezoidal distributions used in uncertainty evaluation are derived; the curvilinear trapezoidal distribution is then analyzed and applied.

9.
The construction of probabilistic models in computational mechanics requires the effective construction of probability distributions of random variables in high dimension. This paper deals with the effective construction of the probability distribution in high dimension of a vector-valued random variable using the maximum entropy principle. The integrals in high dimension are then calculated by constructing the stationary solution of an Itô stochastic differential equation associated with its invariant measure. A random generator of independent realizations is explicitly constructed. Three fundamental applications are presented. The first is a new formulation of the stochastic inverse problem related to the construction of the probability distribution in high dimension of an unknown non-stationary random time series (random accelerograms) for which the velocity response spectrum is given. The second is a new formulation related to the construction of the probability distribution of positive-definite band random matrices. Finally, we present an extension of the theory to the case where the support of the probability distribution is not the whole space but any part of it. The third application is then a new formulation related to the construction of the probability distribution of the Karhunen-Loève expansion of non-Gaussian positive-valued random fields. Copyright © 2008 John Wiley & Sons, Ltd.

10.
The maximum entropy principle constrained by probability weighted moments is a useful technique for unbiasedly and efficiently estimating the quantile function of a random variable from a sample of complete observations. However, censored or incomplete data are often encountered in engineering reliability and lifetime distribution analysis. This paper presents a new distribution-free method for estimating the quantile function of a non-negative random variable from a censored sample of data, based on the principle of partial maximum entropy (MaxEnt), in which partial probability weighted moments (PPWMs) are used as constraints. Numerical results and practical examples confirm the accuracy and efficiency of the proposed partial MaxEnt quantile function estimation method for censored samples.

11.
A hybrid Subset Simulation approach is proposed for reliability estimation of general dynamical systems subject to stochastic excitation. This stochastic simulation approach combines the advantages of the two previously proposed Subset Simulation methods: Subset Simulation with the Markov Chain Monte Carlo (MCMC) algorithm and Subset Simulation with splitting. The new method employs the MCMC algorithm before reaching an intermediate failure level and splitting after reaching the level, to exploit the causality of dynamical systems. The statistical properties of the failure probability estimators are derived. Two examples are presented to demonstrate the effectiveness of the new approach and to compare it with the two previous Subset Simulation methods. The results show that the new method is robust to the choice of proposal distribution for the MCMC algorithm and to the intermediate failure events selected for Subset Simulation.
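For readers unfamiliar with the baseline algorithm, plain Subset Simulation with a component-wise (modified Metropolis) MCMC kernel can be sketched as below. This is a pedagogical sketch of ordinary Subset Simulation, not the hybrid method of this paper, and the linear limit state is made up:

```python
import numpy as np

def subset_simulation(g, d, b, n=2000, p0=0.1, seed=1):
    """Estimate P(g(X) >= b) for X ~ N(0, I_d) by Subset Simulation,
    using a component-wise (modified Metropolis) MCMC kernel
    to move between the intermediate failure levels."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n, d))
    y = np.array([g(xi) for xi in x])
    p = 1.0
    for _ in range(20):                        # cap on the number of levels
        thresh = np.quantile(y, 1.0 - p0)
        if thresh >= b:                        # target failure level reached
            return p * float(np.mean(y >= b))
        p *= p0
        idx = np.argsort(y)[-int(p0 * n):]     # seeds: the top p0 fraction
        xs, ys = [], []
        for s, ysd in zip(x[idx], y[idx]):
            cur, ycur = s.copy(), ysd
            for _ in range(n // len(idx)):     # grow one chain per seed
                cand = cur + rng.standard_normal(d)
                # accept each component w.r.t. the standard normal prior...
                keep = rng.random(d) < np.exp(0.5 * (cur**2 - cand**2))
                prop = np.where(keep, cand, cur)
                yprop = g(prop)
                if yprop >= thresh:            # ...then stay in the level
                    cur, ycur = prop, yprop
                xs.append(cur.copy())
                ys.append(ycur)
        x, y = np.array(xs)[:n], np.array(ys)[:n]
    return p * float(np.mean(y >= b))

# Linear limit state g(x) = x1, so P(g >= 3) = Phi(-3), about 1.35e-3.
est = subset_simulation(lambda v: v[0], d=2, b=3.0)
```

Each level multiplies the estimate by the conditional probability p0, which is why small failure probabilities are reached with far fewer samples than crude Monte Carlo.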

12.
In this paper, a novel method to determine the distribution of a random variable from a sample of data is presented. The approach is called the generalized kernel density maximum entropy method, because it adopts a kernel density representation of the target distribution, while its free parameters are determined through the principle of maximum entropy (ME). Here, the ME solution is determined by assuming that the available information is represented by generalized moments, which include the power and the fractional moments as subsets. The proposed method has several important features: (1) it is applicable to distributions with any kind of support; (2) it is computationally efficient, because the ME solution is obtained simply by solving a set of systems of linear equations; (3) it offers a good trade-off between bias and variance; and (4) it gives good estimates of the tails of the distribution, even for small sample sizes. Moreover, the joint application of the generalized kernel density maximum entropy method with bootstrap resampling makes it possible to define credible bounds of the target distribution. The method is first benchmarked on an example of stochastic dynamic analysis. Subsequently, it is used to evaluate the seismic fragility functions of a reinforced concrete frame from a small set of available ground motions.

13.
Using Gaussian quadrature we can find m concentrations of probability that replace the density function of a random variable X and match 2m - 1 of its moments. This reduces a probabilistic analysis to m deterministic ones. Even small values of m provide excellent accuracy in many practical circumstances. When fewer than 2m - 1 moments are known there is arbitrariness in the choice of the concentrations, which is overcome by resorting to the maximum entropy formalism. Its use is here systematized for the case in which a <= X <= b and we know N moments of the density of X, so that calculation of N - 1 integrals suffices for finding the density function and any number of its moments. The approach is illustrated for m = 2 and 3, N = 2, 3, a = 0, b = infinity, and graphs are provided for finding the equivalent concentrations.
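For a standard normal X, the m concentrations are simply the points and weights of the probabilists' Gauss-Hermite rule. A quick numerical check (using NumPy's Hermite module, not tied to this paper) that m = 3 concentrations reproduce the first 2m - 1 = 5 moments:

```python
import numpy as np
from numpy.polynomial import hermite_e

# Nodes and weights of the 3-point probabilists' Gauss-Hermite rule:
# three concentrations of probability for a standard normal X.
nodes, weights = hermite_e.hermegauss(3)
weights = weights / np.sqrt(2.0 * np.pi)   # normalize: weights sum to 1

# The concentrations match E[X^k] exactly for k = 1..5
# (0, 1, 0, 3, 0 for a standard normal).
moments = [float(np.sum(weights * nodes**k)) for k in range(1, 6)]
```

Any expectation E[h(X)] is then approximated by the weighted sum of h at the three nodes, i.e. three deterministic analyses replace the probabilistic one.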

14.
The determination of the exact distribution function of a random phenomenon is not possible from a limited number of observations. Therefore, in the present paper the stochastic properties of a random variable are treated as uncertain quantities, and instead of predefined distribution types the maximum entropy distribution is used. Efficient methods for a reliability analysis considering these uncertain stochastic parameters are presented. Based on approximation strategies, this extended analysis requires no additional limit state function evaluations. Finally, variance-based sensitivity measures are used to evaluate the contribution of the uncertainty of each stochastic parameter to the total variation of the failure probability.

15.
Over the past decade, the civil engineering community has increasingly realized the importance and promise of reliability-based design optimization (RBDO). Several advanced stochastic simulation algorithms for computing the small failure probabilities encountered in reliability analysis of engineering systems have since been developed: Subset Simulation (Au and Beck (2001) [2]), Line Sampling (Schuëller et al. (2004) [3]), the Auxiliary Domain Method (Katafygiotis et al. (2007) [4]), ALIS (Katafygiotis and Zuev (2007) [5]), etc. In this paper we propose a novel advanced stochastic simulation algorithm for solving high-dimensional reliability problems, called Horseracing Simulation (HRS). The key idea behind HRS is as follows. Although the reliability problem itself is high-dimensional, the limit-state function maps this high-dimensional parameter space onto a one-dimensional real line. This mapping transforms a high-dimensional random parameter vector, which may represent the stochastic input load as well as any uncertain structural parameters, into a random variable with unknown distribution, which represents the uncertain structural response. It turns out that the corresponding cumulative distribution function (CDF) of this random variable of interest can be accurately approximated by empirical CDFs constructed from specially designed samples. The generation of samples is governed by a process of 'racing' towards the failure domain, hence the name of the algorithm. The accuracy and efficiency of the new method are demonstrated with a real-life wind engineering example.

16.
In this research, a new method is proposed to update real-time reliability based on data recorded by instruments and sensors installed on a system. The method is founded on Bayesian analysis and subset simulation and is capable of estimating the functional relationship between the real-time failure probability and the monitored value. It is shown that as long as the monitoring data can be reasonably reduced to a single index, this relationship can be obtained; moreover, it can be obtained prior to the monitoring process. Three examples of civil engineering systems are used to demonstrate the new method, which may be applied to safety monitoring both of civil systems under construction and of existing civil systems.

17.
A novel subset simulation algorithm, called parallel subset simulation, is proposed to estimate the small failure probabilities of multiple limit states with only a single subset simulation analysis. As is well known, crude Monte Carlo simulation is inefficient in estimating small probabilities but is applicable to multiple limit states, while ordinary subset simulation is efficient in estimating small probabilities but can only handle a single limit state. The proposed stochastic simulation approach combines the advantages of the two: it is not only efficient in estimating small probabilities but also applicable to multiple limit states. The key idea is to introduce a 'principal variable' that is correlated with all performance functions. The failure probabilities of all limit states can therefore be evaluated simultaneously as the subset simulation algorithm generates samples of the principal variable. The statistical properties of the failure probability estimators are also derived. Two examples are presented to demonstrate the effectiveness of the new approach and to compare it with the crude Monte Carlo and ordinary subset simulation methods.

18.
It is known that a probability distribution satisfies the Maximum Entropy Principle (MEP) if the available data consist of four moments of the probability density function. Two problems are typically associated with the use of the MEP: the definition of the range of acceptable values for the moments M_i, and the evaluation of the coefficients a_j. Both problems have already been accurately resolved by analytical procedures when the first two moments of the distribution are known.

In this work, the analytical solution for the case of four known moments is provided, and a criterion for addressing the general case (whatever the number of known moments) is expounded. The first four moments are expressed in nondimensional form through the expectation and the coefficients of variation, skewness and kurtosis. The range of their acceptable values is obtained from the analytical properties of the differential equations which govern the problem and from the Schwarz inequality.
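Numerically, the coefficients a_j can also be obtained by minimizing the convex dual of the entropy functional. A minimal sketch (assuming SciPy, using the first four moments of a standard normal as a sanity check, a truncated integration interval, and bounds on the coefficients to keep the integrand finite; not the paper's analytical solution):

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize

# Target moments M_i = E[X^i], i = 1..4: those of N(0,1) as a check, so
# the MaxEnt density p(x) proportional to exp(-sum_j a_j x^j) should
# recover a_2 = 1/2 and a_1 = a_3 = a_4 = 0 (the standard normal).
M = np.array([0.0, 1.0, 0.0, 3.0])

def density(a):
    return lambda x: np.exp(-sum(aj * x**(j + 1) for j, aj in enumerate(a)))

def dual(a):
    Z, _ = quad(density(a), -10.0, 10.0)   # partition function
    return np.log(Z) + a @ M               # convex dual of the entropy

# Bounds keep exp(...) finite on the truncated interval [-10, 10].
bounds = [(-1, 1), (0.05, 5), (-0.1, 0.1), (0, 1)]
res = minimize(dual, x0=np.array([0.0, 0.4, 0.0, 0.01]),
               method="L-BFGS-B", bounds=bounds)
a = res.x

Z, _ = quad(density(a), -10.0, 10.0)
def moment(k):                             # moments of the fitted density
    m, _ = quad(lambda x: x**k * density(a)(x), -10.0, 10.0)
    return m / Z
```

At the minimum of the dual, its gradient vanishes, which is exactly the statement that the fitted density reproduces the prescribed moments M_i.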


19.
The generalized corresponding-states principle (GCSP), based on the properties of two nonspherical reference fluids, has been shown to be a powerful technique for the correlation and prediction of thermodynamic properties. In this work we present GCSP calculations of enthalpy and entropy departures for pure fluids and fluid mixtures. The mixtures studied include those conforming well to traditional corresponding-states theory (e.g., n-pentane + n-octane), as well as those that have not hitherto been amenable to such treatments (e.g., n-pentane + ethanol). It is shown that the GCSP method works well for all classes of mixtures and compares favorably with other prediction methods. The use of cubic equations of state to represent the reference fluids gives the GCSP method flexibility while maintaining accuracy in the prediction. No adjustable parameters are required in the GCSP calculations of enthalpy and entropy departures.

20.
In this paper a maximum entropy characterization is presented for Kotz type symmetric multivariate distributions as well as for multivariate Burr and Pareto type III distributions. Analytical formulae for the Shannon entropy of these multivariate distributions are also derived.
