Similar Literature
20 similar documents found.
1.
The maximum entropy principle constrained by probability weighted moments is a useful technique for unbiasedly and efficiently estimating the quantile function of a random variable from a sample of complete observations. However, censored or incomplete data are often encountered in engineering reliability and lifetime distribution analysis. This paper presents a new distribution-free method for estimating the quantile function of a non-negative random variable from a censored sample of data, based on the principle of partial maximum entropy (MaxEnt), in which partial probability weighted moments (PPWMs) are used as constraints. Numerical results and practical examples presented in the paper confirm the accuracy and efficiency of the proposed partial MaxEnt quantile function estimation method for censored samples.
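To make the PPWM constraint concrete, here is a minimal sketch (not the authors' exact estimator) that computes ordinary sample PWMs from a complete sample and a "partial" variant in which right-censored observations contribute zero; the order-statistic weights are the standard unbiased ones, while the zeroing convention and the Weibull test data are assumptions.

```python
import numpy as np

def sample_pwm(x, r):
    """Sample probability weighted moment b_r ~ E[X F(X)^r],
    using the standard unbiased order-statistic estimator."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    # weight of the i-th order statistic: product of (i-1-k)/(n-1-k)
    w = np.ones(n)
    for k in range(r):
        w *= (np.arange(1, n + 1) - 1 - k) / (n - 1 - k)
    return np.mean(w * x)

def partial_pwm(x, censor, r):
    """Partial PWM: same weights, but observations at or above the
    right-censoring threshold contribute zero (illustrative convention)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    w = np.ones(n)
    for k in range(r):
        w *= (np.arange(1, n + 1) - 1 - k) / (n - 1 - k)
    xc = np.where(x < censor, x, 0.0)   # zero out the censored part
    return np.mean(w * xc)

rng = np.random.default_rng(1)
data = rng.weibull(1.5, size=200) * 10.0
print([round(sample_pwm(data, r), 3) for r in range(3)])
print([round(partial_pwm(data, 15.0, r), 3) for r in range(3)])
```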

2.
In this article, a new generalization of the inverse Lindley distribution is introduced based on the Marshall-Olkin family of distributions. We call the new distribution the generalized Marshall-Olkin inverse Lindley distribution; it offers more flexibility for modeling lifetime data. The new distribution includes the inverse Lindley and the Marshall-Olkin inverse Lindley as special cases. Essential properties of the generalized Marshall-Olkin inverse Lindley distribution are discussed and investigated, including the quantile function, ordinary moments, incomplete moments, moments of residual life and stochastic ordering. Maximum likelihood estimation is considered under complete samples, Type-I censoring and Type-II censoring. Maximum likelihood estimators as well as approximate confidence intervals for the population parameters are discussed. A comprehensive simulation study is carried out to assess the performance of the estimates in terms of their biases and mean square errors. The utility of the generalized Marshall-Olkin inverse Lindley model is illustrated by means of two real data sets. The results show that the generalized Marshall-Olkin inverse Lindley model can produce better fits than the power Lindley, extended Lindley, alpha power transmuted Lindley, alpha power extended exponential and Lindley distributions.
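As a hedged illustration of the Marshall-Olkin construction, the sketch below implements the plain two-parameter Marshall-Olkin inverse Lindley CDF (not the paper's generalized three-parameter version) and fits it by maximum likelihood; the synthetic data and the numerical differentiation of the CDF are conveniences chosen here, not the paper's approach.

```python
import numpy as np
from scipy.optimize import minimize

def invlindley_cdf(x, theta):
    """CDF of the inverse Lindley distribution (x > 0)."""
    return (1.0 + theta / ((1.0 + theta) * x)) * np.exp(-theta / x)

def mo_invlindley_cdf(x, alpha, theta):
    """Marshall-Olkin transform of the inverse Lindley CDF:
    G(x) = F(x) / (1 - (1 - alpha) * (1 - F(x)))."""
    F = invlindley_cdf(x, theta)
    return F / (1.0 - (1.0 - alpha) * (1.0 - F))

def neg_loglik(params, x):
    """Negative log-likelihood via a numerical derivative of the CDF."""
    alpha, theta = params
    if alpha <= 0 or theta <= 0:
        return np.inf
    h = 1e-6
    pdf = (mo_invlindley_cdf(x + h, alpha, theta)
           - mo_invlindley_cdf(x - h, alpha, theta)) / (2 * h)
    return -np.sum(np.log(np.maximum(pdf, 1e-300)))

rng = np.random.default_rng(7)
x = 1.0 / rng.gamma(2.0, 0.5, size=100)   # placeholder positive data
fit = minimize(neg_loglik, x0=[1.0, 1.0], args=(x,), method="Nelder-Mead")
print("fitted (alpha, theta):", fit.x)
```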

3.
This paper presents a model to estimate the lifetime of degrading infrastructure systems subject to shocks, based on the family of phase-type (PH) distributions. In particular, the paper focuses on damage accumulation when both the inter-arrival times of shocks and their sizes are random. PH distributions can approximate any probability distribution with positive support; furthermore, their matrix-geometric properties make it possible to handle problems involving the calculation of convolutions (e.g., sums of shock sizes). The proposed PH shock model relaxes the identically-distributed assumption for the inter-arrival times and/or shock sizes. Moreover, the model provides easy-to-evaluate expressions for important reliability quantities such as the density function and moments of the lifetime, and the mean and moments of the cumulative shock deterioration at any time. To fit data or theoretical distributions to PH form, the paper compares and discusses two PH fitting algorithms, the Moment Matching (MM) and Expectation Maximization (EM) methods, in terms of accuracy, computational efficiency and the information available about the random variables to be fitted. It then provides an algorithm for the reliability estimation of infrastructure along with a study of its accuracy and efficiency; the results show acceptable execution times for most practical applications. Finally, the use of PH distributions to handle degradation is illustrated with several examples of engineering interest, such as deterioration due to crack growth, corrosion and aftershock sequences.
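A minimal sketch of the PH machinery the paper builds on: given an initial probability vector alpha and sub-generator matrix T (the values below are placeholders), the lifetime density and raw moments follow from standard matrix formulas.

```python
import numpy as np
from math import factorial
from scipy.linalg import expm

# A continuous phase-type distribution PH(alpha, T): alpha is the initial
# probability row vector over the transient states, T the sub-generator.
alpha = np.array([1.0, 0.0])
T = np.array([[-2.0, 1.5],
              [0.0, -1.0]])
t0 = -T @ np.ones(2)            # exit-rate vector

def ph_pdf(t):
    """Lifetime density f(t) = alpha @ expm(T*t) @ t0."""
    return float(alpha @ expm(T * t) @ t0)

def ph_moment(k):
    """k-th raw moment E[X^k] = k! * alpha @ (-T)^{-k} @ 1."""
    Minv = np.linalg.inv(-T)
    return factorial(k) * float(alpha @ np.linalg.matrix_power(Minv, k) @ np.ones(2))

print(ph_pdf(0.5), ph_moment(1), ph_moment(2))
```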

4.
The maximum entropy method is one of the principal approaches to evaluating measurement uncertainty from a probability distribution. However, the higher-order moments it relies on require large samples of measurement data, whereas measurements in calibration and testing laboratories typically involve small samples, so maximum entropy evaluation of small-sample measurement uncertainty lacks reliability. This paper proposes a maximum entropy uncertainty evaluation method that uses the quantile function and probability weighted moments as constraints, reducing the moment computations from higher order to first order. A genetic algorithm is used to solve for the probability distribution, and the Bootstrap distribution is used to estimate the expanded uncertainty and coverage interval, which resolves the complicated computations caused by the asymmetry of the distribution in quantile interval estimation.
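A minimal sketch of just the Bootstrap step mentioned above, assuming a percentile bootstrap over resampled means; the small sample of "measurements" is invented, and the MaxEnt/genetic-algorithm fit itself is not reproduced here.

```python
import numpy as np

def bootstrap_coverage_interval(x, p=0.95, n_boot=2000, seed=0):
    """Percentile-bootstrap estimate of a coverage interval for a
    small sample (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    lo, hi = (1 - p) / 2, 1 - (1 - p) / 2
    means = np.empty(n_boot)
    for b in range(n_boot):
        means[b] = rng.choice(x, size=len(x), replace=True).mean()
    return means.mean(), np.quantile(means, [lo, hi])

# invented small sample of repeated measurements
x = np.array([10.02, 10.05, 9.98, 10.01, 10.04, 9.99, 10.03])
mean, (low, high) = bootstrap_coverage_interval(x)
print(f"estimate {mean:.4f}, 95% interval [{low:.4f}, {high:.4f}]")
```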

5.
Structural reliability analysis based on the Legendre orthogonal polynomial approximation method
A Legendre orthogonal polynomial approximation method for structural reliability analysis is proposed. Based on the principles of numerical approximation, the family of Legendre orthogonal functions is used as a basis, and the higher-order moment information of the performance function is exploited to compute an approximate expression for the probability density function of the performance function; the failure probability of the structure is then calculated from the general expression for engineering structural reliability. Numerical tests show that the method approximates various classical theoretical distribution curves well (six classical distributions, including the normal and exponential distributions). Worked examples of the failure probability of structural members are given and compared with several other common methods, further demonstrating the correctness and practicality of the Legendre orthogonal polynomial numerical approximation method in structural reliability analysis.
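The following sketch illustrates the core idea under simplifying assumptions: a density supported on [-1, 1] is expanded in Legendre polynomials with coefficients (2k+1)/2 * E[P_k(X)], the expectations being estimated here from a toy sample rather than from the performance function's moments, and a tail (failure) probability is read off by integrating the series.

```python
import numpy as np
from numpy.polynomial import legendre as L

# Expand a density on [-1, 1] as f(x) ~ sum_k (2k+1)/2 * E[P_k(X)] * P_k(x).
rng = np.random.default_rng(3)
g = rng.normal(0.0, 0.25, size=20_000)
x = g[np.abs(g) < 1.0]                     # toy sample clipped to [-1, 1]

order = 8
coef = np.zeros(order + 1)
for k in range(order + 1):
    ck = np.zeros(k + 1); ck[k] = 1.0      # coefficient vector selecting P_k
    coef[k] = (2 * k + 1) / 2.0 * np.mean(L.legval(x, ck))

grid = np.linspace(-1.0, 1.0, 5)
print("approx pdf on grid:", np.round(L.legval(grid, coef), 3))

# Failure probability P(X > b) by integrating the series:
b = 0.5
anti = L.legint(coef)                      # antiderivative coefficients
print("P(X > 0.5) ~", round(L.legval(1.0, anti) - L.legval(b, anti), 4))
```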

6.
The formulation of a probability-stress-life (P-S-N) curve is a necessary step beyond the basic S-N relation when dealing with reliability. This paper presents a model, relevant to materials that exhibit a fatigue limit, which treats the number of cycles to failure and the occurrence of failure itself as statistically independent events, described by different distributions and/or different degrees of scatter. Combining these two as a parallel system leads to the proposed model. In the case where the S-N relation is Basquin's law, formulations of the probability density function, cumulative distribution function, quantiles, and parameter and quantile confidence intervals are presented in a procedure that accommodates practically any testing strategy. The result is a flexible model combined with tools that deliver a wide range of information needed in the design phase. Finally, an extension to include static strength, and the applicability of the model to fatigue crack growth and defect-based fatigue approaches, are presented.
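A hedged numerical sketch of the parallel-system idea: failure by N cycles requires both that the specimen has a finite life (applied stress above its random fatigue limit) and that its cycles-to-failure do not exceed N, so under independence the two CDFs multiply. All parameter values, and the lognormal/Basquin forms used, are assumptions.

```python
import numpy as np
from scipy.stats import norm

a, b, sigma_N = 12.0, 3.0, 0.25   # assumed Basquin parameters (log10 scale)
s50, sigma_L = 180.0, 0.05        # assumed fatigue-limit median (MPa), scatter

def p_finite_life(s):
    """Probability that stress s exceeds the (random) fatigue limit."""
    return norm.cdf((np.log10(s) - np.log10(s50)) / sigma_L)

def p_life_le(n, s):
    """P(N_f <= n | finite life): lognormal about the Basquin median."""
    mu = a - b * np.log10(s)
    return norm.cdf((np.log10(n) - mu) / sigma_N)

def failure_cdf(n, s):
    """Parallel combination: both events must occur."""
    return p_finite_life(s) * p_life_le(n, s)

for s in (160.0, 200.0, 240.0):
    print(f"s = {s:5.0f} MPa  P(failure by 2e6 cycles) = {failure_cdf(2e6, s):.4f}")
```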

7.
A flexible Weibull extension
We propose a new two-parameter ageing distribution which is a generalization of the Weibull and study its properties. It has a simple failure rate (hazard rate) function. With appropriate choices of parameter values, it can model various ageing classes of life distributions, including IFR, IFRA and modified bathtub (MBT). The ranges of the two parameters are clearly demarcated to separate these classes. It thus provides an alternative to many existing life distributions. Details of parameter estimation are provided via a Weibull-type probability plot and maximum likelihood. We also derive explicit formulas for the turning points of the failure rate function in terms of its parameters. This, combined with the parameter estimation procedures, allows empirical estimation of the turning points for real data sets, which provides useful information for reliability policies.
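Assuming the flexible Weibull form of Bebbington et al. (2007), which the title suggests, with cumulative hazard H(t) = exp(a*t - b/t), the sketch below evaluates the failure rate and locates its turning points numerically; the paper's closed-form turning-point expressions are not reproduced, and the parameter values are illustrative.

```python
import numpy as np
from scipy.optimize import brentq

# Assumed form: H(t) = exp(a*t - b/t), so h(t) = (a + b/t**2) * exp(a*t - b/t)
a, b = 0.05, 2.0   # illustrative values in the modified-bathtub region

def hazard(t):
    return (a + b / t**2) * np.exp(a * t - b / t)

def dhazard(t, h=1e-6):
    """Central-difference derivative of the failure rate."""
    return (hazard(t + h) - hazard(t - h)) / (2 * h)

# Scan for sign changes of h'(t), then refine with a root finder.
grid = np.linspace(0.1, 60.0, 2000)
signs = np.sign(dhazard(grid))
turning = [brentq(dhazard, grid[i], grid[i + 1])
           for i in range(len(grid) - 1) if signs[i] * signs[i + 1] < 0]
print("turning points of h(t):", np.round(turning, 3))
```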

8.
This paper develops a methodology to integrate reliability testing and computational reliability analysis for product development. The presence of information uncertainty, such as statistical uncertainty and modeling error, is incorporated. The integration of testing and computation leads to a more cost-efficient estimation of failure probability and life distribution than the tests-only approach currently followed by industry. A Bayesian procedure is proposed to quantify the modeling uncertainty using random parameters, including the uncertainty in mechanical and statistical model selection and the uncertainty in distribution parameters. An adaptive method is developed to determine the number of tests needed to achieve a desired confidence level in the reliability estimates, by combining prior computational predictions and test data. Two kinds of tests are considered: failure probability estimation and life estimation. The prior distribution and confidence interval of the failure probability in both cases are estimated using computational reliability methods, and are updated using the results of tests performed during the product development phase.
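One common way to realize the Bayesian updating step is a Beta-Binomial model, sketched below: a Beta prior on the failure probability (with invented hyperparameters standing in for the prior computational prediction) is updated with pass/fail test outcomes, and testing can stop once the credible interval is tight enough. This is a hedged sketch, not the paper's full procedure.

```python
from scipy.stats import beta

# Assumed prior: Beta(2, 198), i.e., prior mean failure probability ~ 0.01,
# standing in for a prior computational reliability prediction.
a0, b0 = 2.0, 198.0
n_tests, n_failures = 30, 1          # assumed test campaign results

# Conjugate update with binomial test data
a1, b1 = a0 + n_failures, b0 + n_tests - n_failures
post = beta(a1, b1)
lo, hi = post.interval(0.95)
print("posterior mean p_f:", round(post.mean(), 5))
print("95% credible interval:", (round(lo, 5), round(hi, 5)))

# Adaptive stopping idea: keep testing until the interval half-width
# meets the desired confidence target.
print("half-width:", round((hi - lo) / 2, 5))
```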

9.
Actuaries routinely look for heavy-tailed distributions to model data relevant to business and actuarial risk issues. In this article, we introduce a new class of heavy-tailed distributions useful for modeling data in the financial sciences. A specific sub-model of the suggested family, named the new extended heavy-tailed Weibull distribution, is examined in detail. Some basic characterizations, including the quantile function and raw moments, are derived. The estimates of the unknown parameters of the new model are obtained via the maximum likelihood estimation method. To judge the performance of the maximum likelihood estimators, a detailed simulation analysis is performed. Furthermore, important actuarial measures such as value at risk and tail value at risk are computed. A simulation study based on these actuarial measures is conducted to show empirically that the proposed model is heavy-tailed. The usefulness of the proposed family is illustrated by an application to a heavy-tailed insurance loss data set. The practical application shows that the proposed model is more flexible and efficient than six competing models: (i) the two-parameter Weibull, Lomax and Burr-XII distributions; (ii) the three-parameter Marshall-Olkin Weibull and exponentiated Weibull distributions; and (iii) the well-known four-parameter Kumaraswamy Weibull distribution.
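The two actuarial measures have standard empirical versions, sketched below on invented Pareto-type losses: VaR at level p is the p-quantile of the loss distribution, and TVaR is the mean loss beyond VaR.

```python
import numpy as np

def var_tvar(x, p=0.99):
    """Empirical Value at Risk (p-quantile) and Tail Value at Risk
    (mean loss beyond VaR) -- standard definitions, illustrative code."""
    x = np.sort(np.asarray(x, dtype=float))
    var = np.quantile(x, p)
    tail = x[x > var]
    tvar = tail.mean() if tail.size else var
    return var, tvar

rng = np.random.default_rng(11)
# invented heavy-tailed losses (Pareto-type)
losses = (rng.pareto(2.5, size=10_000) + 1.0) * 100.0
for p in (0.90, 0.95, 0.99):
    v, t = var_tvar(losses, p)
    print(f"p={p:.2f}  VaR={v:9.1f}  TVaR={t:9.1f}")
```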

10.
An overwhelming majority of publications on the Nonhomogeneous Poisson Process (NHPP) consider just two monotonic forms of the NHPP's rate of occurrence of failures (ROCOF): the log-linear model and the power law model. In this paper, we propose to capitalize on the fact that the NHPP's ROCOF formally coincides with the hazard function of the underlying lifetime distribution. Therefore, the variety of parametric forms for the hazard functions of traditional lifetime distributions (lognormal, Gumbel, etc.) can be used as models for the ROCOF of the respective NHPPs. Moreover, the hazard function of a mixture of underlying distributions can be used to model a non-monotonic ROCOF. Parameter estimation of such ROCOF models reduces to estimation of the cumulative hazard function of the underlying lifetime distribution. We use real-world automotive data to illustrate the point.
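A small sketch of the proposal, using a lognormal lifetime model with placeholder parameters: the ROCOF is taken to be the lognormal hazard (pdf divided by survival), and the expected number of failures on (0, t] is then the cumulative hazard -ln S(t).

```python
import numpy as np
from scipy.stats import lognorm

# Placeholder lognormal parameters standing in for a fitted lifetime model.
s, scale = 0.8, 36.0
dist = lognorm(s, scale=scale)

def rocof(t):
    """NHPP intensity taken as the lifetime hazard: pdf / survival."""
    return dist.pdf(t) / dist.sf(t)

def expected_failures(t):
    """E[N(t)] = cumulative hazard H(t) = -ln S(t)."""
    return -np.log(dist.sf(t))

for t in (6.0, 12.0, 24.0, 48.0):
    print(f"t={t:5.1f}  rocof={rocof(t):.4f}  E[N(t)]={expected_failures(t):.3f}")
```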

11.
Lifetime distributions with bathtub-shaped hazard rate functions, together with censoring schemes, are used widely in life testing and reliability engineering. This paper develops a new approach for estimating the parameters of an important two-parameter lifetime model with a bathtub-shaped hazard rate function under the assumption that the sample is modified progressively hybrid censored. One of the most frequently used methodologies, maximum likelihood (ML) estimation, is used for estimating the unknown parameters. Because the estimators cannot be obtained in closed form, the estimates are computed with the popular Newton-Raphson algorithm. It is well known that the convergence of the Newton-Raphson algorithm is affected by the initial point. Therefore, a new Newton-Raphson algorithm with an adaptive initial point chosen within the exact joint confidence region is suggested for computing the ML estimates. Extensive numerical simulations show that the proposed algorithm converges in all cases and is effective. Finally, a real-world data set from engineering is analysed to illustrate the application of the proposed method.
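For flavor, here is a hedged Newton-Raphson MLE sketch for the two-parameter bathtub model of Chen (2000), F(t) = 1 - exp(lam*(1 - exp(t**b))), but for a complete sample: the paper's progressively hybrid censored likelihood and its adaptive initial point are not reproduced, and the derivatives are taken numerically.

```python
import numpy as np

def loglik(params, t):
    """Complete-sample log-likelihood of Chen's (2000) bathtub model."""
    lam, b = params
    if lam <= 0 or b <= 0:
        return -np.inf
    return np.sum(np.log(lam) + np.log(b) + (b - 1) * np.log(t)
                  + t**b + lam * (1.0 - np.exp(t**b)))

def newton_raphson(t, x0, tol=1e-8, max_iter=100):
    """Newton-Raphson on the score, with numerical gradient/Hessian
    and a crude fixed starting point (the paper's adaptive choice is
    its contribution and is not reproduced here)."""
    x = np.asarray(x0, dtype=float)
    h = 1e-5
    for _ in range(max_iter):
        g = np.zeros(2)
        H = np.zeros((2, 2))
        for i in range(2):
            ei = np.zeros(2); ei[i] = h
            g[i] = (loglik(x + ei, t) - loglik(x - ei, t)) / (2 * h)
            for j in range(2):
                ej = np.zeros(2); ej[j] = h
                H[i, j] = (loglik(x + ei + ej, t) - loglik(x + ei - ej, t)
                           - loglik(x - ei + ej, t) + loglik(x - ei - ej, t)) / (4 * h * h)
        step = np.linalg.solve(H, g)
        while not np.isfinite(loglik(x - step, t)):   # keep parameters valid
            step *= 0.5
        x = x - step
        if np.max(np.abs(step)) < tol:
            return x
    return x

# synthetic complete sample via the inverse CDF
rng = np.random.default_rng(5)
u = rng.uniform(size=150)
lam_true, b_true = 0.5, 0.8
t = (np.log(1.0 - np.log(1.0 - u) / lam_true)) ** (1.0 / b_true)
print("MLE (lam, b):", newton_raphson(t, x0=[0.6, 0.9]))
```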

12.
The principle of minimum cross-entropy provides a systematic approach to deriving the posterior distribution of a random variable given a prior and additional information in terms of its product moments. This approach can be extended to derive the quantile function directly by using probability weighted moments (PWMs) as constraints in the cross-entropy minimization, as shown in a previous study [Pandey MD. Extreme quantile estimation using order statistics with minimum cross-entropy principle. Probabilistic Engineering Mechanics 2001;16(1):31-42]. The objective of the present paper is to extend and improve the previous method by using fractional probability weighted moments (FPWMs) in place of conventional integer-order PWMs. A new and general estimation method is proposed in which Monte Carlo simulation and optimization algorithms are combined to estimate the FPWMs that subsequently lead to the best-fit quantile function. The numerical examples presented in the paper show a substantial improvement in accuracy from the proposed method over the conventional approach.
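A common plotting-position estimator makes the FPWM idea concrete; the specific position formula below is one standard choice, not necessarily the paper's, and the Gumbel test data are invented.

```python
import numpy as np

def fractional_pwm(x, r):
    """Plotting-position estimator of the fractional PWM
    beta_r = E[X * F(X)**r] with real-valued order r >= 0."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    p = (np.arange(1, n + 1) - 0.35) / n      # one common plotting position
    return np.mean(x * p**r)

rng = np.random.default_rng(2)
data = rng.gumbel(loc=10.0, scale=2.0, size=60)
# Integer orders reproduce the usual PWMs; fractional orders supply
# extra, freely chosen constraints for the CrossEnt fit.
for r in (0.0, 0.5, 1.0, 1.75, 2.0):
    print(r, round(fractional_pwm(data, r), 4))
```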

13.
A new three-parameter probability distribution called the omega probability distribution is introduced, and its connection with the Weibull distribution is discussed. We show that the asymptotic omega distribution is just the Weibull distribution, and point out that the mathematical properties of the new distribution allow bathtub-shaped hazard functions to be modeled in two ways. On the one hand, we demonstrate that with special parameter settings the omega hazard curve is itself bathtub shaped, so it can describe a complete bathtub-shaped hazard curve. On the other hand, the omega probability distribution can be applied in the same way as the Weibull distribution to model each phase of a bathtub-shaped hazard function separately. We also propose two approaches for the practical statistical estimation of the distribution parameters. From a practical perspective, two properties of the new distribution are notable, namely its simplicity and its flexibility. Moreover, both the cumulative distribution function and the hazard function are composed of power functions which, on the basis of results from analyses of real failure data, can be applied quite effectively in modeling bathtub-shaped hazard curves.

14.
Accelerated Life Testing (ALT) has long been used in several fields to obtain information, in a much shorter time, on the reliability of product components and materials under operating conditions. One of the main purposes of ALT is to estimate the failure time functions and reliability performance under normal conditions. This paper concentrates on estimation procedures under ALT and on how to select the estimation method that gives the most accurate estimates of the reliability function. For this purpose, different estimation methods are used: maximum likelihood (ML), least squares (LS), weighted LS, and probability weighted moments. Moreover, the reliability function under usual conditions is predicted. The estimation procedures are applied to the family of exponentiated distributions in general, and to the exponentiated inverted Weibull (EIW) as a special case. Numerical analysis, including simulated data and a real-life data set, is conducted to compare the performance of these four methods. The ML method is found to give the best results among the estimation methods. Finally, a comparison between the EIW and Inverted Weibull (IW) distributions based on a real-life data set is made using a likelihood ratio test. The EIW distribution is observed to provide a better fit than the IW under ALT.

15.
A numerical method was developed for estimating the shapes of unknown distributions of analytical data and for estimating the expected values of censored data points. The method is based conceptually on the normal probability plot. Data are ordered and then transformed by using a power function to achieve approximate linearity with respect to a computed normal cumulative probability scale. The exponent used in the power transformation is an index of the distribution shape, which covers a continuum on which normality is defined as d = 1 and log normality is defined as d = 0. Expected transformed values of censored points are computed from a straight line fitted to the transformed, accepted data, and these are then back-transformed to the original distribution. The method gives improved characterization of analytical data distributions, particularly in the distribution extremities. It also avoids the biases from improper handling of censored data arising from measurements near the analytical detection limit. Illustrative applications were computed for atmospheric SO2 data and for mineral concentrations in hamburgers.
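An illustrative re-implementation of the described procedure, under assumptions: the Box-Cox family plays the role of the power transformation (d = 1 near-normal, d = 0 lognormal), d is chosen to maximize linearity against Blom normal scores, and the censored (lowest-ranked) points are filled in from the fitted line and back-transformed.

```python
import numpy as np
from scipy.stats import norm, pearsonr

def boxcox(x, d):
    return np.log(x) if abs(d) < 1e-9 else (x**d - 1.0) / d

def inv_boxcox(y, d):
    return np.exp(y) if abs(d) < 1e-9 else (1.0 + d * y) ** (1.0 / d)

def censored_fill(x_obs, n_censored, d_grid=np.linspace(-0.5, 1.5, 81)):
    """Fit the shape index d on the accepted data, then estimate the
    expected values of the censored (lowest-ranked) points."""
    x_obs = np.sort(np.asarray(x_obs, dtype=float))
    n = len(x_obs) + n_censored
    # Blom normal scores for all n ranks; censored points sit lowest
    scores = norm.ppf((np.arange(1, n + 1) - 0.375) / (n + 0.25))
    s_obs, s_cen = scores[n_censored:], scores[:n_censored]
    # pick d that makes the transformed accepted data most linear
    best_d = max(d_grid, key=lambda d: pearsonr(s_obs, boxcox(x_obs, d))[0])
    slope, intercept = np.polyfit(s_obs, boxcox(x_obs, best_d), 1)
    x_cen = inv_boxcox(slope * s_cen + intercept, best_d)
    return best_d, x_cen

rng = np.random.default_rng(4)
full = rng.lognormal(mean=1.0, sigma=0.6, size=80)   # invented data
dl = np.quantile(full, 0.15)                         # detection limit
d, filled = censored_fill(full[full >= dl], n_censored=int((full < dl).sum()))
print("shape index d:", round(d, 2))
print("expected values of censored points:", np.round(filled, 3))
```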

16.
Quantile regression, as a generalization of median regression, has been widely used in statistical modeling. To allow for the analysis of complex data situations, several flexible regression models have been introduced. Among these are the varying coefficient models, which differ from a classical linear regression model in that the regression coefficients are no longer constant but are functions that vary with the value taken by another variable, such as time. In this paper, we study quantile regression in varying coefficient models for longitudinal data. The quantile function is modeled as a function of the covariates, and the main task is to estimate the unknown regression coefficient functions. We approximate each coefficient function by means of P-splines. Theoretical properties of the estimators, such as the rate of convergence and the asymptotic distribution, are established. The estimation methodology requires solving an optimization problem that also involves a smoothing parameter. For a special case the optimization problem can be transformed into a linear programming problem, for which a Frisch-Newton interior point method is then used, leading to a computationally fast and efficient procedure. Several data-driven choices of the smoothing parameters are briefly discussed, and their performance is illustrated in a simulation study. A real data analysis demonstrates the use of the developed method.
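A minimal sketch of the modeling idea only: the varying coefficient is represented in a B-spline basis (the P-spline penalty and the longitudinal correlation structure are omitted) and the median is fitted with off-the-shelf quantile regression; the data, basis dimension, and the statsmodels/patsy route are all choices made here, not the paper's estimator.

```python
import numpy as np
import statsmodels.api as sm
from patsy import dmatrix, build_design_matrices

# Model: y = a(t) + beta(t) * z, with a(t), beta(t) in a B-spline basis.
rng = np.random.default_rng(6)
n = 400
t = rng.uniform(0.0, 1.0, n)                 # e.g., time
z = rng.normal(size=n)                       # covariate with varying effect
y = 1.0 + np.sin(2 * np.pi * t) * z + rng.normal(scale=0.3, size=n)

basis = dmatrix("bs(t, df=6, degree=3, include_intercept=True) - 1", {"t": t})
B = np.asarray(basis)
X = np.hstack([B, B * z[:, None]])           # [a(t) block | beta(t) block]
fit = sm.QuantReg(y, X).fit(q=0.5)           # median regression

# Recover the estimated coefficient function on a grid, reusing the basis.
tg = np.linspace(0.05, 0.95, 5)
Bg = np.asarray(build_design_matrices([basis.design_info], {"t": tg})[0])
print("beta(t) estimate:", np.round(Bg @ fit.params[6:], 2))
print("beta(t) truth:   ", np.round(np.sin(2 * np.pi * tg), 2))
```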

17.
The results of this paper show that neural networks can be a very promising tool for reliability data analysis. Identifying the underlying distribution of a set of failure data and estimating its distribution parameters are necessary tasks in reliability engineering studies. In general, either a chi-square or a non-parametric goodness-of-fit test is used in the distribution identification process, which includes the pattern interpretation of the failure data histograms. However, these procedures can guarantee neither accurate distribution identification nor robust parameter estimation when only small data samples are available. Fundamentally, the graphical approach to distribution fitting is a pattern recognition problem, and parameter estimation is a classification problem, both areas where neural networks have proved to be a suitable tool. This paper presents an exploratory study of a neural network approach, validated by simulated experiments, for analysing small-sample reliability data. A counter-propagation network is used to classify normal, uniform, exponential and Weibull distributions. A back-propagation network is used for parameter estimation of the two-parameter Weibull distribution.
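As a modern stand-in for the counter-propagation network (a plain MLP is substituted here, explicitly not the paper's architecture), the sketch below trains a classifier to identify which of the four families a small sample came from, feeding it the sorted, standardized sample as the input pattern.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(8)
N, n = 4000, 30   # training samples per run, sample size

def draw(label):
    """Simulate one small sample from the family indexed by label."""
    if label == 0:
        s = rng.normal(size=n)
    elif label == 1:
        s = rng.uniform(size=n)
    elif label == 2:
        s = rng.exponential(size=n)
    else:
        s = rng.weibull(0.8, size=n)
    s = np.sort(s)
    return (s - s.mean()) / (s.std() + 1e-12)   # location/scale invariance

labels = rng.integers(0, 4, size=N)
X = np.array([draw(c) for c in labels])

clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=400, random_state=0)
clf.fit(X, labels)

test_labels = rng.integers(0, 4, size=500)
Xt = np.array([draw(c) for c in test_labels])
print("test accuracy:", clf.score(Xt, test_labels))
```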

18.
An efficient strategy to approximate the failure probability function in structural reliability problems is proposed. The failure probability function (FPF) is defined as the failure probability of the structure expressed as a function of the design parameters, which in this study are taken to be distribution parameters of random variables representing uncertain model quantities. Determining the FPF is usually numerically demanding, since repeated reliability analyses are required. The proposed strategy is based on the concept of augmented reliability analysis, which requires only a single run of a simulation-based reliability method. This paper introduces a new sample regeneration algorithm that makes it possible to generate the required failure samples of the design parameters without any additional evaluation of the structural response. In this way, efficiency is further improved while high accuracy in the estimation of the FPF is ensured. To illustrate the efficiency and effectiveness of the method, case studies involving a turbine disk and an aircraft inner flap are included.
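A toy sketch of the augmented-reliability idea behind FPF estimation, with an invented limit state: the design parameter gets an instrumental distribution, a single Monte Carlo run is performed over (theta, X), and Bayes' rule p_F(theta) = P(F) * q(theta|F) / q(theta) recovers the FPF, with q(theta|F) estimated by a KDE over the failure samples. The paper's regeneration algorithm is considerably more refined than this.

```python
import numpy as np
from scipy.stats import norm, gaussian_kde

rng = np.random.default_rng(9)
N = 200_000
theta = rng.uniform(1.0, 4.0, size=N)   # instrumental density q(theta) = 1/3
x = rng.normal(size=N)                  # uncertain model quantity
fail = x > theta                        # invented limit state g = theta - x <= 0

pF_avg = fail.mean()                    # P(F) under the augmented model
kde = gaussian_kde(theta[fail])         # density of theta given failure

grid = np.linspace(1.2, 3.0, 7)
fpf_hat = pF_avg * kde(grid) / (1.0 / 3.0)   # Bayes' rule
fpf_true = norm.sf(grid)                     # exact FPF for this toy case
for th, est, tru in zip(grid, fpf_hat, fpf_true):
    print(f"theta={th:.2f}  FPF~{est:.2e}  exact={tru:.2e}")
```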

19.
The paper presents a general approach to the estimation of the quantile function of a non-negative random variable using the principle of minimum cross-entropy (CrossEnt), subject to constraints specified in terms of expectations of order statistics estimated from observed data. Traditionally, CrossEnt is used for estimating the probability density function under specified moment constraints. In such analyses, consideration of higher-order moments is important for accurate modelling of the distribution tail. Since higher-order (>2) moment estimates from a small sample of data tend to be highly biased and uncertain, the use of CrossEnt quantile estimates in extreme value analysis has been fairly limited. The present paper attempts to overcome this problem via the use of probability weighted moments (PWMs), which are essentially expectations of order statistics. In contrast with ordinary statistical moments, higher-order PWMs can be accurately estimated from small samples. By interpreting a PWM as a moment of the quantile function, the paper derives an analytical form of the quantile function using the CrossEnt principle. Monte Carlo simulations are performed to assess the accuracy of CrossEnt quantile estimates obtained from small samples.

20.
Rainfall is one of the important hazard loads in municipal and civil engineering, and its probabilistic structure plays a crucial role in the analysis of the corresponding stochastic systems. However, commonly used probability models often fail to describe the probabilistic structure of rainfall well. To address this, starting from the generalized density evolution equation of a linear virtual stochastic process and its formal analytical solution, a probability density transformation solution that can be applied directly to the analysis of static stochastic systems is derived, and a δ-sequence approximation algorithm is developed. If random data are regarded as a self-mapping system, the above method can conveniently be used to analyze the probabilistic structure of random data. Using this method, the paper analyzes the probabilistic structure of rainfall in Chongqing and obtains an approximate solution for the probability density function of the maximum daily rainfall; the accuracy and credibility of the approximate solution are verified against equal-interval frequency histograms, equal-frequency histograms, and the empirical cumulative distribution function. Unfortunately, the true probability density function is too complicated for practical use. The paper therefore proposes a linear combination model for describing complex probabilistic structures and, by trial calculation, gives a practical probability model for the maximum daily rainfall, laying a foundation for subsequent stochastic system analysis.
