Similar Documents
20 similar documents found (search time: 281 ms)
1.
Particulate matter is collected on a sampling medium, and the particles of interest are then weighed as a group. Weights are observed for a number of replicate samples. These observed weights may include a background contribution from particles present on the sampling medium prior to its use. Given these data, plus similar data to establish the background, the problem is to estimate the average number of particles per sample and their weight distribution. The estimation is accomplished by equating sample moments to population moments. The first four moments of the population are found for an arbitrary weight distribution function possessing finite first four moments. Specific estimates are found in the event the weight distribution is exponential in form, and approximate sampling variances of these estimates are derived. A numerical example is included.
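For the exponential special case mentioned above, the idea of equating sample moments to population moments can be sketched with a compound-Poisson model: assuming (for illustration, and ignoring the background term) that the particle count per sample is Poisson and individual weights are i.i.d. exponential, the first two moments already give closed-form estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate replicate total weights: N ~ Poisson(lam), weights ~ Exponential(mu)
lam_true, mu_true = 10.0, 2.0
counts = rng.poisson(lam_true, size=20000)
totals = np.array([rng.exponential(mu_true, n).sum() for n in counts])

# Compound-Poisson moments: E[W] = lam*mu, Var[W] = lam*E[X^2] = 2*lam*mu^2
m = totals.mean()
v = totals.var(ddof=1)
mu_hat = v / (2.0 * m)    # mean particle weight: Var / (2 * mean)
lam_hat = m / mu_hat      # mean particle count:  mean / mu
```

With four moments available, higher-order versions of the same equations identify more general weight distributions, as the abstract describes.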

2.
Much research effort has recently been focused on methods to deal with non‐normal populations. While for weak non‐normality the normal approximation is a useful choice (as in Shewhart control charts), moderate to strong skewness requires alternative approaches. In this short communication, we discuss the properties required of such approaches, and revisit two new ones. The first approach, for attributes data, assumes that the mean, the variance and the skewness measure can be calculated. These are then incorporated in a modified normal approximation, which preserves these moments. Extension of the Shewhart chart to skewed attribute distributions (e.g. the geometric distribution) is thus achieved. The other approach, for variables data, fits a member of a four‐parameter family of distributions. However, unlike similar approaches, sample estimates of at most the second degree are employed in the fitting procedure. This has been shown to result in a better representation of the underlying (unknown) distribution than methods based on four‐moment matching. Some numerical comparisons are given. Copyright © 2004 John Wiley & Sons, Ltd.
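The moment-preserving idea can be illustrated with a first-order Cornish-Fisher quantile, which adjusts a normal quantile to preserve the mean, variance and skewness; applied here to the geometric distribution cited above (the exact construction in the paper may differ from this sketch).

```python
import numpy as np
from scipy import stats

p = 0.05                                # geometric: trials until first event
mu = 1.0 / p
sigma = np.sqrt((1 - p) / p**2)
gamma = (2 - p) / np.sqrt(1 - p)        # skewness of the geometric distribution

def cf_quantile(z):
    # First-order Cornish-Fisher: preserves mean, variance, skewness
    return mu + sigma * (z + gamma * (z**2 - 1) / 6.0)

z = 3.0                                  # three-sigma-equivalent limit
ucl_normal = mu + z * sigma              # plain Shewhart upper limit
ucl_skewed = cf_quantile(z)              # skewness-corrected limit
exact = stats.geom.ppf(stats.norm.cdf(z), p)   # exact geometric quantile
```

For this strongly skewed case the corrected limit lands close to the exact quantile, while the plain three-sigma limit falls far short of it.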

3.
This paper proposes an analytical solution for fast tolerance analysis of the assembly of components with a mean shift or drift in the form of a doubly-truncated normal distribution. The assembly of components with a mean shift or drift in the form of a uniform distribution (the Gladman model) can be calculated by this method as well, since the uniform distribution is a special case of the doubly-truncated normal distribution. Integration formulae for the first four moments of the truncated normal distribution are first derived, and the first four moments of the resultant tolerance distribution are then calculated. As a result, the resultant tolerance specification is represented as a function of the standard deviation and the coefficient of kurtosis of the resultant distribution. Based on this method, the calculated resultant tolerance specification is more accurate than that predicted by Gladman's model or the simplified truncated normal model; the difference between this model and the Monte Carlo method with 1,000,000 simulation samples is less than 0.5%. The merit of the proposed method is that it is fast and accurate, which is crucial for engineering applications in tolerance analysis.
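The first four moments of a doubly-truncated normal component can be cross-checked numerically with scipy, which implements the truncated normal directly (this is a numerical sketch, not the paper's closed-form integration formulae).

```python
import numpy as np
from scipy import stats

# Doubly-truncated normal: underlying N(mu, sigma^2) restricted to [lo, hi]
mu, sigma, lo, hi = 0.5, 1.0, -1.0, 2.0
a, b = (lo - mu) / sigma, (hi - mu) / sigma     # standardized truncation bounds

# Analytical mean, variance, skewness, excess kurtosis
mean, var, skew, kurt = stats.truncnorm.stats(a, b, loc=mu, scale=sigma,
                                              moments='mvsk')

# Monte Carlo cross-check of the analytical moments
x = stats.truncnorm.rvs(a, b, loc=mu, scale=sigma, size=200000, random_state=1)
```

For an assembly of independent components, the cumulants of the sum are the sums of the component cumulants, which is how per-component moments propagate to the resultant tolerance distribution.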

4.
It is known that a probability distribution satisfies the Maximum Entropy Principle (MEP) if the available data consist of four moments of the probability density function. Two problems are typically associated with the use of the MEP: the definition of the range of acceptable values for the moments Mi, and the evaluation of the coefficients aj. Both problems have already been accurately resolved by analytical procedures when the first two moments of the distribution are known.

In this work, the analytical solution in the case of four known moments is provided, and a criterion for addressing the general case (whatever the number of known moments) is expounded. The first four moments are expressed in nondimensional form through the expectation and the coefficients of variation, skewness and kurtosis. The range of their acceptable values is obtained from the analytical properties of the differential equations which govern the problem and from the Schwarz inequality.
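The acceptability of a moment set can also be checked through the classical Hankel-matrix condition for the moment problem, of which the Schwarz inequality mentioned above is the simplest minor (a generic feasibility check, not the paper's analytical solution).

```python
import numpy as np

def moments_feasible(m1, m2, m3, m4):
    """Check whether (1, m1, m2, m3, m4) can be the raw moments of some
    distribution on the real line: the Hankel matrix must be positive
    semidefinite (the 2x2 minor m2 >= m1^2 is the Schwarz inequality)."""
    H = np.array([[1.0, m1, m2],
                  [m1,  m2, m3],
                  [m2,  m3, m4]])
    return bool(np.all(np.linalg.eigvalsh(H) >= -1e-12))

# Standard normal raw moments (0, 1, 0, 3) are feasible ...
ok = moments_feasible(0.0, 1.0, 0.0, 3.0)
# ... but m4 below m2^2 violates the Hankel condition
bad = moments_feasible(0.0, 1.0, 0.0, 0.5)
```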


5.
Despite many advances in the field of computational system reliability analysis, estimating the joint probability distribution of correlated non-normal state variables on the basis of incomplete statistical data poses great challenges for engineers. To avoid multidimensional integration, system reliability estimation usually requires the calculation of marginal failure probabilities and the joint failure probability. This article proposes an integrated approach for estimating system reliability based on the high-moment method, saddle-point approximation, and copulas. First, statistical moment estimation based on stochastic perturbation theory is presented. Then, by constructing a concise cumulant generating function (CGF) for the state variable from its first four statistical moments, a fourth-moment saddle-point approximation method is established for component reliability estimation. Second, copula theory is briefly introduced and widely used two-dimensional copulas are presented; the best-fitting copula for estimating the probability of system failure is selected according to the Akaike Information Criterion (AIC). Finally, the derived method is applied to three numerical examples for comprehensive validation.
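The copula step can be sketched with a Gaussian copula (one candidate; the paper selects among copula families by AIC): the joint failure probability of two correlated components follows from the bivariate normal CDF evaluated at the transformed marginal failure probabilities.

```python
import numpy as np
from scipy import stats

# Marginal failure probabilities of two correlated components
p1, p2, rho = 1e-3, 2e-3, 0.5

# Gaussian copula: C(p1, p2) = Phi2(Phi^-1(p1), Phi^-1(p2); rho)
z1, z2 = stats.norm.ppf(p1), stats.norm.ppf(p2)
cov = [[1.0, rho], [rho, 1.0]]
p_both = stats.multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf([z1, z2])

p_series = p1 + p2 - p_both    # series system: either component fails
p_parallel = p_both            # parallel system: both must fail
```

With positive dependence the joint failure probability lies between the independence value p1*p2 and the comonotone bound min(p1, p2), so ignoring correlation misstates the system risk.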

6.
The process capability index (PCI) is a quality-control statistic used mostly in the manufacturing industry to assess the capability of a monitored process. It is of great significance to quality control engineers, as it quantifies the relation between the actual performance of the process and the preset specifications of the product. Most traditional PCIs perform well when the process follows normal behaviour; however, using these traditional indices to evaluate a non‐normally distributed process often leads to inaccurate results. In this article, we consider a new PCI, Cpy, suggested by Maiti et al, which can be used for normal as well as non‐normal random variables. The article addresses different methods of estimating the PCI Cpy, from both frequentist and Bayesian viewpoints, for the generalized Lindley distribution suggested by Nadarajah et al. We briefly describe different frequentist approaches, namely maximum likelihood estimators, least squares and weighted least squares estimators, and maximum product of spacings estimators. Next, we consider Bayes estimation under a squared error loss function using gamma priors for both the shape and scale parameters of the considered model. We use Tierney and Kadane's method as well as a Markov chain Monte Carlo procedure to compute approximate Bayes estimates. In addition, two parametric bootstrap confidence intervals based on frequentist approaches are provided for comparison with highest posterior density credible intervals. Furthermore, a Monte Carlo simulation study has been carried out to compare the performance of the classical and Bayes estimates of Cpy in terms of mean squared errors, along with average widths and coverage probabilities. Finally, two real data sets are analysed for illustrative purposes.
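Maiti et al's Cpy itself is not reproduced here; as an illustration of why normal-theory indices mislead for skewed processes, a percentile-based (Clements-type) index can be contrasted with the usual 6-sigma form. A sketch, assuming lognormal process data for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.lognormal(mean=0.0, sigma=0.4, size=5000)   # skewed process data
LSL, USL = 0.3, 3.0

# Normal-theory index: assumes the 99.73% spread is 6*sigma
cp_normal = (USL - LSL) / (6 * x.std(ddof=1))

# Clements-type percentile index: uses the actual 0.135%/99.865% quantiles,
# valid for non-normal data
q_lo, q_hi = np.quantile(x, [0.00135, 0.99865])
cp_percentile = (USL - LSL) / (q_hi - q_lo)
```

For skewed data the two indices disagree, because the true 99.73% spread of a skewed distribution is not 6 standard deviations.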

7.
In this paper, the use of fractional moments for estimation purposes is discussed. These ideas are illustrated by means of the mixed exponential distribution.

The estimation of the three parameters of the above distribution by the method of moments and by maximum likelihood is investigated numerically in detail. As anticipated, the efficiency of the former method can be greatly increased by using approximately optimal combinations of moments. It is found that the moment method requires only a small amount of calculation compared with the maximum likelihood method, although charts are presented to greatly ease the computational burden of the latter.
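The fractional-moment idea can be sketched for the mixed exponential: since E[X^s] = theta^s * Gamma(1+s) for an exponential with scale theta, the mixture's fractional moments are closed-form, and three sample moments (the orders 0.5, 1, 1.5 below are an arbitrary illustrative choice, not the paper's optimal combination) identify the three parameters via a generic root-finder.

```python
import numpy as np
from scipy.special import gamma as G
from scipy.optimize import fsolve

rng = np.random.default_rng(3)
p, t1, t2 = 0.4, 1.0, 5.0            # mixing weight and the two scales
n = 100000
comp = rng.random(n) < p
x = np.where(comp, rng.exponential(t1, n), rng.exponential(t2, n))

orders = [0.5, 1.0, 1.5]             # fractional moment orders
sample_m = [np.mean(x**s) for s in orders]

def eqs(theta):
    p_, a, b = theta
    # Mixture fractional moment: p*a^s*Gamma(1+s) + (1-p)*b^s*Gamma(1+s)
    return [p_ * a**s * G(1 + s) + (1 - p_) * b**s * G(1 + s) - m
            for s, m in zip(orders, sample_m)]

p_hat, t1_hat, t2_hat = fsolve(eqs, x0=[0.5, 0.8, 4.0])
```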

8.
Based on the extended Huygens–Fresnel integral, second-order moments of the Wigner distribution function of a partially coherent radially polarized beam propagating through atmospheric turbulence are derived. In addition, propagation properties such as the mean-squared beam width, angular width, effective radius of curvature, beam propagation factor and Rayleigh range can be obtained and calculated numerically. It is shown that these propagation properties depend on the spatial correlation length, the refractive-index structure constant and the propagation distance.

9.
The maximum entropy method is one of the main approaches for evaluating measurement uncertainty based on a probability distribution. The higher-order moments it relies on require large-sample measurement data, whereas measurements in calibration and testing laboratories are generally small samples, so uncertainty evaluation by the maximum entropy method lacks reliability in the small-sample case. A maximum-entropy uncertainty evaluation method is proposed that uses the quantile function and probability-weighted moments as constraints, reducing the moment computation from higher order to first order. The probability distribution is solved in combination with a genetic algorithm, and the expanded uncertainty and coverage interval are estimated with the Bootstrap distribution, which resolves the complicated computation caused by the asymmetry of the distribution in quantile interval estimation.
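The Bootstrap step described above can be sketched as follows; the percentile interval below is a generic stand-in for the paper's procedure, with simulated stand-in calibration data.

```python
import numpy as np

rng = np.random.default_rng(4)
obs = rng.normal(10.0, 0.02, size=8)     # small calibration sample (simulated)

B = 5000
boot_means = np.array([rng.choice(obs, size=obs.size, replace=True).mean()
                       for _ in range(B)])

# Percentile-bootstrap 95% coverage interval for the measurand
lo, hi = np.quantile(boot_means, [0.025, 0.975])
u_expanded = (hi - lo) / 2    # half-width, an expanded-uncertainty analogue
```

The bootstrap interval needs no symmetry assumption, which is the point made above about asymmetric distributions.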

10.
This paper is concerned primarily with the method of moments in dissecting a mixture of two normal distributions. In the general case, with two means, two standard deviations, and a proportionality factor to be estimated, the first five sample moments are required, and it becomes necessary to find a particular solution of a ninth-degree polynomial equation originally derived by Karl Pearson [10]. A procedure which circumvents solution of the nonic equation, and thereby considerably reduces the total computational effort otherwise required, is presented. Estimates obtained in the simpler special case in which the two standard deviations are assumed equal are employed as first approximations in an iterative method for simultaneously solving the basic system of moment equations applicable in the more general case in which the two standard deviations are unequal. Conditional maximum likelihood and conditional minimum chi-square estimation, subject to having the first four sample moments equated to the corresponding population moments, are also considered. An illustrative example is included.
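For the equal-standard-deviations special case mentioned above, four moment equations in (p, mu1, mu2, sigma) suffice; the sketch below solves them with a generic root-finder rather than the paper's iterative procedure, using the raw-moment formulas of the normal distribution.

```python
import numpy as np
from scipy.optimize import fsolve

rng = np.random.default_rng(5)
p, mu1, mu2, s = 0.3, 0.0, 3.0, 1.0          # true mixture parameters
n = 200000
z = rng.random(n) < p
x = np.where(z, rng.normal(mu1, s, n), rng.normal(mu2, s, n))
sample_m = [np.mean(x**k) for k in (1, 2, 3, 4)]

def raw_moments(mu, sig):
    # First four raw moments of N(mu, sig^2)
    return [mu,
            mu**2 + sig**2,
            mu**3 + 3 * mu * sig**2,
            mu**4 + 6 * mu**2 * sig**2 + 3 * sig**4]

def eqs(theta):
    p_, m1, m2, sig = theta
    a, b = raw_moments(m1, sig), raw_moments(m2, sig)
    return [p_ * ai + (1 - p_) * bi - mk
            for ai, bi, mk in zip(a, b, sample_m)]

p_hat, mu1_hat, mu2_hat, s_hat = fsolve(eqs, x0=[0.5, -0.5, 2.5, 1.2])
```

Note the label-switching ambiguity: (p, mu1, mu2) and (1-p, mu2, mu1) solve the same equations; sigma is unaffected.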

11.
Improving the inherent reliability of mechanical parts has an impact on the reputation and performance of a company. To estimate the inherent reliability of products more conveniently and economically, a hidden quality cost-production cost (HQC-PC) reliability prediction model is put forward. To estimate the hidden quality cost (HQC) of products more accurately, a quadratic exponential quality loss function model is established, which differs from Taguchi's quadratic quality loss function (QLF) and the modified QLFs in that the growth rates of quality loss on the two sides of the target value are considered. Under the condition that the quality characteristic value obeys a normal distribution, general estimation formulas for the HQC within the tolerance range are obtained considering sampling error, and a numerical model of inherent reliability is established. The effects of different parameters, such as design and production parameters, on the inherent reliability of products are discussed with a practical case. The appropriate process capability index (PCI) is then selected according to the production process, and the relationship between the HQC-PC reliability prediction model and the PCI is derived from the numerical model of inherent reliability. A new analysis method of inherent reliability is thus proposed.

12.
陈凌峰 (Chen Lingfeng). 《计量学报》 (Acta Metrologica Sinica), 2019, 40(2): 347-352
One difference between JJF 1059.1-2012, "Evaluation and Expression of Uncertainty in Measurement", and the GUM is the introduction of the range method into Type A evaluation of standard uncertainty. Assuming the population follows a normal or a uniform distribution, the range-based estimator of the population standard deviation, together with the range coefficients used in practical calculation, can be derived from the distribution function of the sample range. Theoretical analysis shows that although the range-based estimate of the population standard deviation is unbiased, the estimated population variance is biased upward, which inflates the combined standard uncertainty of the final measurement result. Moreover, JJF 1059.1 provides range coefficients only for populations close to normal, which does not cover all cases. By comparison, the population variance estimated with Bessel's formula is always unbiased regardless of the population distribution, and introduces no error of principle into the combined standard uncertainty of the measurement result. Because of this error of principle in probability-statistical terms and its limited applicability, the range method should be used with caution in Type A evaluation of standard uncertainty.
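The bias claim above can be checked by simulation: with the SPC constant d2 ≈ 2.326 for subgroups of five from a normal population, R/d2 is unbiased for sigma itself, yet its square overestimates sigma squared (since E[(R/d2)^2] = sigma^2 + Var(R/d2)), while Bessel's formula is unbiased for the variance. A sketch:

```python
import numpy as np

rng = np.random.default_rng(6)
n, d2 = 5, 2.326              # d2 constant for samples of size 5, normal population
sigma_true = 1.0

reps = 200000
samples = rng.normal(0.0, sigma_true, size=(reps, n))

# Range method: sigma_hat = R / d2 (unbiased for sigma itself)
s_range = (samples.max(axis=1) - samples.min(axis=1)) / d2
# Bessel's formula: s^2 with ddof=1 is unbiased for sigma^2
s2_bessel = samples.var(axis=1, ddof=1)

bias_sigma_range = s_range.mean() - sigma_true          # ~ 0
bias_var_range = (s_range**2).mean() - sigma_true**2    # positive: inflated
bias_var_bessel = s2_bessel.mean() - sigma_true**2      # ~ 0
```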

13.
The maximum entropy principle constrained by probability weighted moments is a useful technique for unbiasedly and efficiently estimating the quantile function of a random variable from a sample of complete observations. However, censored or incomplete data are often encountered in engineering reliability and lifetime distribution analysis. This paper presents a new distribution-free method for estimating the quantile function of a non-negative random variable from a censored sample, based on the principle of partial maximum entropy (MaxEnt), in which partial probability weighted moments (PPWMs) are used as constraints. Numerical results and practical examples presented in the paper confirm the accuracy and efficiency of the proposed partial MaxEnt quantile function estimation method for censored samples.
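Probability weighted moments beta_r = E[X F(X)^r] have a standard unbiased order-statistics estimator, sketched below for a complete (uncensored) sample; the paper's partial PWM variant for censored data is not reproduced here.

```python
import numpy as np
from math import comb

def sample_pwm(x, r):
    """Unbiased estimator of beta_r = E[X * F(X)^r] from order statistics:
    b_r = (1/n) * sum_j C(j-1, r)/C(n-1, r) * x_(j),  j = 1..n."""
    xs = np.sort(x)
    n = xs.size
    w = np.array([comb(j, r) / comb(n - 1, r) for j in range(n)])  # j = rank-1
    return float(np.mean(w * xs))

rng = np.random.default_rng(7)
x = rng.exponential(2.0, size=100000)

b0 = sample_pwm(x, 0)   # beta_0 = E[X] = theta = 2 for Exp(scale=2)
b1 = sample_pwm(x, 1)   # beta_1 = theta*(1 - 1/4) = 3*theta/4 = 1.5 here
```

For the exponential, beta_r = theta * sum_k C(r,k)(-1)^k/(k+1)^2, which gives the reference values in the comments.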

14.
The bivariate Weibull distribution can address the life of a system exhibiting two-dimensional characteristics in risk and reliability engineering. Its applicability has been hindered by the difficulty of parameter estimation, as the number of parameters in a bivariate Weibull distribution exceeds that of the univariate Weibull distribution. Considering a particular structure of a bivariate Weibull distribution model, this paper proposes a generalized moment method (GMM) for parameter estimation. The GMM is simple, has proved to be efficient, and guarantees the existence and uniqueness of the solution. A confidence interval for each estimator is derived from the moments of the bivariate distribution. The paper presents a simulation case and two real cases to demonstrate the proposed methods.

15.
In some statistical process control (SPC) applications, it is assumed that a quality characteristic, or a vector of quality characteristics of interest, follows a univariate or multivariate normal distribution, respectively. However, in certain applications this assumption may fail to hold and could lead to misleading results. In this paper, we study the effect of non‐normality when the quality of a process or product is characterized by a linear profile. Skewed and heavy‐tailed symmetric non‐normal distributions are used to evaluate the non‐normality effect numerically. The results reveal that the method proposed by Kim et al. (J. Qual. Technol. 2003; 35:317–328) can be designed to be robust to non‐normality for both highly skewed and heavy‐tailed distributions. Copyright © 2010 John Wiley & Sons, Ltd.

16.
In this paper, a new approach for the evaluation of the probability density function (pdf) of a random variable from the knowledge of its lower moments is presented. First the classical moment problem (MP) is revisited, which gives the conditions under which an assigned sequence of sample moments really represents a sequence of moments of some distribution. Then an alternative approach is presented, termed by the authors the kernel density maximum entropy (MaxEnt) method, which approximates the target pdf as a convex linear combination of kernel densities, transforming the original MP into a discrete MP that is solved through a MaxEnt approach. In this way, by simply solving a discrete MaxEnt problem that does not require the evaluation of numerical integrals, an approximating pdf converging toward the MaxEnt pdf is obtained. The method is first demonstrated by approximating some known analytical pdfs (the chi‐square and Gumbel pdfs) and then applied to some experimental engineering problems, namely modelling the pdf of concrete strength, the circular frequency and damping ratio of strong ground motions, and the extreme wind speed in the Messina Strait region. All the numerical applications show the goodness and efficacy of the proposed procedure. Copyright © 2008 John Wiley & Sons, Ltd.

17.
Reliability analysis of single-degree-of-freedom nonlinear systems with random parameters
A reliability method for single-degree-of-freedom nonlinear vibration systems with random parameters is described. The fourth-moment technique is used to determine the first four moments of the system response and of the state function; an Edgeworth series expansion then expresses the unknown probability distributions of the response and the state function in terms of the standard normal distribution, from which the reliability of the system is obtained.
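The Edgeworth step can be sketched with the standard four-moment expansion in Hermite polynomials (gamma1 = skewness, gamma2 = excess kurtosis of the standardized variable); the coefficients below are the classical ones, not taken from the paper.

```python
import numpy as np
from scipy import stats

def edgeworth_cdf(z, skew, ex_kurt):
    """Four-moment Edgeworth approximation to the CDF of a standardized
    random variable: F(z) ~ Phi(z) - phi(z) * [g1/6*He2 + g2/24*He3
    + g1^2/72*He5], with probabilists' Hermite polynomials."""
    He2 = z**2 - 1
    He3 = z**3 - 3 * z
    He5 = z**5 - 10 * z**3 + 15 * z
    phi = stats.norm.pdf(z)
    return stats.norm.cdf(z) - phi * (skew / 6 * He2
                                      + ex_kurt / 24 * He3
                                      + skew**2 / 72 * He5)

# Zero skew and excess kurtosis recover the normal CDF exactly;
# for a state function g with mean mg, sd sg (failure when g < 0),
# the failure probability is approximated at z = -mg/sg.
f_gauss = edgeworth_cdf(1.0, 0.0, 0.0)
f_skewed = edgeworth_cdf(0.0, 0.3, 0.0)
```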

18.
The beta exponential distribution
The exponential distribution is perhaps the most widely applied statistical distribution for problems in reliability. In this note, we introduce a generalization, referred to as the beta exponential distribution, generated from the logit of a beta random variable. We provide a comprehensive treatment of the mathematical properties of the beta exponential distribution. We derive expressions for the moment generating function, characteristic function, the first four moments, variance, skewness, kurtosis, mean deviation about the mean, mean deviation about the median, Rényi entropy, Shannon entropy, the distribution of sums and ratios, and the asymptotic distribution of the extreme order statistics. We also discuss simulation issues and estimation by the methods of moments and maximum likelihood, and provide an expression for the Fisher information matrix. We hope that this generalization will attract wider applicability in reliability.
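The logit-of-beta construction gives a direct simulation route: if V ~ Beta(a, b), then X = -ln(1-V)/lam has the beta exponential CDF, and since 1-V ~ Beta(b, a), the mean follows from E[ln U] for a beta variable as (psi(a+b) - psi(b))/lam. A sketch verifying this by simulation:

```python
import numpy as np
from scipy import stats
from scipy.special import digamma

a, b, lam = 2.0, 3.0, 1.5
rng = np.random.default_rng(8)

# Simulate BE(a, b, lam) via its beta-logit construction
v = stats.beta.rvs(a, b, size=400000, random_state=rng)
x = -np.log1p(-v) / lam

# E[X] = (psi(a+b) - psi(b)) / lam, since 1-V ~ Beta(b, a)
mean_exact = (digamma(a + b) - digamma(b)) / lam
```

With a = 1 the construction collapses to an ordinary exponential (with rate b*lam), showing how the family generalizes the exponential.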

19.
Robust parameter design (RPD) and tolerance design (TD) are two important stages in the design process for quality improvement. Simultaneous optimization of RPD and TD is well established on the basis of linear models with a constant-variance assumption; however, little attention has been paid to RPD and TD with non‐constant variance of residuals or non‐normal responses. In order to obtain further quality improvement and cost reduction, a hybrid approach for simultaneous optimization of RPD and TD with non‐constant variance or non‐normal responses is proposed based on generalized linear models (GLMs). First, the mathematical relationship among the process mean, process variance, control factors, noise factors and tolerances is derived from a dual‐response approach based on GLMs, and a quality loss function integrating tolerance is developed. Second, a total cost model for concurrent RPD‐TD optimization based on GLMs is proposed to determine the best control factor settings and the optimal tolerance values simultaneously, and it is solved in detail by a genetic algorithm. Finally, the proposed approach is applied to an example of electronic circuit design with non‐constant variance, and the results show that it performs better in terms of quality improvement and cost reduction. Copyright © 2012 John Wiley & Sons, Ltd.

20.
Wind-resistant design of building envelopes requires accurate estimation of non-Gaussian wind pressure extremes or peak factors. For peak-factor estimation of non-Gaussian wind pressures, the commonly used moment-based translation process methods are the Hermite polynomial model (HPM), the Johnson transformation model (JTM), and the shifted generalized lognormal distribution (SGLD) model. Extremes are generally governed by the tail of the parent probability density function (PDF); at present, the differences among the parent-PDF tails predicted by the three models from the same first four moments remain unclear, and consequently so do the differences among their predicted extremes or peak factors. To clarify the similarities and differences of the three models and thereby provide some guidance for model selection, this paper systematically compares their performance in estimating non-Gaussian wind pressure peak factors. First, the differences among the parent PDFs, and among the peak factors, predicted by the three methods are compared theoretically; then, long-duration wind tunnel pressure data are used to examine the three methods' estimates of non-Gaussian wind pressure peak factors. The results show that, within the skewness-kurtosis range where all three models apply, the HPM estimates the non-Gaussian wind pressure peak factor more accurately than the SGLD and JTM models.
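The HPM step can be sketched with Winterstein's moment-based Hermite coefficients for a softening (kurtosis above 3) process; the closed forms below are the commonly used approximations, and the peak-factor mapping is a simplified translation of Davenport's Gaussian factor, so details may differ from the paper's implementation.

```python
import numpy as np

EULER = 0.5772156649

def hermite_peak_factor(skew, ex_kurt, nu0T):
    """Winterstein-type Hermite translation of the Gaussian peak factor.
    skew, ex_kurt: skewness and excess kurtosis of the standardized process;
    nu0T: zero-upcrossing rate times observation duration."""
    # Moment-based closed-form Hermite coefficients (softening case)
    h4 = (np.sqrt(1.0 + 1.5 * ex_kurt) - 1.0) / 18.0
    h3 = skew / (4.0 + 2.0 * np.sqrt(1.0 + 1.5 * ex_kurt))
    kappa = 1.0 / np.sqrt(1.0 + 2.0 * h3**2 + 6.0 * h4**2)

    # Davenport's Gaussian peak factor
    g = np.sqrt(2.0 * np.log(nu0T)) + EULER / np.sqrt(2.0 * np.log(nu0T))
    # Map the Gaussian peak through the cubic Hermite transform
    return kappa * (g + h3 * (g**2 - 1.0) + h4 * (g**3 - 3.0 * g))

g_gauss = hermite_peak_factor(0.0, 0.0, 6000.0)   # reduces to Davenport's g
g_soft = hermite_peak_factor(0.5, 1.2, 6000.0)    # heavier-tailed pressure
```

In the Gaussian limit (zero skewness and excess kurtosis) the transform is the identity, while positive excess kurtosis inflates the peak factor, which is the qualitative behaviour the comparison above examines.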


Copyright©北京勤云科技发展有限公司  京ICP备09084417号