Similar literature: 20 documents found.
1.
Knowing the time of changes in mean and variance in a process is crucial for engineers to identify the special cause quickly and correctly. Because assignable causes may give rise to changes in mean and variance at the same time, monitoring the mean and variance simultaneously is required. In this paper, a mixture likelihood approach is proposed to detect shifts in mean and variance simultaneously in a normal process. We first transform the change point model formulation into a mixture model and then employ the expectation–maximization algorithm to estimate the time of shifts in mean and variance simultaneously. The proposed method, called EMCP (expectation–maximization change point), can be used in both phase I and II applications without knowledge of the in-control process parameters. Moreover, EMCP can detect the times of multiple shifts and simultaneously produce estimates of the shifts in each individual segment. Numerical data and real datasets are employed to compare EMCP with the direct statistical maximum likelihood method that does not use mixture models. The experimental results show the superiority and effectiveness of the proposed EMCP. The outperformance of EMCP in detecting the time of small shifts is particularly important and beneficial for engineers to identify assignable causes rapidly and accurately in phase II applications, in which small shifts occur more often and hence lead to a large average run length. Copyright © 2015 John Wiley & Sons, Ltd.

2.
Reliability approximation using finite Weibull mixture distributions
The shape of measured or design life distributions of systems can vary considerably and therefore frequently cannot be approximated by simple distribution functions. The aim of the paper is to show that the reliability of an arbitrary system can be approximated well by a finite Weibull mixture with positive component weights only, without knowing the structure of the system, on the condition that the unknown parameters of the mixture can be estimated. To support the main idea, five examples are presented. In order to estimate the unknown component parameters and the component weights of the Weibull mixture, some existing methods are applied and the EM algorithm for the m-fold Weibull mixture is derived. The fitted distributions obtained by the different methods are compared to the empirical ones by calculating the AIC and δC values. It can be concluded that the suggested Weibull mixture with an arbitrary but finite number of components is suitable for lifetime data approximation. For parameter estimation, the combination of the alternative and EM algorithms is suggested.
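The paper's central object, a finite Weibull mixture with positive weights, has reliability R(t) = Σᵢ wᵢ exp(−(t/ηᵢ)^βᵢ); a minimal sketch (all parameter values illustrative, not from the paper):

```python
import numpy as np

def weibull_mixture_reliability(t, weights, shapes, scales):
    """R(t) = sum_i w_i * exp(-(t/eta_i)**beta_i) for a finite Weibull mixture;
    `weights` are positive and sum to one."""
    t = np.asarray(t, dtype=float)
    R = np.zeros_like(t)
    for w, beta, eta in zip(weights, shapes, scales):
        R += w * np.exp(-(t / eta) ** beta)
    return R
```

Any valid mixture of this form satisfies R(0) = 1 and is monotone non-increasing, which makes a quick sanity check possible.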

3.
Traditional shape profile monitoring of product geometric features mostly focuses on one type or mode of shapes in the discrete-part manufacturing. Little attention has been paid to monitoring of multimode shape profiles, where different modes of shapes appear in a sample in the batch production process. Motivated by a real example of a powder material production process, we exploit the statistical process monitoring of multimode near-circular shape profiles. First, we develop a feature extraction approach that is invariant to shape rotation and thus requires no registration for a mixture of different modes of shape profiles. The extracted feature vectors capture shape features well, based on which different modes of shape profiles are separated into several clusters. This enables us to build a Gaussian mixture model for the multimodality in the feature vector space. In process surveillance, a control chart is constructed based on the likelihood ratio test for detecting shifts in both the proportions and the shape features of multimode near-circular shape profiles. Numerical simulations and real case studies demonstrate the effectiveness of our proposed chart.
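One standard way to obtain rotation-invariant features for near-circular profiles (a sketch of the general idea, not necessarily the authors' exact extraction) is to take magnitudes of Fourier coefficients of the sampled radius profile: a rotation becomes a circular shift of the samples, which changes only the phases, not the magnitudes.

```python
import numpy as np

def rotation_invariant_features(radii, k=5):
    """Magnitudes of the first k nonzero-frequency Fourier coefficients of the
    radial profile. Rotating the shape circularly shifts `radii`, altering only
    FFT phases, so these magnitudes are rotation-invariant."""
    F = np.fft.rfft(np.asarray(radii, dtype=float))
    return np.abs(F[1:k + 1]) / len(radii)
```

The invariance can be verified directly: a circularly shifted profile yields identical features.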

4.
G. S. Lingappaiah, TEST, 1984, 35(3): 319-330
Summary: A discrete analogue of the Liouville distribution is defined and termed the discrete generalized Liouville-type distribution (DGL-TD). First, properties of its factorial and ordinary moments are given. Then, by finding the covariance matrix, the partial and multiple correlations for the DGL-TD are evaluated. The multinomial, multivariate negative binomial, and multivariate logarithmic series distributions are shown to be particular cases of this general distribution. The asymptotic distribution of the estimates of the parameters is also considered.

5.
Though the analysis of variance is a commonly applied method for testing for differences between the means of several processes, it is based in part on the assumption that the processes give rise to output that is normally distributed on the measured variable. Reliability and life test studies frequently yield data that exhibit clear skew, and application of the analysis of variance is questionable in such cases. A method referred to as analysis of reciprocals, which is based on an assumed inverse Gaussian distribution, provides an alternative to the analysis of variance in these instances. With applications in a variety of functional areas, including reliability and life testing, the inverse Gaussian distribution is able to accommodate substantial skew. It is hoped that this exposition will increase awareness of both the inverse Gaussian distribution and the data analysis methods that are based on it.
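For reference, the inverse Gaussian maximum-likelihood estimates underlying the analysis of reciprocals have closed forms: μ̂ is the sample mean and λ̂ = n / Σ(1/xᵢ − 1/μ̂). A minimal sketch:

```python
import numpy as np

def inverse_gaussian_mle(x):
    """Closed-form MLEs for the inverse Gaussian IG(mu, lambda):
    mu_hat = sample mean; lambda_hat = n / sum(1/x_i - 1/mu_hat)."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    lam = len(x) / np.sum(1.0 / x - 1.0 / mu)
    return mu, lam
```

NumPy's `wald(mean, scale)` sampler draws from the inverse Gaussian, so the estimators can be checked against known parameters on a large sample.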

6.
Gilles R. Ducharme, TEST, 2001, 10(2): 271-290
In this paper, tests of goodness-of-fit for the inverse Gaussian distribution are developed. The distribution involves a shape parameter and, because of this, some test approaches lead to inconsistent strategies. A consistent test is proposed and its properties investigated. A table of critical points is provided and both the level and the power of the test are explored by simulation. It is seen that the test is more powerful than most of its competitors. The framework is widened to cover satellite distributions of the inverse Gaussian and some types of censored data. An example concludes the paper.

7.
The adequacy of the model originally developed by Charbon and Rappaz for the nucleation temperatures of grains, applied here to the formation of hydrogen pores during the solidification of aluminium alloys, has been investigated. Using four datasets from the literature, it has been found that the Gaussian distribution assumed in the original model for the nucleation temperature provides poor fits to all datasets, with systematic error. The hypothesis that undercooling follows the lognormal distribution has been tested. In all four cases, the hypothesis that undercooling is lognormal could not be rejected.

8.
Lifetime data collected from reliability tests often exhibit significant heterogeneity caused by variations in manufacturing, which makes standard lifetime models inadequate. Finite mixture models provide more flexibility for modeling such data. In this paper, the Weibull-log-logistic mixture distribution model is introduced as a new class of flexible models for heterogeneous lifetime data. Some statistical properties of the model are presented, including the failure rate function, moment generating function, and characteristic function. The identifiability of the class of all finite mixtures of Weibull-log-logistic distributions is proved. The maximum likelihood estimation (MLE) of the model parameters under the Type I and Type II censoring schemes is derived. Some numerical illustrations are performed to study the behavior of the obtained estimators. The model is applied to the hard drive failure data published by the Backblaze data center, where it is found that the proposed model provides more flexibility than the univariate life distributions (Weibull, exponential, logistic, log-logistic, Fréchet). The failure rate of hard disk drives (HDDs) is obtained based on the MLE estimates. The analysis of the failure rate function on the basis of SMART attributes shows that the failure of HDDs can have different causes and mechanisms.

9.
This paper puts forward a two-dimensional probability distribution. The first 'wing' of this distribution is the device lifetime. The second 'wing' is the most promising reliability indicator, namely the 1/f noise factor. The model is intended to serve as a basis for reliability screening. It involves the noise-reliability correlation coefficient and has an attractive engineering interpretation. It ascribes a two-element series reliability structure to the device. The first element of this structure is noise-independent and 'real' in the sense that its hazard rate is always of positive value. The second element is noise-dependent. The element is imaginary in the sense that its hazard rate can be changed from positive to negative value as the noise-reliability correlation coefficient changes in the same direction. Copyright © 2000 John Wiley & Sons, Ltd.

10.
Progressive censoring technique is useful in lifetime data analysis. Simple approaches to progressive data analysis are crucial for its widespread adoption by reliability engineers. This study develops an efficient yet easy-to-implement framework for analyzing progressively censored data by making use of the stochastic EM algorithm. On the basis of this framework, we develop specific stochastic EM procedures for several popular lifetime models. These procedures are shown to be very simple. We then demonstrate the applicability and efficiency of the stochastic EM algorithm by a fatigue life data set with proper modification and by a progressively censored data set from a life test on hard disk drives. Copyright © 2013 John Wiley & Sons, Ltd.
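As a toy illustration of the stochastic EM idea for censored lifetime data (a plain exponential model with fixed right-censoring, not the paper's progressive schemes or data), the S-step imputes each censored lifetime from the conditional distribution and the M-step is the complete-data MLE:

```python
import numpy as np

def stochastic_em_exponential(obs, cens, n_iter=200, seed=0):
    """Stochastic EM for exponential lifetimes with right-censoring.
    S-step: impute each censored time as cens_i + Exp(theta) (memorylessness).
    M-step: complete-data MLE of the mean (the sample mean). Illustrative sketch."""
    rng = np.random.default_rng(seed)
    theta = np.mean(np.concatenate([obs, cens]))   # crude starting value
    for _ in range(n_iter):
        imputed = cens + rng.exponential(theta, size=len(cens))
        theta = np.mean(np.concatenate([obs, imputed]))
    return theta
```

The fixed point of the iteration coincides with the usual censored-data MLE of the exponential mean, so on simulated data the final iterate hovers near the true value.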

11.
The Gaussian mixture probability density model can fit the probability density of non-Gaussian samples well. Under the condition that the probability densities of the Gaussian components do not overlap, the dynamic cluster algorithm can estimate the parameters of the Gaussian mixture density model quickly and accurately. It is a recursive algorithm based on the minimum mean-square-error principle: after all possible cluster boundaries are derived in a forward pass, the preceding cluster boundaries are inferred backward from the determined final boundary, yielding the parameter estimates of the Gaussian mixture density model. After the model and the parameter estimation problem are described, the dynamic cluster algorithm is derived. The essence of the algorithm and its conditions of applicability are then examined in depth. Finally, the estimation performance of the dynamic cluster algorithm is analysed through numerical simulation examples.
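In the non-overlapping-components setting assumed above, a minimum-squared-error boundary on the sorted sample cleanly separates the components. The sketch below (two components only, an illustration of the boundary-search idea rather than the full dynamic cluster algorithm) finds that boundary by exhaustive scan and reads off the mixture weights, means, and standard deviations:

```python
import numpy as np

def split_two_gaussians(x):
    """For well-separated 1-D components: pick the boundary index on sorted data
    that minimizes total within-segment squared error, then return the
    (weight, mean, std) of each resulting segment."""
    xs = np.sort(np.asarray(x, dtype=float))
    n = len(xs)
    best_sse, best_k = np.inf, None
    for k in range(2, n - 1):
        sse = xs[:k].var() * k + xs[k:].var() * (n - k)
        if sse < best_sse:
            best_sse, best_k = sse, k
    k = best_k
    return [(len(seg) / n, seg.mean(), seg.std()) for seg in (xs[:k], xs[k:])]
```

With widely separated components, the recovered weights and means match the generating mixture closely.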

12.
In the evaluation of structural reliability, a failure is defined as the event in which stress exceeds a resistance that is liable to deterioration. This paper presents a method to combine the two stochastic processes of deteriorating resistance and fluctuating load for computing the time-dependent reliability of a structural component. The deterioration process is modelled as a gamma process, which is a stochastic process with independent non-negative increments having a gamma distribution with identical scale parameter. The stochastic process of loads is generated by a Poisson process. The variability of the random loads is modelled by a peaks-over-threshold distribution (such as the generalised Pareto distribution). These stochastic processes of deterioration and load are combined to evaluate the time-dependent reliability.
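A Monte-Carlo sketch of the combined model (parameter names and values are illustrative, not from the paper): resistance starts at r0 and loses a stationary gamma process of wear; loads arrive as a Poisson process with generalized-Pareto peak sizes, and the component survives if no load exceeds the resistance remaining at its arrival time.

```python
import numpy as np

def mc_survival_probability(T, r0, a, b, lam, xi, sigma, n_sim=2000, seed=0):
    """Estimate P(no load exceeds current resistance up to time T).
    Wear: gamma process with shape rate a, scale 1/b. Loads: Poisson(lam)
    arrivals with generalized-Pareto peaks (shape xi > 0, scale sigma)."""
    rng = np.random.default_rng(seed)
    survived = 0
    for _ in range(n_sim):
        n_loads = rng.poisson(lam * T)
        times = np.sort(rng.uniform(0.0, T, n_loads))
        dt = np.diff(np.concatenate(([0.0], times)))
        wear = np.cumsum(rng.gamma(a * dt, 1.0 / b)) if n_loads else np.zeros(0)
        # inverse-CDF sampling of generalized-Pareto peak sizes
        u = rng.uniform(size=n_loads)
        loads = sigma / xi * ((1.0 - u) ** (-xi) - 1.0)
        if np.all(loads < r0 - wear):
            survived += 1
    return survived / n_sim
```

With a generous initial margin the estimated survival probability should stay high over a short horizon.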

13.
An alternative perspective on the mixture estimation problem
The paper presents an alternative perspective on the mixture estimation problem. First, the observations are counted into a histogram. Second, rough and then enhanced parameter estimates are obtained, followed by the separation of the observations. Finally, the residue is distributed between the components by the Bayes decision rule. The number of components, the mixture component parameters, and the component weights are modelled jointly; no initial parameter estimates are required; the approach is numerically stable; the number of components has no influence upon convergence; and the speed of convergence is very high. The alternative perspective is compared to the EM algorithm and verified on several data sets. The presented algorithm shows significant advantages over the competing methods and has already been successfully applied in reliability and fatigue analyses.

14.
Software reliability modeling is of great significance in improving software quality and managing the software development process. However, existing methods are not able to accurately model software reliability improvement behavior, because single-model methods rely on restrictive assumptions and combination models cannot deal well with model uncertainty. In this article, we propose a Bayesian model averaging (BMA) method for modeling software reliability. First, existing reliability modeling methods are selected as the candidate models, and Bayesian theory is used to obtain the posterior probability of each reliability model. Then, the posterior probabilities are used as weights to average the candidate models. Both the Markov chain Monte Carlo (MCMC) algorithm and the expectation–maximization (EM) algorithm are used to evaluate a candidate model's posterior probability, for comparison purposes. The results show that the BMA method has superior performance in software reliability modeling, and that the MCMC algorithm performs better than the EM algorithm when they are used to estimate the parameters of the BMA method.
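A common practical approximation to the posterior model probabilities used as BMA weights (a BIC-based sketch, much lighter than the MCMC/EM machinery the article evaluates):

```python
import numpy as np

def bma_weights(log_likes, n_params, n_obs):
    """BIC-approximated posterior model probabilities for model averaging:
    BIC_m = -2*loglik_m + k_m*ln(n); weight_m proportional to exp(-BIC_m / 2)."""
    bic = -2.0 * np.asarray(log_likes, dtype=float) \
          + np.asarray(n_params, dtype=float) * np.log(n_obs)
    w = np.exp(-0.5 * (bic - bic.min()))   # subtract min for numerical stability
    return w / w.sum()
```

The weights sum to one, and a model with higher log-likelihood at equal complexity receives more weight.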

15.
This article presents an innovative framework for an inverse problem: the extension of a global optimization algorithm to estimate not only an optimal set of modeling parameters but also their optimal distributions. Owing to its characteristics, the differential evolution algorithm is used to demonstrate this extension, although other population-based algorithms may be considered. The adaptive empirical distributions algorithm is introduced here for the same purpose. Both schemes rely on minimizing the dissimilarity between the empirical cumulative distribution functions of two data sets, using a goodness-of-fit test to evaluate their resemblance.

16.
This paper proposes an efficient metamodeling approach for uncertainty quantification of complex systems based on the Gaussian process model (GPM). The proposed GPM-based method is able to efficiently and accurately calculate the mean and variance of model outputs with uncertain parameters specified by arbitrary probability distributions. Because of the use of the GPM, closed-form expressions for the mean and variance can be derived by decomposing high-dimensional integrals into one-dimensional integrals. This paper details how to efficiently compute the one-dimensional integrals. When the parameters are either uniformly or normally distributed, the one-dimensional integrals can be evaluated analytically, while when the parameters do not follow normal or uniform distributions, this paper adopts the effective Gaussian quadrature technique for fast computation of the one-dimensional integrals. As a result, the developed GPM method is able to calculate the mean and variance of model outputs efficiently, independent of the parameter distributions. The proposed GPM method is applied to a collection of examples, and its accuracy and efficiency are compared with Monte Carlo simulation, which is used as the benchmark solution. Results show that the proposed GPM method is feasible and reliable for efficient uncertainty quantification of complex systems in terms of computational accuracy and efficiency. Copyright © 2016 John Wiley & Sons, Ltd.
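For a normally distributed parameter, one-dimensional integrals of the kind the abstract mentions reduce to Gauss–Hermite quadrature; a minimal sketch (the quadrature step only, not the paper's full GPM decomposition) of computing the mean and variance of f(X) for X ~ N(μ, σ²):

```python
import numpy as np

def gauss_hermite_mean_var(f, mu, sigma, n=20):
    """E[f(X)] and Var[f(X)] for X ~ N(mu, sigma^2) by Gauss-Hermite quadrature:
    E[f(X)] = (1/sqrt(pi)) * sum_i w_i * f(mu + sqrt(2)*sigma*x_i)."""
    x, w = np.polynomial.hermite.hermgauss(n)
    vals = f(mu + np.sqrt(2.0) * sigma * x)
    mean = np.sum(w * vals) / np.sqrt(np.pi)
    second = np.sum(w * vals ** 2) / np.sqrt(np.pi)
    return mean, second - mean ** 2
```

For polynomial integrands up to degree 2n − 1 the rule is exact, e.g. f(x) = x² with X ~ N(0, 1) gives E = 1 and Var = 2.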

17.
Many highly reliable products have complex structure, with their reliability being evaluated through two or more performance characteristics. In certain physical situations, the degradation of these performance characteristics is always positive and strictly increasing. In such a case, the gamma process is usually considered as the degradation process, owing to its independent, non-negative increments. In this paper, we suppose that a product has two dependent performance characteristics and that their degradation can be modeled by gamma processes. For such bivariate degradation involving two performance characteristics, we propose to use a bivariate Birnbaum-Saunders distribution and its marginal distributions to approximate the reliability function. An inferential method for the corresponding model parameters is then developed. Finally, to illustrate the proposed model and method, a numerical example about fatigue cracks is discussed and some computational results are presented.
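For a single gamma-process degradation path, the marginal reliability at a failure threshold has a closed form: if X_t ~ Gamma(shape = a·t, scale = 1/b), then R(t) = P(X_t < D) is the regularized lower incomplete gamma function at (a·t, b·D). A sketch using SciPy (parameter values illustrative, not from the paper):

```python
import numpy as np
from scipy.special import gammainc

def gamma_degradation_reliability(t, threshold, a, b):
    """R(t) = P(X_t < threshold), X_t ~ Gamma(shape=a*t, scale=1/b):
    the regularized lower incomplete gamma function at (a*t, b*threshold)."""
    t = np.asarray(t, dtype=float)
    return gammainc(a * t, b * threshold)
```

Since the degradation is stochastically increasing in t, R(t) is non-increasing, starting near 1 and decaying as mean degradation overtakes the threshold.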

18.
We consider the problem of comparing two multivariate compound distributions, each with component densities f1(x) and f2(x), and with mixing proportions px and (1 − px) for the first compound distribution and py and (1 − py) for the second. The likelihood ratio test of the hypothesis H0: px = py is presented, using the EM algorithm to derive the maximum likelihood estimates. This test is applied to some actual data under the assumption that the underlying component densities are normal. The result is contrasted with results obtained using some existing univariate methods of testing H0.

19.
The growing interest in non-azeotropic refrigerant mixtures for heat pumps and appliances has led to the need for a method of analysing the refrigeration cycle properties of such mixtures. Thermodynamic property tables are not sufficient, because they give no information about the two-phase region, where temperature and vapour and liquid compositions are not constant at constant pressure. This paper describes the computer program CYCLE, which can perform thermodynamic property calculations for subcooled, two-phase, and superheated non-azeotropic mixtures and can analyse a simple refrigerating cycle.

20.
This article describes how the Jacobian is found for certain functions of a singular random matrix, both in the general case and in that of a non-negative definite random matrix. The Jacobian of the transformation V = S^2 is found when S is non-negative definite; in addition, the Jacobian of the transformation Y = X^+ is determined when X^+ is the generalized, or Moore-Penrose, inverse of X. Expressions for the densities of the generalized inverses of the central beta and F singular random matrices are proposed. Finally, two applications in the field of Bayesian inference are presented. This work was supported in part by research project 39017E of CONACYT-México. This article was written during the first author's stay as a Visiting Professor in the Department of Statistics at the University of Granada, Spain.

