20 similar documents retrieved (search time: 15 ms)
1.
Vicente G. Cancho Francisco Louzada-Neto Gladys D.C. Barriga 《Computational statistics & data analysis》2011,55(1):677-686
In this paper we propose a new two-parameter lifetime distribution with increasing failure rate. The new distribution arises from a latent complementary risk problem. The properties of the proposed distribution are discussed, including a formal proof of its probability density function and explicit algebraic formulae for its reliability and failure rate functions, quantiles and moments, including the mean and variance. A simple EM-type algorithm for iteratively computing maximum likelihood estimates is presented. The Fisher information matrix is derived analytically in order to obtain the asymptotic covariance matrix. The methodology is illustrated on a real data set.
2.
Boris I. Godoy Graham C. Goodwin Juan C. Agüero Damián Marelli Torbjörn Wigren 《Automatica》2011,(9):1905-1915
In this paper, we present a novel algorithm for estimating the parameters of a linear system when the observed output signal is quantized. This question has relevance to many areas including sensor networks and telecommunications. The algorithms described here have closed-form solutions for the SISO case. However, for the MIMO case, a set of pre-computed scenarios is used to reduce the computational complexity of the EM-type algorithms that are typically deployed for this kind of problem. Comparisons are made with other algorithms previously described in the literature, as well as with implementations of algorithms based on the quasi-Newton method.
3.
X-ray pulsar navigation (XPNAV) is an attractive method for future autonomous deep-space navigation. Currently, techniques for estimating the phase of X-ray pulsar radiation involve maximizing non-convex objective functions based on the average profile obtained from epoch folding. This suppresses useful information and leads to highly complex computation. In this paper, a new maximum likelihood (ML) phase estimation method that directly uses the measured times of arrival (TOAs) is presented. The X-ray pulsar radiation is treated as a cyclo-stationary process, and the TOAs of the photons within a period are redefined as a new process whose probability distribution function is the normalized standard profile of the pulsar. We demonstrate that the new process is equivalent to the commonly used Poisson model. The phase estimation problem is then recast as estimation of a cyclic shift parameter under ML, and we also put forward a parallel ML estimation method to improve the ML solution. Numerical simulation results show that the estimator described here achieves higher precision and lower computational complexity than currently used estimators.
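As a rough illustration of the idea of treating folded photon arrival times as draws from a cyclically shifted normalized profile, the sketch below estimates the phase by a grid search over the shift that maximizes the log-likelihood. It is a minimal, single-threaded illustration under that assumption, not the authors' parallel ML scheme, and the profile array, period and TOA inputs are hypothetical.

```python
import numpy as np

def ml_phase(toas, period, profile, n_grid=2000):
    """Grid-search ML estimate of the pulse phase.

    toas    : photon times of arrival
    profile : normalized standard profile sampled on a uniform grid over one
              period (treated as the pdf of the folded arrival phases)
    """
    nbins = len(profile)
    folded = np.mod(toas, period) / period               # phases in [0, 1)
    shifts = np.linspace(0.0, 1.0, n_grid, endpoint=False)
    loglik = np.empty(n_grid)
    for j, phi in enumerate(shifts):
        idx = np.floor(((folded - phi) % 1.0) * nbins).astype(int) % nbins
        loglik[j] = np.log(profile[idx] + 1e-300).sum()  # guard against log(0)
    return shifts[np.argmax(loglik)]
```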
4.
5.
We introduce a new class of long-term survival models by assuming that the number of competing causes belongs to a class of mixed Poisson distributions, which are overdispersed. More specifically, we suppose that the number of competing causes follows a Poisson distribution whose mean depends on a positive continuous random variable belonging to the exponential family. With this, we obtain a general class of distributions for the number of competing causes, which includes, for example, the negative binomial, Poisson-inverse Gaussian and Poisson-generalized hyperbolic secant distributions. Therefore, our long-term survival models can be viewed as heterogeneous promotion models. We present some statistical properties of our models and show that the promotion model is obtained as a limiting case. Some special models of the proposed class are discussed in detail. We allow the expected number of competing causes to depend on covariates, so that the cure rate can be modeled directly through covariates. Estimation by maximum likelihood and inference for the model parameters are discussed. In particular, we state sufficient conditions for the maximum likelihood estimators to be consistent and asymptotically normally distributed. A small simulation study is presented in order to check the finite-sample behavior of the maximum likelihood estimators and to illustrate the importance of our models when significant covariates are not observed. We analyze a real data set from a melanoma clinical trial to illustrate the practical potential of the proposed models.
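The construction behind such models can be summarized by a standard argument (our sketch, not taken verbatim from the paper): if M denotes the number of competing causes with probability generating function G_M, and the latent cause lifetimes are i.i.d. with common survival function S(t), then the population survival function and cure fraction are

$$ S_{\mathrm{pop}}(t) = \mathrm{E}\!\left[S(t)^{M}\right] = G_{M}\!\big(S(t)\big), \qquad \lim_{t\to\infty} S_{\mathrm{pop}}(t) = G_{M}(0) = \Pr(M=0). $$

For Poisson(\(\theta\)) causes this yields the promotion model \(S_{\mathrm{pop}}(t)=\exp\{-\theta F(t)\}\), and mixing the Poisson mean over a positive random variable produces the overdispersed (e.g. negative binomial) members referred to above.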
6.
Ammar M. Sarhan David C. HamiltonBruce Smith Debasis Kundu 《Computational statistics & data analysis》2011,55(1):644-654
The two-parameter linear failure rate distribution has been used quite successfully to analyze lifetime data. Recently, a new three-parameter distribution, known as the generalized linear failure rate distribution, has been introduced by exponentiating the linear failure rate distribution. The generalized linear failure rate distribution is a very flexible lifetime distribution, and its probability density function can take different shapes. Its hazard function can also be increasing, decreasing or bathtub shaped. The main aim of this paper is to introduce a bivariate generalized linear failure rate distribution, whose marginals are generalized linear failure rate distributions. It is obtained using the same approach as was adopted to obtain the Marshall-Olkin bivariate exponential distribution. Different properties of this new distribution are established. The bivariate generalized linear failure rate distribution has five parameters, and the maximum likelihood estimators are obtained using the EM algorithm. A data set is analyzed for illustrative purposes. Finally, some generalizations to the multivariate case are proposed.
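For reference, the Marshall-Olkin bivariate exponential construction mentioned above couples two lifetimes through a common shock: with independent U_i ~ Exp(\(\lambda_i\)), i = 1, 2, 3, one sets X_1 = min(U_1, U_3) and X_2 = min(U_2, U_3), so that

$$ \Pr(X_1 > x_1,\, X_2 > x_2) = \exp\{-\lambda_1 x_1 - \lambda_2 x_2 - \lambda_3 \max(x_1, x_2)\}, $$

which has an absolutely continuous part and a singular part on the line x_1 = x_2. The bivariate generalized linear failure rate distribution applies the same shared-component idea to generalized linear failure rate components (this is only a sketch of the analogy, not the paper's exact parameterization), and handling such a latent shared component is a natural setting for EM-based maximum likelihood estimation.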
7.
In this paper we discuss the one-parameter Lindley distribution. It is suggested that it may serve as a useful reliability model. The model properties and reliability measures are derived and studied in detail. For estimation of the parameter and other reliability characteristics, maximum likelihood and Bayes approaches are used. Interval estimation and coverage probability for the parameter are obtained based on maximum likelihood estimation. A Monte Carlo simulation study is conducted to compare the performance of the various estimates developed. In view of cost and time constraints, progressively Type II censored sample data are used in the estimation. A real data example is given for illustration.
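For the complete-sample case (the paper itself works with progressively Type II censored data), the Lindley maximum likelihood estimator has a closed form; the sketch below is a minimal illustration of that baseline case only.

```python
import numpy as np

def lindley_mle(x):
    """Closed-form MLE of theta for the Lindley density
    f(x; theta) = theta**2 / (1 + theta) * (1 + x) * exp(-theta * x),  x > 0.

    Setting the score to zero gives the quadratic
    xbar*theta**2 + (xbar - 1)*theta - 2 = 0, whose positive root is the MLE.
    """
    xbar = np.mean(x)
    return (-(xbar - 1.0) + np.sqrt((xbar - 1.0) ** 2 + 8.0 * xbar)) / (2.0 * xbar)
```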
8.
Patrícia F. Paranaíba Edwin M.M. Ortega Gauss M. Cordeiro Rodrigo R. Pescim 《Computational statistics & data analysis》2011,55(2):1118-1136
For the first time, a five-parameter distribution, the so-called beta Burr XII distribution, is defined and investigated. The new distribution contains as special sub-models some well-known distributions discussed in the literature, such as the logistic, Weibull and Burr XII distributions, among several others. We derive its moment generating function. We obtain, as a special case, the moment generating function of the Burr XII distribution, which seems to be a new result. Moments, mean deviations, Bonferroni and Lorenz curves and reliability are provided. We derive two representations for the moments of the order statistics. The method of maximum likelihood and a Bayesian analysis are proposed for estimating the model parameters. The observed information matrix is obtained. For different parameter settings and sample sizes, various simulation studies are performed and compared in order to study the performance of the new distribution. An application to real data demonstrates that the new distribution can provide a better fit than other classical models. We hope that this generalization may attract wider applications in reliability, biology and lifetime data analysis.
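Assuming the usual beta-G construction (a baseline cdf G plugged into an incomplete beta function), a log-density of this form can be coded directly. The sketch below is our illustration under that assumption, not code from the paper, and the parameter names (a, b for the beta part; c, k, s for the Burr XII baseline) are ours.

```python
import numpy as np
from scipy.special import betaln

def beta_burr12_logpdf(x, a, b, c, k, s):
    """Beta-G log-density with a Burr XII baseline:
    G(x) = 1 - (1 + (x/s)**c)**(-k)
    f(x) = g(x) * G(x)**(a-1) * (1 - G(x))**(b-1) / B(a, b)
    """
    z = (x / s) ** c
    logg = np.log(c * k / s) + (c - 1) * np.log(x / s) - (k + 1) * np.log1p(z)
    logG = np.log1p(-(1.0 + z) ** (-k))      # log G(x)
    logGbar = -k * np.log1p(z)               # log (1 - G(x))
    return logg + (a - 1) * logG + (b - 1) * logGbar - betaln(a, b)
```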
9.
GFREG: a computer program for maximum likelihood regression using the Generalized F distribution
A FORTRAN program is described for maximum likelihood estimation within the Generalized F family of distributions. It can be used to estimate regression parameters in a log-linear model for censored survival times with covariates, for which the error distribution may have a great variety of shapes, including most distributions in current use in biostatistics. The optimization is performed by an algorithm based on the generalized reduced gradient method. A stepwise variable search algorithm for covariate selection is included in the program. Output features include model selection criteria, standard errors of parameter estimates, quantile and survival rate estimates with their standard errors, residuals and several plots. An example based on data from Princess Margaret Hospital, Toronto, is discussed to illustrate the program's capabilities.
10.
Estimation of the parameters of the generalized exponential distribution under progressive type-I interval censoring is studied via maximum likelihood, the method of moments and probability plots. A simulation study is conducted to compare these estimators in terms of mean squared error and bias. Finally, the estimation methods are applied to a real data set on patients with plasma cell myeloma to demonstrate their applicability.
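For a complete (uncensored) sample, the generalized exponential likelihood can be profiled over the shape parameter, so a one-dimensional numerical search suffices. The sketch below illustrates only this baseline case (the paper itself handles progressive type-I interval censoring) and the variable names are ours.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def ge_mle(x):
    """MLE of (alpha, lam) for the generalized exponential cdf
    F(x) = (1 - exp(-lam * x))**alpha, via profiling out alpha:
    for fixed lam, alpha_hat(lam) = -n / sum(log(1 - exp(-lam * x)))."""
    x = np.asarray(x, float)
    n = len(x)

    def neg_profile_loglik(lam):
        s = np.log1p(-np.exp(-lam * x)).sum()        # negative quantity
        alpha = -n / s
        return -(n * np.log(alpha) + n * np.log(lam) + (alpha - 1) * s - lam * x.sum())

    res = minimize_scalar(neg_profile_loglik, bounds=(1e-6, 100.0), method="bounded")
    lam = res.x
    alpha = -n / np.log1p(-np.exp(-lam * x)).sum()
    return alpha, lam
```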
11.
Consider a multipath signal whose individual components are known deterministic increasing or decreasing pulses. The problem is to estimate the amplitudes and delay times of the individual components. Several methods have been proposed for estimating these unknown parameters. Maximum likelihood (ML) estimation and the expectation-maximization (EM) algorithm are commonly used, but they are computationally intensive and still insufficient for obtaining accurate estimates. The method proposed here provides quick and accurate estimates of the amplitudes and arrival (delay) times, even for closely spaced multipath components in heavy noise.
12.
Nicolas Wicker Jean Muller Ravi Kiran Reddy Kalathur 《Computational statistics & data analysis》2008,52(3):1315-1322
Dirichlet distributions are natural choices to analyse data described by frequencies or proportions, since they are the simplest known distributions for such data apart from the uniform distribution. They are often used whenever proportions are involved, for example in text-mining, image analysis and biology, or as a prior of a multinomial distribution in Bayesian statistics. As the Dirichlet distribution belongs to the exponential family, its parameters can easily be inferred by maximum likelihood. Parameter estimation is usually performed with the Newton-Raphson algorithm after an initialisation step using either the method of moments or Ronning's method. However, this initialisation can result in parameters that lie outside the admissible region. A simple and very efficient alternative based on a maximum likelihood approximation is presented. The advantages of the presented method compared to two other methods are demonstrated on synthetic data sets as well as for a practical biological problem: the clustering of protein sequences based on their amino acid compositions.
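To make the estimation pipeline described above concrete, the sketch below codes the Dirichlet log-likelihood and the usual method-of-moments starting values typically fed to Newton-Raphson. It illustrates the standard initialisation being compared against, not the authors' maximum likelihood approximation.

```python
import numpy as np
from scipy.special import gammaln

def dirichlet_loglik(alpha, P):
    """Log-likelihood of Dirichlet(alpha) for an (N, K) array P of proportion
    vectors (each row sums to 1)."""
    N = P.shape[0]
    return (N * (gammaln(alpha.sum()) - gammaln(alpha).sum())
            + ((alpha - 1.0) * np.log(P).sum(axis=0)).sum())

def moments_init(P):
    """Method-of-moments starting values: with m_k = mean of p_k and
    v1 = sample variance of p_1,
    alpha_0 = m_1 * (1 - m_1) / v1 - 1   and   alpha_k = m_k * alpha_0."""
    m = P.mean(axis=0)
    v1 = P[:, 0].var(ddof=1)
    a0 = m[0] * (1.0 - m[0]) / v1 - 1.0
    return m * a0
```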
13.
The EM algorithm is a powerful technique for determining maximum likelihood estimates (MLEs) in the presence of binary data, since the maximum likelihood estimators of the parameters cannot be expressed in closed form. In this paper, we consider one-shot devices that can be used only once and are destroyed after use, so the actual observation is the condition of each device rather than its real lifetime under test. Here, we develop the EM algorithm for such data under the exponential distribution for the lifetimes. Due to advances in manufacturing design and technology, products have become highly reliable with long lifetimes. For this reason, accelerated life tests are performed to collect useful information on the parameters of the lifetime distribution. For such tests, a Bayesian approach with normal priors was recently proposed by Fan et al. (2009). Through a simulation study, we show that the EM algorithm and this Bayesian approach are both useful techniques for analyzing binary data arising from one-shot device testing. A comparative study of their performance shows that, while the Bayesian approach is good for highly reliable products, the EM algorithm is good for moderate and low reliability situations.
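A minimal sketch of an EM iteration for this setting (exponential lifetimes observed only as failed/survived indicators at inspection time tau) is given below. It ignores the accelerated-life-test covariates and is not the authors' exact formulation; the function and argument names are ours.

```python
import numpy as np

def em_exponential_oneshot(tau, failed, lam=1.0, n_iter=200):
    """EM for exponential(lam) lifetimes observed as one-shot (binary) outcomes.

    E-step (conditional expected lifetimes):
      E[T | T <= tau] = 1/lam - tau*exp(-lam*tau) / (1 - exp(-lam*tau))
      E[T | T >  tau] = tau + 1/lam            (memorylessness)
    M-step: lam = n / sum(expected lifetimes)
    """
    tau = np.asarray(tau, float)
    failed = np.asarray(failed, bool)
    for _ in range(n_iter):
        p_fail = 1.0 - np.exp(-lam * tau)
        e_fail = 1.0 / lam - tau * np.exp(-lam * tau) / p_fail
        e_surv = tau + 1.0 / lam
        lam = len(tau) / np.where(failed, e_fail, e_surv).sum()
    return lam
```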
14.
Link performance inference is undoubtedly crucial for network quality assessment; however, existing estimation methods can usually only handle simple networks with a limited number of levels and cannot be applied to large-scale networks. A maximum likelihood estimation algorithm based on incomplete data is proposed to estimate the delay distribution of internal network links. The method uses different packet-sending strategies to partition a tree-shaped network topology into different two-level, three-chain subtrees, estimates the delay of each "chain" within every subtree, and then uses a transplanting algorithm to apportion the path delays to the individual links; applying this procedure to each subtree in turn yields the delay of every link in the network. NS2 simulation experiments verify the feasibility and accuracy of the algorithm.
15.
Alice Lemos Morais Wagner Barreto-Souza 《Computational statistics & data analysis》2011,55(3):1410-1425
In this paper we introduce the Weibull power series (WPS) class of distributions, which is obtained by compounding Weibull and power series distributions, where the compounding procedure follows the same approach previously adopted by Adamidis and Loukas (1998). This new class of distributions has as a particular case the two-parameter exponential power series (EPS) class of distributions (Chahkandi and Ganjali, 2009), which contains several lifetime models such as the exponential geometric (Adamidis and Loukas, 1998), exponential Poisson (Kus, 2007) and exponential logarithmic (Tahmasbi and Rezaei, 2008) distributions. The hazard function of our class can be increasing, decreasing or upside-down bathtub shaped, among other shapes, while the hazard function of an EPS distribution is only decreasing. We obtain several properties of the WPS distributions, such as moments, order statistics, estimation by maximum likelihood and large-sample inference. Furthermore, the EM algorithm is also used to determine the maximum likelihood estimates of the parameters, and we discuss maximum entropy characterizations under suitable constraints. Special distributions are studied in some detail. Applications to two real data sets are given to show the flexibility and potential of the new class of distributions.
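The compounding step itself follows from a standard probability generating function argument (our sketch, with a Weibull survival function \(e^{-\beta x^{\alpha}}\) assumed for illustration): if N has a power series distribution with \(\Pr(N=n)=a_n\theta^n/C(\theta)\), \(n\ge 1\), and X is the minimum of N i.i.d. Weibull lifetimes, then

$$ \Pr(X > x) = \mathrm{E}\!\left[\left(e^{-\beta x^{\alpha}}\right)^{N}\right] = \frac{C\!\left(\theta\, e^{-\beta x^{\alpha}}\right)}{C(\theta)}, $$

and particular choices of C(\(\theta\)) (e.g. geometric, Poisson, logarithmic) yield specific members of the class.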
16.
Motivated by the stochastic representation of the univariate zero-inflated Poisson (ZIP) random variable, the authors propose a multivariate ZIP distribution, called the Type I multivariate ZIP distribution, to model correlated multivariate count data with extra zeros. The distributional theory and associated properties are developed. Maximum likelihood estimates for the parameters of interest are obtained by Fisher's scoring algorithm and the expectation-maximization (EM) algorithm, respectively. Asymptotic and bootstrap confidence intervals for the parameters are provided. Likelihood ratio and score tests are derived and compared via simulation studies. Bayesian methods are also presented for the case in which prior information on the parameters is available. Two real data sets are used to illustrate the proposed methods. Under both AIC and BIC, our analysis of the two data sets supports the Type I multivariate zero-inflated Poisson model as a feasible and much less complex alternative to the existing multivariate ZIP models proposed by Li et al. (Technometrics, 41, 29–38, 1999).
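For the univariate building block, the zero-inflation EM has a particularly simple form. The sketch below is our illustration of that univariate case, not the multivariate Type I algorithm from the paper; it estimates the zero-inflation probability phi and the Poisson mean lam.

```python
import numpy as np

def zip_em(y, phi=0.5, lam=None, n_iter=500):
    """EM for the univariate ZIP model
    P(Y=0) = phi + (1-phi)*exp(-lam);  P(Y=k) = (1-phi)*exp(-lam)*lam**k/k!, k>=1.
    tau_i = posterior probability that observation i is a structural zero."""
    y = np.asarray(y, float)
    lam = y[y > 0].mean() if lam is None else lam
    for _ in range(n_iter):
        tau = np.where(y == 0, phi / (phi + (1.0 - phi) * np.exp(-lam)), 0.0)
        phi = tau.mean()                                     # M-step for phi
        lam = ((1.0 - tau) * y).sum() / (1.0 - tau).sum()    # M-step for lam
    return phi, lam
```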
17.
Chun-Zheng Cao Jin-Guan Lin Xiao-Xin Zhu 《Computational statistics & data analysis》2012,56(2):438-448
It is common in epidemiology and other fields that the data being analyzed are collected with error-prone observations and that the variances of the measurement errors change across observations. Heteroscedastic measurement error (HME) models have been developed for such data. This paper extends the structural HME model to situations in which the observations jointly follow a scale mixture of normal (SMN) distributions. We develop the EM algorithm to compute the maximum likelihood estimates for the model with and without equation error, respectively, and derive closed-form expressions for the asymptotic variances. We also conduct simulations to verify the effectiveness of the EM estimates and confirm their robust behavior under heavy-tailed SMN distributions. A practical application is reported for data from the WHO MONICA Project on cardiovascular disease.
18.
Several univariate proportional reversed hazard models have been proposed in the literature. Recently, Kundu and Gupta (2010) proposed a class of bivariate models with proportional reversed hazard marginals, and it was observed that these bivariate proportional reversed hazard models have a singular component. In this paper we introduce multivariate proportional reversed hazard models in the same manner. Moreover, it is observed that the proposed multivariate proportional reversed hazard model can be obtained from the Marshall–Olkin copula. The multivariate proportional reversed hazard models also have a singular component, and their marginals have proportional reversed hazard distributions. The multivariate ageing and dependence properties are discussed in detail. We further provide some dependence measures specifically for the bivariate case. The maximum likelihood estimators of the unknown parameters cannot be expressed in explicit form, and we propose to use the EM algorithm to compute them. A trivariate data set is analysed for illustrative purposes.
19.
J. Rodríguez-Avi A. Conde-Sánchez A.J. Sáez-Castillo M.J. Olmo-Jiménez 《Computational statistics & data analysis》2007,51(12):6138-6150
A tetraparametric univariate distribution generated by the Gaussian hypergeometric function, which includes the Waring and the generalized Waring distributions as particular cases, is presented. This distribution is expressed as a generalized beta type I mixture of a negative binomial distribution, in such a way that the variance of the tetraparametric model can be split into three components: randomness, proneness and liability. These results extend known analogous properties of the generalized Waring distribution. Two applications, in the fields of sport and economics, are included to illustrate the utility of the new distribution compared with the generalized Waring distribution.
20.
Artur J. Lemonte Klaus L.P. Vasconcellos 《Computational statistics & data analysis》2007,51(9):4656-4681
We develop nearly unbiased estimators for the two-parameter Birnbaum-Saunders distribution [Birnbaum, Z.W., Saunders, S.C., 1969a. A new family of life distributions. J. Appl. Probab. 6, 319-327], which is commonly used in reliability studies. We derive modified maximum likelihood estimators that are bias-free to second order. We also consider bootstrap-based bias correction. The numerical evidence we present favors three bias-adjusted estimators. Different interval estimation strategies are evaluated. Additionally, we derive a Bartlett correction that improves the finite-sample performance of the likelihood ratio test.
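As a reference point for the uncorrected estimators being improved upon, the sketch below computes plain MLEs of the Birnbaum-Saunders parameters by direct numerical maximization of the log-likelihood. This is our illustration of the baseline step only; the paper's contribution is the second-order bias correction and the Bartlett correction, neither of which is shown here.

```python
import numpy as np
from scipy.optimize import minimize

def bs_negloglik(params, t):
    """Negative log-likelihood of Birnbaum-Saunders(alpha, beta):
    f(t) = [(beta/t)**0.5 + (beta/t)**1.5] / (2*alpha*beta*sqrt(2*pi))
           * exp(-(t/beta + beta/t - 2) / (2*alpha**2))."""
    alpha, beta = params
    if alpha <= 0.0 or beta <= 0.0:
        return np.inf
    r = beta / t
    dens = (np.sqrt(r) + r ** 1.5) / (2.0 * alpha * beta * np.sqrt(2.0 * np.pi))
    return -np.sum(np.log(dens) - (t / beta + beta / t - 2.0) / (2.0 * alpha ** 2))

def bs_mle(t):
    """Uncorrected MLEs via Nelder-Mead, started at alpha=1, beta=median(t)."""
    t = np.asarray(t, float)
    res = minimize(bs_negloglik, x0=np.array([1.0, np.median(t)]),
                   args=(t,), method="Nelder-Mead")
    return res.x  # (alpha_hat, beta_hat)
```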