Similar Articles
1.
For estimating the common mean of two normal populations with unknown and possibly unequal variances, the well-known Graybill-Deal estimator (GDE) has been a motivating factor for research over the last five decades. Surprisingly, the literature does not have much to show when it comes to the maximum likelihood estimator (MLE) and its properties compared with those of the GDE. The purpose of this note is to shed some light on the structure of the MLE and to compare it with the GDE. While studying the asymptotic variance of the GDE, we provide an upgraded set of bounds for its variance. A massive simulation study has been carried out, with a very high level of accuracy, to compare the variances of the two estimators; the results are quite interesting.
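
For reference, the GDE weights each sample mean by the reciprocal of the estimated variance of that mean. A minimal sketch in standard notation (not taken from the paper):

```python
import numpy as np

def graybill_deal(x1, x2):
    """Graybill-Deal common-mean estimate for two normal samples
    with unknown and possibly unequal variances."""
    x1, x2 = np.asarray(x1, float), np.asarray(x2, float)
    n1, n2 = len(x1), len(x2)
    s1, s2 = x1.var(ddof=1), x2.var(ddof=1)
    w1, w2 = n1 / s1, n2 / s2  # reciprocal estimated variances of the sample means
    return (w1 * x1.mean() + w2 * x2.mean()) / (w1 + w2)
```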

2.
Simple nonparametric estimators of the conditional distribution of a response variable given a continuous covariate are often useful in survival analysis. Since several nonparametric estimation options are available, a comparison of their performance may be of value in determining which approach to use in a given application. In this note, we compare various nonparametric estimators of the conditional survival function when the response is subject to interval- and right-censoring. The estimators considered are a generalization of Turnbull's estimator proposed by Dehghan and Duchesne (2011) and two nonparametric estimators for complete or right-censored data used in conjunction with imputation methods, namely the Nadaraya-Watson and generalized Kaplan-Meier estimators. We study the finite-sample integrated mean squared error properties of all these estimators by simulation and compare them to a semi-parametric estimator. We propose a rule of thumb based on simple sample summary statistics to choose the most appropriate among these estimators in practice.
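
As a point of reference for one of the building blocks, a Nadaraya-Watson estimate of the conditional survival function for complete (uncensored) data can be sketched as follows; the Gaussian kernel and bandwidth h are illustrative choices, and the paper pairs such estimators with imputation to handle censoring:

```python
import numpy as np

def nw_conditional_survival(t, x0, times, covariates, h):
    """Nadaraya-Watson estimate of S(t | x0) = P(T > t | X = x0)
    from complete data, using a Gaussian kernel with bandwidth h."""
    u = (np.asarray(covariates, float) - x0) / h
    w = np.exp(-0.5 * u ** 2)      # kernel weights centered at x0
    w /= w.sum()
    return float(np.sum(w * (np.asarray(times) > t)))
```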

3.
We consider the problem of estimating the parameters of the Marshall-Olkin bivariate Weibull distribution in the presence of random censoring. Since the maximum likelihood estimators of the parameters cannot be expressed in closed form, we suggest an EM algorithm to compute them. Extensive simulations show that the estimators perform efficiently under random censoring.
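
For context, draws from the Marshall-Olkin bivariate Weibull are easy to generate through its usual shock-model representation, which a simulation study like this would rely on. The rate-based parametrization below is the standard one and is assumed rather than quoted from the paper:

```python
import numpy as np

def simulate_mobw(n, alpha, lam0, lam1, lam2, seed=None):
    """Draw (T1, T2) from the Marshall-Olkin bivariate Weibull via the
    shock model T1 = min(U0, U1), T2 = min(U0, U2), where Uk has
    survival function exp(-lam_k * t**alpha)."""
    rng = np.random.default_rng(seed)
    def weibull(lam):  # inverse-transform sampling
        return (-np.log(rng.uniform(size=n)) / lam) ** (1.0 / alpha)
    u0, u1, u2 = weibull(lam0), weibull(lam1), weibull(lam2)
    return np.minimum(u0, u1), np.minimum(u0, u2)
```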

4.
In order to improve a parallel program's performance, it is critical to evaluate how evenly the work contained in the program is distributed over all processors dedicated to the computation. Traditional work distribution analysis is commonly performed at the machine level. The disadvantage of this method is that it cannot identify whether the processors are performing useful or redundant (replicated) work. The paper describes a novel method of statically estimating the useful work distribution of distributed-memory parallel programs at the program level, which carefully distinguishes between useful and redundant work. The amount of work contained in a parallel program, which correlates with the number of loop iterations to be executed by each processor, is estimated by accurately modeling loop iteration spaces, array access patterns, and data distributions. A cost function defines the useful work distribution of loops, procedures, and the entire program. Lower and upper bounds of the described parameter are presented. The computational complexity of the cost function is independent of the program's problem size, statement execution counts, and loop iteration counts. As a consequence, estimating the work distribution with the described method is considerably faster than simulating, or actually compiling and executing, the program. Automatic estimation of the useful work distribution is fully implemented as part of P3T, a static parameter-based performance prediction tool under the Vienna Fortran Compilation System (VFCS). The Lawrence Livermore Loops are used as a test case to verify the approach.
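
As a toy illustration of the underlying quantity, the iteration counts owned by each processor under a simple block distribution can be tallied as follows; this naive count is only a stand-in for the paper's cost function:

```python
def block_iterations(n, p):
    """Loop iterations 1..n owned by each of p processors under a block
    distribution; the min/max ratio measures how even the work is."""
    base, extra = divmod(n, p)
    counts = [base + (1 if i < extra else 0) for i in range(p)]
    return counts, min(counts) / max(counts)  # 1.0 means perfectly even

print(block_iterations(1000, 7))  # six processors get 143 iterations, one gets 142
```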

5.
The Cox model with frailties has been popular for regression analysis of clustered event time data under right censoring. However, due to the lack of reliable computational algorithms, the frailty Cox model has rarely been applied to clustered current status data, where the clustered event times are subject to a special type of interval censoring such that we only observe, for each event time, whether it exceeds an examination (censoring) time or not. Motivated by a cataract dataset from a cross-sectional study, in which bivariate current status data were observed for the occurrence of cataracts in the right and left eyes of each study subject, we develop a very efficient and stable computational algorithm for nonparametric maximum likelihood estimation of gamma-frailty Cox models with clustered current status data. The proposed algorithm is based on a set of self-consistency equations and the contraction principle. A convenient profile-likelihood approach is proposed for variance estimation. Simulation and real data analysis demonstrate the good performance of our proposal.
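
For orientation, the marginal likelihood contribution of one cluster under a gamma frailty has a closed form through the frailty's Laplace transform. The sketch below evaluates that contribution for current status data; the variable names and the mean-one Gamma(theta, theta) parametrization are assumptions, and the paper's self-consistency algorithm is not reproduced here:

```python
import numpy as np
from itertools import product

def laplace_gamma(s, theta):
    """Laplace transform E[exp(-s Z)] of a Gamma(theta, theta) frailty
    (mean 1, variance 1/theta)."""
    return (1.0 + s / theta) ** (-theta)

def cluster_likelihood(mu, delta, theta):
    """Marginal likelihood of one cluster's current status observations.
    mu[j]    = Lambda(C_j) * exp(x_j' beta), the conditional cumulative hazard,
    delta[j] = 1 if the event had already occurred at examination time C_j.
    Expands prod_j (1 - exp(-Z mu_j))^delta_j * exp(-Z mu_j)^(1 - delta_j)
    and integrates the frailty Z out with the gamma Laplace transform."""
    mu, delta = np.asarray(mu, float), np.asarray(delta, int)
    idx = np.where(delta == 1)[0]
    base = mu[delta == 0].sum()  # event-free factors contribute exp(-Z mu_j)
    total = 0.0
    for bits in product([0, 1], repeat=len(idx)):  # inclusion-exclusion
        s = base + sum(mu[i] for i, b in zip(idx, bits) if b)
        total += (-1) ** sum(bits) * laplace_gamma(s, theta)
    return total
```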

6.
We consider the problem of the computer generation of a random variable X with a given characteristic function when the corresponding density and distribution function are not explicitly known or have complicated explicit formulas. Under mild conditions on the characteristic function, we propose and analyze a rejection/squeeze algorithm which requires the evaluation of one integral at a crucial stage.
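
A sketch in the spirit of such algorithms, using Devroye-style bounds f(x) <= M and f(x) <= A/x^2 (with M and A computable from integrals of the characteristic function and its second derivative); the squeeze step is omitted, and the single inversion integral is evaluated numerically per candidate:

```python
import numpy as np
from scipy.integrate import quad

def density_from_cf(phi, x, cutoff=50.0):
    """f(x) by numerical Fourier inversion of the characteristic
    function phi: the 'one integral' evaluated per candidate point."""
    integrand = lambda t: np.real(np.exp(-1j * t * x) * phi(t))
    val, _ = quad(integrand, -cutoff, cutoff, limit=200)
    return max(val / (2.0 * np.pi), 0.0)

def sample_from_cf(phi, M, A, seed=None):
    """Rejection sampling under the envelope g(x) = min(M, A / x**2),
    assuming M >= sup f and f(x) <= A / x**2 for all x."""
    rng = np.random.default_rng(seed)
    a = np.sqrt(A / M)                   # breakpoint where M == A / x**2
    while True:
        if rng.uniform() < 0.5:          # flat part and tails have equal area
            x = rng.uniform(-a, a)
        else:
            x = (a / rng.uniform()) * rng.choice([-1.0, 1.0])
        if rng.uniform() * min(M, A / x ** 2) <= density_from_cf(phi, x):
            return x
```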

7.
The maximum likelihood estimate of a vector, given noisy observations of linear combinations of the vector's components, is a function of the covariance matrices of the noise. Often the matrices are not exactly known, and consequently the maximum likelihood estimate will be in error. An algorithm is developed for computing the covariances of the errors in the maximum likelihood estimate due to uncertainties in the noise covariance matrices. It is assumed that the uncertainties are small and can be described statistically.
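
For reference, the deterministic core of this calculation is the sandwich form of the error covariance when the weighting matrix is wrong; the sketch below shows that step only, not the statistical propagation of covariance uncertainty developed in the paper:

```python
import numpy as np

def ml_error_covariance(H, R_assumed, R_true):
    """Error covariance of the weighted least squares (ML under Gaussian
    noise) estimate computed with an assumed noise covariance R_assumed,
    when the noise actually has covariance R_true (sandwich form)."""
    Ra_inv = np.linalg.inv(R_assumed)
    G = np.linalg.inv(H.T @ Ra_inv @ H) @ H.T @ Ra_inv   # estimator gain
    return G @ R_true @ G.T   # reduces to inv(H' R^-1 H) when R_assumed == R_true
```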

8.
We present a second-order statistical analysis of the 2D Discrete Wavelet Transform (2D DWT) coefficients. The input images are considered as wide-sense bivariate random processes. We derive closed-form expressions for the wavelet coefficients' correlation functions in all possible scenarios: inter-scale and inter-band, inter-scale and intra-band, intra-scale and inter-band, and intra-scale and intra-band. The particularization of the input process to the white Gaussian noise (WGN) case is considered as well. Special attention is paid to the asymptotic analysis obtained by considering an infinite number of decomposition levels. Simulation results are also reported, confirming the theoretical results. The derived equations, and especially the inter-scale and intra-band dependency of the 2D DWT coefficients, are useful for the design of various signal processing systems, for example image denoising algorithms. We show how to apply our theoretical results to the design of state-of-the-art denoising systems that exploit the 2D DWT.
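
A quick empirical counterpart to one of these scenarios, assuming the PyWavelets package and an arbitrary db4 wavelet, estimates the inter-scale, intra-band parent-child correlation for a WGN input:

```python
import numpy as np
import pywt

# Two-level 2D DWT of a white Gaussian noise image; estimate the
# inter-scale, intra-band correlation between the coarser horizontal
# detail subband (the "parent") and the finer one (the "child").
rng = np.random.default_rng(0)
img = rng.standard_normal((256, 256))
cA, (cH2, cV2, cD2), (cH1, cV1, cD1) = pywt.wavedec2(img, 'db4', level=2)

parent = np.kron(cH2, np.ones((2, 2)))   # replicate each parent over its children
h = min(parent.shape[0], cH1.shape[0])
w = min(parent.shape[1], cH1.shape[1])   # crop boundary-size mismatch
r = np.corrcoef(parent[:h, :w].ravel(), cH1[:h, :w].ravel())[0, 1]
print(f"parent-child correlation, horizontal band: {r:.3f}")
```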

9.
This paper discusses a new algorithm to locate the global maximum of a function defined on a multidimensional rectangular domain, where the number of dimensions may be 5, 10, or more. There are two important elements in this algorithm. One is the transformation of the objective function in such a way that its global maximum corresponds to infinity while the secondary maxima are reduced to zero; in practice there is some departure from this ideal transformation because of possible overflow on the computer. This portion of the algorithm precedes the interactive (or conversational) use of a graphic display system, which constitutes the other element of the algorithm. A multidimensional point is represented as a curve on the display screen. By projecting numerous points in the multidimensional space onto similarly numerous curves on the screen of the graphic display device, the human eye can perform overall pattern recognition much more efficiently than a computer. This fact is exploited to reduce the problem to that of a set of unimodal peaks. Once the supporting domain for each of these peaks has been separated with visual aid, one may leave the computer to handle the rest of the problem by itself. A number of numerical experiments are presented and discussed to provide evidence for the feasibility of the proposed algorithm.
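
One classical way to draw a multidimensional point as a curve, in the spirit of (though not necessarily identical to) the representation used here, is an Andrews-type plot:

```python
import numpy as np

def andrews_curve(x, t):
    """Map a point x in R^n to the curve value
    x1/sqrt(2) + x2*sin(t) + x3*cos(t) + x4*sin(2t) + x5*cos(2t) + ..."""
    x = np.asarray(x, float)
    curve = np.full_like(t, x[0] / np.sqrt(2.0))
    for k, xk in enumerate(x[1:], start=1):
        harmonic = (k + 1) // 2
        curve += xk * (np.sin(harmonic * t) if k % 2 else np.cos(harmonic * t))
    return curve

t = np.linspace(-np.pi, np.pi, 200)
curve = andrews_curve([1.0, 2.0, 0.5, -1.0], t)  # plot t against curve for each point
```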

10.
This work presents a comparative study of the performance of the cumulative sum (CuSum) and exponentially weighted moving average (EWMA) control charts. The objective of this research is to verify which of the CuSum and EWMA control charts provides the better control region for detecting small changes in the process average. Starting from the data of a production process, several series were simulated. CuSum and EWMA control charts were used to determine the average run length (ARL) to detect an out-of-control condition, and the ARL found by each chart was then compared. It was observed that the CuSum control chart practically did not signal out-of-control points for variation levels within ±1.0 standard deviation; for these variation levels the EWMA control chart was more efficient than CuSum. Among the EWMA chart parameters, the smoothing constants λ=0.10 and 0.05, with respective control limits L=2.814 and 2.625, detected the largest number of shifted positions.
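
For concreteness, the ARL of an EWMA chart with such parameters can be estimated by Monte Carlo simulation; the sketch below uses the standard asymptotic control limits and a unit-variance normal process, which are assumptions rather than details from the study:

```python
import numpy as np

def arl_ewma(shift, lam=0.10, L=2.814, reps=1000, seed=None):
    """Monte Carlo average run length (ARL) of an EWMA chart monitoring a
    standard normal process whose mean is shifted by `shift` sigma.
    Control limits are the asymptotic +/- L * sqrt(lam / (2 - lam))."""
    rng = np.random.default_rng(seed)
    limit = L * np.sqrt(lam / (2.0 - lam))
    runs = []
    for _ in range(reps):
        z, n = 0.0, 0
        while abs(z) <= limit:
            n += 1
            z = lam * rng.normal(shift, 1.0) + (1.0 - lam) * z
        runs.append(n)
    return float(np.mean(runs))

print(arl_ewma(0.0, seed=1))   # in-control ARL (large by design)
print(arl_ewma(1.0, seed=1))   # much shorter under a 1-sigma shift
```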

11.
We present a finite difference method to solve a system of two partial integro-differential equations which arises from pricing an option under a Jump-Telegraph Diffusion Model for the underlying asset, using the risk-neutral valuation formula under an equivalent martingale measure. The system is fully discretized using an implicit-explicit two-time-level scheme and quadrature formulas. The resulting two tridiagonal linear algebraic systems are solved recursively using the Thomas algorithm. Some numerical results are presented and the numerical order of convergence of the method is estimated. Finally, the robustness of the method is validated against an exact solution obtained for a perturbed problem.
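
For reference, the Thomas algorithm solves a tridiagonal system in O(n) operations; a minimal sketch:

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system: a is the sub-diagonal (length n-1),
    b the diagonal (length n), c the super-diagonal (length n-1)."""
    n = len(b)
    cp, dp = np.empty(n - 1), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        m = b[i] - a[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = c[i] / m
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# e.g. the discrete 1-D Laplacian:
# thomas(-np.ones(4), 2 * np.ones(5), -np.ones(4), np.ones(5))
```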
