Similar Literature
A total of 20 similar documents were retrieved (search time: 78 ms).
1.
The performance of model-based bootstrap methods for constructing pointwise confidence intervals around the survival function with interval-censored data is investigated. It is shown that bootstrapping from the nonparametric maximum likelihood estimator of the survival function is inconsistent for the current status model. A model-based smoothed bootstrap procedure is proposed and proved to be consistent. In fact, a general framework for proving the consistency of any model-based bootstrap scheme in the current status model is established. In addition, simulation studies are conducted to illustrate the (in)consistency of different bootstrap methods in mixed-case interval censoring. The conclusions for the interval censoring model should extend more generally to estimators in regression models that exhibit non-standard rates of convergence.
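A minimal sketch of the model-based smoothed bootstrap idea for current status data, assuming a Gaussian kernel and an illustrative bandwidth h (the paper's own smoothing scheme and consistency conditions are not reproduced):

# Sketch only: resample current status indicators delta_i = 1{T_i <= c_i}
# from a kernel-smoothed NPMLE of the distribution function F.
import numpy as np
from scipy.stats import norm
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)

def npmle_cdf(c, delta):
    # NPMLE of F at the sorted examination times = isotonic regression of the indicators
    c, delta = np.asarray(c, float), np.asarray(delta, float)
    order = np.argsort(c)
    f_hat = IsotonicRegression(y_min=0.0, y_max=1.0).fit_transform(c[order], delta[order])
    return c[order], f_hat

def smoothed_cdf(points, c_sorted, f_hat, h):
    # convolve the step-function NPMLE (represented by its jumps) with a Gaussian kernel
    jumps = np.diff(np.concatenate(([0.0], f_hat)))
    return np.clip(norm.cdf((points[:, None] - c_sorted[None, :]) / h) @ jumps, 0.0, 1.0)

def smoothed_bootstrap(c, delta, h, B=500):
    c = np.asarray(c, float)
    c_sorted, f_hat = npmle_cdf(c, delta)
    p = smoothed_cdf(c, c_sorted, f_hat, h)
    boot = []
    for _ in range(B):
        delta_star = rng.binomial(1, p)            # keep the c_i fixed, redraw the indicators
        boot.append(npmle_cdf(c, delta_star)[1])   # refit the NPMLE on each bootstrap sample
    return np.array(boot)                          # B x n matrix of bootstrap NPMLEs at sorted c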

2.
Quantile regression offers a semiparametric approach to modeling data with possible heterogeneity. It is particularly attractive for censored responses, where the conditional mean functions are unidentifiable without parametric assumptions on the distributions. A new algorithm is proposed to estimate the regression quantile process when the response variable is subject to double censoring. The algorithm distributes the probability mass of each censored point to its left or right appropriately, and iterates towards self-consistent solutions. Numerical results on simulated data and an unemployment duration study are given to demonstrate the merits of the proposed method.
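A hedged sketch of the redistribution-of-mass idea behind such self-consistent algorithms, written for a single quantile level \tau (the paper's exact weighting for doubly censored data is not reproduced here):

\[
\hat\beta(\tau) = \arg\min_{\beta} \sum_{i \in \mathcal{U}} \rho_\tau\big(y_i - x_i^{\top}\beta\big)
+ \sum_{i \in \mathcal{C}} \Big[ w_i\,\rho_\tau\big(c_i - x_i^{\top}\beta\big) + (1 - w_i)\,\rho_\tau\big(y_i^{*} - x_i^{\top}\beta\big) \Big],
\]

where \rho_\tau(u) = u\{\tau - 1(u<0)\} is the check function, \mathcal{U} and \mathcal{C} index the uncensored and censored observations, y_i^{*} is a pseudo-value placed far to the left or right of the data depending on the censoring direction, and the weights w_i are recomputed from the current fit and iterated until the solution is self-consistent.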

3.
A new technique based on Bayesian quantile regression that models the dependence of a quantile of one variable on the values of another using a natural cubic spline is presented. Inference is based on the posterior density of the spline and an associated smoothing parameter and is performed by means of a Markov chain Monte Carlo algorithm. Examples of the application of the new technique to two real environmental data sets and to simulated data for which polynomial modelling is inappropriate are given. An aid for making a good choice of proposal density in the Metropolis-Hastings algorithm is discussed. The new nonparametric methodology provides more flexible modelling than the currently used Bayesian parametric quantile regression approach.
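One common way to formalize such Bayesian quantile regression is through the asymmetric-Laplace working likelihood; the following is a sketch under that assumption, not necessarily the paper's exact construction:

\[
\pi\big(s, \lambda \mid \text{data}\big) \propto \prod_{i=1}^{n} \tau(1-\tau)\exp\!\big\{-\rho_\tau\big(y_i - s(x_i)\big)\big\} \times \pi(s \mid \lambda)\,\pi(\lambda),
\]

where s(\cdot) is the natural cubic spline, \rho_\tau is the check function, \pi(s \mid \lambda) is a smoothness prior governed by the smoothing parameter \lambda, and the posterior is explored with a Metropolis-Hastings sampler whose proposal density must be chosen with some care.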

4.
A procedure for efficient estimation of the trimmed mean of a random variable conditional on a set of covariates is proposed. For concreteness, the focus is on a financial application where the trimmed mean of interest corresponds to the conditional expected shortfall, which is known to be a coherent risk measure. The proposed class of estimators is based on representing the estimator as an integral of the conditional quantile function. Relative to the simple analog estimator that weights all conditional quantiles equally, asymptotic efficiency gains may be attained by giving different weights to the different conditional quantiles while penalizing excessive departures from uniform weighting. The approach presented here allows for either parametric or nonparametric modeling of the conditional quantiles and the weights, but is essentially nonparametric in spirit. The asymptotic properties of the proposed class of estimators are established. Their finite sample properties are illustrated through a set of Monte Carlo experiments and an empirical application.
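A minimal illustration of the representation underlying this class of estimators, using equally weighted conditional quantiles estimated by ordinary quantile regression in statsmodels (the efficient, data-driven weighting proposed in the paper is not reproduced):

# Sketch: conditional expected shortfall at level alpha as an average of
# conditional quantiles, ES(x) ~ (1/alpha) * integral_0^alpha Q(u | x) du.
import numpy as np
import statsmodels.api as sm

def conditional_es(y, X, x0, alpha=0.05, n_grid=20):
    Xc = sm.add_constant(X)                               # design matrix with intercept
    x0c = np.concatenate(([1.0], np.atleast_1d(x0)))      # evaluation point with intercept
    mod = sm.QuantReg(y, Xc)
    taus = (np.arange(n_grid) + 0.5) * alpha / n_grid     # quantile levels in (0, alpha)
    q_hat = [x0c @ np.asarray(mod.fit(q=t).params) for t in taus]   # conditional quantiles at x0
    return float(np.mean(q_hat))                          # uniform weights over the grid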

5.
Geometric quantiles are investigated using data collected from a complex survey. Geometric quantiles are an extension of univariate quantiles in a multivariate set-up that uses the geometry of multivariate data clouds. A very important application of geometric quantiles is the detection of outliers in multivariate data by means of quantile contours. A design-based estimator of geometric quantiles is constructed and used to compute quantile contours in order to detect outliers in both multivariate data and survey sampling set-ups. An algorithm for computing geometric quantile estimates is also developed. Under broad assumptions, the asymptotic variance of the quantile estimator is derived and a consistent variance estimator is proposed. Theoretical results are illustrated with simulated and real data.
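A small sketch of how a single geometric quantile can be computed for a direction vector u with norm below 1, using a plain fixed-point (modified Weiszfeld) iteration; the survey weights and design-based variance estimation treated in the paper are ignored here:

# Sketch: the geometric quantile Q(u) solves  sum_i (x_i - q)/||x_i - q|| = -n*u.
import numpy as np

def geometric_quantile(X, u, n_iter=200, eps=1e-8, tol=1e-10):
    X = np.asarray(X, dtype=float)            # n x d data matrix
    u = np.asarray(u, dtype=float)            # direction vector, norm strictly below 1
    q = X.mean(axis=0)                        # start from the sample mean
    for _ in range(n_iter):
        d = np.linalg.norm(X - q, axis=1)
        d = np.maximum(d, eps)                # guard against division by zero
        w = 1.0 / d
        q_new = (X.T @ w + len(X) * u) / w.sum()
        if np.linalg.norm(q_new - q) < tol:
            break
        q = q_new
    return q                                  # u = 0 gives the spatial median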

6.
Observing recurrent event processes at discrete, possibly random times produces panel count data. Modeling panel count data is challenging because the event process may be associated with the observation pattern and censoring time. Various methods have been proposed to fit flexible semiparametric regression models, but no software is available to practitioners. We develop an R package spef that fits semiparametric regression models for panel count data. Existing methods in the literature are implemented, as well as our recently developed estimating-equations approach. Some of the implemented methods allow for informative observation and censoring schemes. The package usage is illustrated with a well-known bladder tumor data set.

7.
The Cox model with frailties has been popular for regression analysis of clustered event time data under right censoring. However, due to the lack of reliable computational algorithms, the frailty Cox model has rarely been applied to clustered current status data, where the clustered event times are subject to a special type of interval censoring such that we only observe, for each event time, whether it exceeds an examination (censoring) time or not. Motivated by the cataract dataset from a cross-sectional study, where bivariate current status data were observed for the occurrence of cataracts in the right and left eyes of each study subject, we develop a very efficient and stable computational algorithm for nonparametric maximum likelihood estimation of gamma-frailty Cox models with clustered current status data. The proposed algorithm is based on a set of self-consistency equations and the contraction principle. A convenient profile-likelihood approach is proposed for variance estimation. Simulation studies and a real data analysis demonstrate the good performance of our proposal.

8.
The unknown error density of a nonparametric regression model is approximated by a mixture of Gaussian densities with means equal to the individual error realizations and a common constant variance. Such a mixture density has the form of a kernel density estimator of the error realizations. An approximate likelihood and posterior for the bandwidth parameters in the kernel-form error density and the Nadaraya–Watson regression estimator are derived, and a sampling algorithm is developed. A simulation study shows that when the true error density is non-Gaussian, the kernel-form error density is often favored over its parametric counterparts, including the correct error density. The proposed approach is demonstrated through a nonparametric regression model of the Australian All Ordinaries daily return on the overnight FTSE and S&P 500 returns. With the estimated bandwidths, the one-day-ahead posterior predictive density of the All Ordinaries return is derived, and a distribution-free value-at-risk is obtained. The proposed algorithm is also applied to a nonparametric regression model involved in state-price density estimation based on S&P 500 options data.
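In symbols, the kernel-form error density described above is (a sketch; in practice a leave-one-out version is typically used inside the likelihood to avoid degeneracy as the bandwidth shrinks):

\[
\hat f(\varepsilon; h) = \frac{1}{n}\sum_{j=1}^{n}\frac{1}{h}\,\phi\!\left(\frac{\varepsilon - \varepsilon_j}{h}\right),
\]

where the \varepsilon_j are the error realizations (residuals from the Nadaraya–Watson fit in practice), \phi is the standard normal density, and the bandwidth h is treated as a parameter with its own prior and posterior.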

9.
A class of two-step robust regression estimators that achieve a high relative efficiency for data from light-tailed, heavy-tailed, and contaminated distributions irrespective of the sample size is proposed and studied. In particular, the least weighted squares (LWS) estimator is combined with data-adaptive weights, which are determined from the empirical distribution or quantile functions of regression residuals obtained from an initial robust fit. Just like many existing two-step robust methods, the LWS estimator with the proposed weights preserves robust properties of the initial robust estimate. However, contrary to the existing methods and despite the data-dependent weights, the first-order asymptotic behavior of LWS is fully independent of the initial estimate under mild conditions. Moreover, the proposed estimation method is asymptotically efficient if errors are normally distributed. A simulation study documents these theoretical properties in finite samples; in particular, the relative efficiency of LWS with the proposed weighting schemes can reach 85%-100% in samples of several tens of observations under various distributional models.
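A minimal sketch of one least weighted squares step with rank-based weights; the illustrative weight function below is a placeholder, not the data-adaptive weights constructed in the paper:

# Sketch: LWS weights observations according to the rank of their absolute residuals;
# here the ranks come from an initial (robust) fit and a simple decreasing weight function.
import numpy as np

def lws_step(X, y, beta_init, weight_fn=lambda r: np.clip(1.5 - 2.0 * r, 0.0, 1.0)):
    X, y = np.asarray(X, float), np.asarray(y, float)
    resid = y - X @ np.asarray(beta_init, float)
    ranks = np.argsort(np.argsort(np.abs(resid)))      # 0 = smallest absolute residual
    w = weight_fn((ranks + 0.5) / len(y))              # weight by relative rank in (0, 1)
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)   # weighted least squares
    return beta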

10.
Quantile regression offers great flexibility in assessing covariate effects on the response. In this article, based on the weights proposed by He and Yang (2003), we develop a new quantile regression approach for left truncated data. Our method leads to a simple algorithm that can be conveniently implemented with R software. It is shown that the proposed estimator is strongly consistent and asymptotically normal under appropriate conditions. We evaluate the finite sample performance of the proposed estimators through extensive simulation studies.

11.
We consider quantile regression models and investigate the induced smoothing method for obtaining the covariance matrix of the regression parameter estimates. We show that the difference between the smoothed and unsmoothed estimating functions in quantile regression is negligible. Detailed yet simple computational algorithms for calculating the asymptotic covariance matrix are provided. Intensive simulation studies indicate that the proposed method performs very well. We also illustrate the algorithm by analyzing the rainfall-runoff data from Murray Upland, Australia.
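The induced smoothing idea replaces the discontinuous quantile-regression estimating function by a smooth surrogate; a sketch in standard notation (details such as the choice of \Gamma follow the induced-smoothing literature rather than this particular paper):

\[
U(\beta) = \sum_{i=1}^{n} x_i\big[\tau - 1\{y_i - x_i^{\top}\beta < 0\}\big]
\;\longrightarrow\;
\tilde U(\beta) = \sum_{i=1}^{n} x_i\Big[\tau - \Phi\!\Big(\frac{x_i^{\top}\beta - y_i}{r_i}\Big)\Big],
\qquad r_i^{2} = x_i^{\top}\Gamma x_i,
\]

where \Phi is the standard normal distribution function and \Gamma is a working covariance matrix of order n^{-1} that is updated iteratively; the asymptotic covariance of the estimator then follows from a sandwich formula based on the derivative of \tilde U.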

12.
Mixed model-based estimation of additive or geoadditive regression models has become popular throughout recent years. It provides a unified and modular framework that facilitates joint estimation of nonparametric covariate effects and the corresponding smoothing parameters. Therefore, extensions of mixed model-based inference to a Cox-type regression model for the hazard rate are considered, allowing for a combination of general censoring schemes for the survival times and a flexible, geoadditive predictor. In particular, the proposed methodology allows for arbitrary combinations of right, left, and interval censoring as well as left truncation. The geoadditive predictor comprises time-varying effects, nonparametric effects of continuous covariates, spatial effects, and potentially a number of extensions such as cluster-specific frailties or interaction surfaces. In addition, all covariates are allowed to be piecewise-constant and time-varying. Nonlinear and time-varying effects as well as the baseline hazard rate are modeled by penalized splines. Spatial effects can be included based on either Markov random fields or stationary Gaussian random fields. Estimation is based on a reparametrization of the model as a variance component mixed model. The variance parameters, corresponding to inverse smoothing parameters, can then be determined using an approximate marginal likelihood approach. An analysis of childhood mortality in Nigeria serves as an application, where the interval censoring framework additionally makes it possible to deal with the problem of heaped survival times. The effect of ignoring the impact of interval-censored observations is investigated in a simulation study.

13.
In interval-censored survival data, the event of interest is not observed exactly but is only known to occur within some time interval. Such data appear very frequently. In this paper, we are concerned only with parametric forms, and so a location-scale regression model based on the exponentiated Weibull distribution is proposed for modeling interval-censored data. We show that the proposed log-exponentiated Weibull regression model for interval-censored data represents a parametric family of models that includes other regression models that are broadly used in lifetime data analysis. For the interval-censored data, we employ a frequentist analysis, a jackknife estimator, a parametric bootstrap and a Bayesian analysis for the parameters of the proposed model. We derive the appropriate matrices for assessing local influences on the parameter estimates under different perturbation schemes and present some ways to assess global influences. Furthermore, for different parameter settings, sample sizes and censoring percentages, various simulations are performed; in addition, the empirical distribution of some modified residuals is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to a modified deviance residual in log-exponentiated Weibull regression models for interval-censored data.
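A small sketch of the interval-censored likelihood for a parametric lifetime distribution, using scipy's exponentiated Weibull (exponweib) and no covariates; the location-scale regression structure, influence diagnostics and modified deviance residuals of the paper are beyond this illustration:

# Sketch: maximum likelihood for interval-censored lifetimes T in (L, R];
# each observation contributes F(R) - F(L) under the exponentiated Weibull.
import numpy as np
from scipy.stats import exponweib
from scipy.optimize import minimize

def neg_loglik(log_params, L, R):
    a, c, scale = np.exp(log_params)                  # positivity via a log parameterization
    F = lambda t: exponweib.cdf(t, a, c, loc=0.0, scale=scale)
    p = np.clip(F(R) - F(L), 1e-12, 1.0)              # right censoring: encode R as np.inf
    return -np.sum(np.log(p))

def fit_interval_censored(L, R, start=(0.0, 0.0, 0.0)):
    L, R = np.asarray(L, float), np.asarray(R, float)
    res = minimize(neg_loglik, x0=np.array(start), args=(L, R), method="Nelder-Mead")
    return np.exp(res.x)                              # (a, c, scale) on the original scale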

14.
In survival analysis applications, the failure rate function may frequently present a unimodal shape. In such cases, the log-normal or log-logistic distributions are used. In this paper, we shall be concerned only with parametric forms, so a location-scale regression model based on the Burr XII distribution is proposed for modeling data with a unimodal failure rate function, as an alternative to the log-logistic regression model. Assuming censored data, we consider a classical analysis, a Bayesian analysis and a jackknife estimator for the parameters of the proposed model. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed to compare the performance of the log-logistic and log-Burr XII regression models. In addition, we use sensitivity analysis to detect influential or outlying observations, and residual analysis is used to check the assumptions in the model. Finally, we analyze a real data set under log-Burr XII regression models.

15.
An obvious Bayesian nonparametric generalization of ridge regression assumes that the coefficients are exchangeable, from a prior distribution of unknown form, which is given a Dirichlet process prior with a normal base measure. The purpose of this paper is to explore the predictive performance of this generalization, which does not seem to have received any detailed attention, despite related applications of the Dirichlet process for shrinkage estimation in multivariate normal means, analysis of randomized block experiments and nonparametric extensions of random effects models in longitudinal data analysis. We consider issues of prior specification and computation, as well as applications in penalized spline smoothing. With a normal base measure in the Dirichlet process, letting the precision parameter approach infinity makes the procedure equivalent to ridge regression, whereas for finite values of the precision parameter the discreteness of the Dirichlet process means that some predictors can be estimated as having the same coefficient. Estimating the precision parameter from the data gives a flexible method for shrinkage estimation of mean parameters that can work well when ridge regression does, but also adapts well to sparse situations. We compare our approach with ridge regression, the lasso and the recently proposed elastic net in simulation studies and also consider applications to penalized spline smoothing.
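The link to ridge regression can be sketched in standard Dirichlet process notation (the paper's computational treatment is not reproduced): ridge regression corresponds to exchangeable coefficients with a known normal prior, and the nonparametric generalization treats that prior as unknown,

\[
\beta_j \mid G \overset{\text{iid}}{\sim} G, \qquad G \sim \mathrm{DP}\big(M,\; \mathrm{N}(0, \tau^{2})\big),
\]

so that as the precision parameter M \to \infty the coefficients are effectively iid \mathrm{N}(0, \tau^{2}) and the procedure reduces to ridge regression, while for finite M the discreteness of G ties groups of coefficients to a common value.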

16.
International Journal of Computer Mathematics, 2012, 89(7): 1073-1082
In the context of finite population survey sampling, we propose a new model-based mean estimator for the case where the function that links the variables is discontinuous. The available estimators of the mean based on nonparametric regression are derived under the assumption that the regression function is continuous. We propose a new approach to adjust for the effect of discontinuity on regression estimation of the mean. The performance of the proposed estimator is analysed through a simulation study, because a theoretical study of its asymptotic properties is not feasible. The new estimator requires more computation than existing estimators, but the simulation experiments indicate that the proposed method has higher efficiency than traditional parametric and nonparametric regression methods.

17.
Small-sample properties of a nonparametric estimator of conditional quantiles based on optimal quantization, which was recently introduced (Charlier et al., 2015), are investigated. More precisely, (i) the practical implementation of this estimator is discussed (in particular, by proposing a method to properly select the corresponding smoothing parameter, namely the number of quantizers) and (ii) its finite-sample performance is compared to that of classical competitors. Monte Carlo studies reveal that the quantization-based estimator competes well in all cases and sometimes dominates its competitors, particularly when the regression function is quite complex. A real data set is also analysed. While the main focus is on the case of a univariate covariate, simulations are also conducted in the bivariate case.
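A rough sketch of the quantization idea, using k-means as a stand-in for optimal quantization of the covariate (the estimator of Charlier et al. uses a proper optimal-quantization scheme together with a data-driven choice of the number of quantizers):

# Sketch: quantize X into N cells, then estimate the conditional tau-quantile of Y at x0
# by the empirical tau-quantile of the responses whose covariates fall in the cell of x0.
import numpy as np
from sklearn.cluster import KMeans

def quantized_cond_quantile(X, y, x0, tau=0.5, n_quantizers=10, seed=0):
    y = np.asarray(y, dtype=float)
    X = np.asarray(X, dtype=float).reshape(len(y), -1)
    km = KMeans(n_clusters=n_quantizers, n_init=10, random_state=seed).fit(X)
    cell = km.predict(np.asarray(x0, dtype=float).reshape(1, -1))[0]
    return float(np.quantile(y[km.labels_ == cell], tau))    # within-cell empirical quantile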

18.
In this paper, we investigate the estimation and testing problems of partially linear varying-coefficient errors-in-variables (EV) models under additional restricted conditions. The restricted estimators of the parametric and nonparametric components are established based on a modified profile least-squares method, and their asymptotic properties are studied under some regularity conditions. Moreover, a modified profile Lagrange multiplier test statistic is constructed under the additional restricted conditions. It is shown that the modified profile Lagrange multiplier test statistic is asymptotically distribution-free and follows a chi-squared distribution under the null hypothesis. Some simulation studies are carried out to assess the performance of the proposed methods. A real dataset is analyzed for illustration.

19.
This paper presents a new regularized kernel-based approach for the estimation of the second order moments of stationary stochastic processes. The proposed estimator is defined by a Tikhonov-type variational problem. It contains a few unknown parameters, which can be estimated by cross-validation by solving a sequence of problems whose computational complexity scales linearly with the number of noisy moments (derived from the samples of the process). The correlation functions are assumed to be summable, and the hypothesis space is a reproducing kernel Hilbert space induced by the recently introduced stable spline kernel. In this way, information on the decay to zero of the functions to be reconstructed is incorporated in the estimation process. An application to the identification of transfer functions in the case of white noise as input is also presented. Numerical simulations show that the proposed method compares favorably with standard nonparametric estimation algorithms that exploit an oracle-type tuning of the parameters.
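A hedged sketch of the kind of Tikhonov-type problem involved, shown with the first-order stable spline kernel commonly used in this literature (the paper's exact functional and kernel may differ in detail):

\[
\hat r = \arg\min_{r \in \mathcal{H}} \sum_{k} \big(\hat m_k - r(k)\big)^{2} + \gamma\,\|r\|_{\mathcal{H}}^{2},
\qquad K(s, t) = \lambda^{\max(s, t)}, \quad 0 < \lambda < 1,
\]

where the \hat m_k are the noisy empirical moments, \mathcal{H} is the reproducing kernel Hilbert space induced by the stable spline kernel K, and the hyperparameters \gamma and \lambda are tuned by cross-validation; the decaying kernel encodes the assumption that the correlation function is summable and vanishes at large lags.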

20.
International Journal of Computer Mathematics, 2012, 89(8): 1565-1572
Recently, the estimation of a population quantile has received considerable attention. Existing quantile estimators generally assume that the values of an auxiliary variable are known for the entire population, and most of them are defined under simple random sampling without replacement. Assuming two-phase sampling for stratification with arbitrary sampling designs in each of the two phases, a new quantile estimator and its variance estimator are defined. The proposed estimators can be used when the population auxiliary information is not available, which is a common situation in practice. Desirable properties such as unbiasedness are derived. The suggested estimators are compared numerically with an alternative stratification estimator and its variance estimator, and desirable results are observed. Confidence intervals based upon the proposed estimators are also defined, and they are compared via simulation studies with the confidence intervals based upon the stratification estimator. The proposed confidence intervals give desirable coverage probabilities with the smallest interval lengths.
