Similar Documents
 20 similar documents found (search time: 281 ms)
1.
Thirteen methods for computing binomial confidence intervals are compared based on their coverage properties, widths and errors relative to exact limits. The use of the standard textbook method, x/n ± 1.96√[(x/n)(1−x/n)/n], or its continuity corrected version, is strongly discouraged. A commonly cited rule of thumb stating that alternatives to exact methods may be used when the estimated proportion p is such that np and n(1−p) both exceed 5 does not ensure adequate accuracy. Score limits are easily calculated from closed form solutions to quadratic equations and can be used at all times. Based on coverage functions, the continuity corrected score method is recommended over exact methods. Its conservative nature should be kept in mind, as should the wider fluctuation of actual coverage that accompanies omission of the continuity correction.
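The score limits referred to above follow from the closed-form roots of a quadratic in p; a minimal Python sketch (function names are illustrative, not from the paper), with the discouraged Wald interval alongside for comparison:

```python
import math

def wilson_interval(x, n, z=1.96):
    """Score (Wilson) limits for a binomial proportion: the roots of
    (p - x/n)^2 = z^2 * p * (1 - p) / n, written in closed form."""
    phat = x / n
    denom = 1 + z**2 / n
    center = (phat + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(phat * (1 - phat) / n + z**2 / (4 * n**2))
    return center - half, center + half

def wald_interval(x, n, z=1.96):
    """The standard textbook (Wald) interval that the abstract discourages."""
    phat = x / n
    half = z * math.sqrt(phat * (1 - phat) / n)
    return phat - half, phat + half
```

Note that at x = 0 the Wald interval degenerates to a single point, while the score interval still has positive width, one of the aberrations that motivates the recommendation above.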

2.
Several existing unconditional methods for setting confidence intervals for the difference between binomial proportions are evaluated. Computationally simpler methods are prone to a variety of aberrations and poor coverage properties. The closely interrelated methods of Mee, and of Miettinen and Nurminen, perform well but require a computer program. Two new approaches which also avoid aberrations are developed and evaluated. A tail area profile likelihood based method produces the best coverage properties, but is difficult to calculate for large denominators. A method combining Wilson score intervals for the two proportions to be compared also performs well, and is readily implemented irrespective of sample size.
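The interval built by combining Wilson score limits for the two proportions can be sketched in a few lines; this is an illustrative "square-and-add" reading of that construction, not the authors' exact code:

```python
import math

def wilson_interval(x, n, z=1.96):
    """Wilson score limits for a single binomial proportion."""
    phat = x / n
    denom = 1 + z**2 / n
    center = (phat + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(phat * (1 - phat) / n + z**2 / (4 * n**2))
    return center - half, center + half

def hybrid_score_diff(x1, n1, x2, n2, z=1.96):
    """Interval for p1 - p2 that combines the per-proportion Wilson limits
    by adding their squared distances from the point estimates."""
    p1, p2 = x1 / n1, x2 / n2
    l1, u1 = wilson_interval(x1, n1, z)
    l2, u2 = wilson_interval(x2, n2, z)
    d = p1 - p2
    lower = d - math.sqrt((p1 - l1) ** 2 + (u2 - p2) ** 2)
    upper = d + math.sqrt((u1 - p1) ** 2 + (p2 - l2) ** 2)
    return lower, upper
```

Because each Wilson limit stays inside [0, 1], the combined interval avoids the zero-width and overshoot aberrations of the simple Wald difference interval, even when x1 = x2 = 0.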

3.
A confidence interval (CI) for a population predictor weight for use with N. Cliff's (1994) method of ordinal multiple regression (OMR) is presented. The OMR CI is based on an estimated standard error of a weight derived from a fixed-effects model. A simulation was performed to examine the sampling properties of the OMR CI. The results show that the OMR CI had a good Type I error rate and coverage. The OMR CI had lower power than the least-squares multiple regression (LSMR) CI when predictors were not correlated but had higher power when predictor correlations were moderate to high. In addition to discussing the simulation results, it is pointed out that the OMR CI can have superior sampling properties when the fixed-effects assumptions are violated. The OMR CI is recommended when a researcher wants to consider only ordinal information in multivariate prediction, when predictor correlations are moderate to high, and when the assumptions of fixed-effects LSMR are violated. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

4.
Suppose the number of 2 x 2 tables is large relative to the average table size, and the observations within a given table are dependent, as occurs in longitudinal or family-based case-control studies. We consider fitting regression models to the odds ratios using table-level covariates. The focus is on methods to obtain valid inferences for the regression parameters beta when the dependence structure is unknown. In this setting, Liang (1985, Biometrika 72, 678-682) has shown that inference based on the noncentral hypergeometric likelihood is sensitive to misspecification of the dependence structure. In contrast, estimating functions based on the Mantel-Haenszel method yield consistent estimators of beta. We show here that, under the estimating function approach, Wald's confidence interval for beta performs well in multiplicative regression models but unfortunately has poor coverage probabilities when an additive regression model is adopted. As an alternative to Wald inference, we present a Mantel-Haenszel quasi-likelihood function based on integrating the Mantel-Haenszel estimating function. A simulation study demonstrates that, in medium-sized samples, the Mantel-Haenszel quasi-likelihood approach yields better inferences than other methods under an additive regression model and inferences comparable to Wald's method under a multiplicative model. We illustrate the use of this quasi-likelihood method in a study of the familial risk of schizophrenia.

5.
In a screening programme for cervical cancer, coverage of the target population is a major determinant of effectiveness and cost-effectiveness and is one of the parameters for programme monitoring recommended by the "European Guidelines for Quality Assurance". An organised screening programme was started in Turin, Italy, in 1992. Spontaneous screening was already largely present, but coverage (proportion of women who had at least a test within 3 years) was low (< 50%) and distribution of smears uneven. No comprehensive registration of spontaneous smears was available. All women were invited for the first round, independently of their previous test history. Coverage was estimated by integrating routine data from the organised programme with data on spontaneous screening obtained by interviews of a random sample of 268 non-compliers to invitation and 167 compliers. Overall (spontaneous + organised) coverage was estimated to be 74% (95% CI, 71-78%). The proportion of the target population covered as an effect of invitation was estimated to be 17% (95% CI, 15-20%). Invitations were successful in increasing coverage in previously poorly screened groups. Although 20-25% of compliers were estimated to have had further tests before the end of the round, we estimated that switching to a 3-year interval saved approximately 0.26 tests per complier. This suggests that invitations to an organised programme, even to previously covered women, can be a cost-effective policy. Our method of estimating overall coverage can be useful in many other European areas where a comprehensive registration of smears is not available.

6.
In a meta-analysis of a set of clinical trials, a crucial but problematic component is providing an estimate and confidence interval for the overall treatment effect θ. Since in the presence of heterogeneity a fixed effect approach yields an artificially narrow confidence interval for θ, the random effects method of DerSimonian and Laird, which incorporates a moment estimator of the between-trial variance component σB², has been advocated. With the additional distributional assumptions of normality, a confidence interval for θ may be obtained. However, this method does not provide a confidence interval for σB², nor a confidence interval for θ which takes account of the fact that σB² has to be estimated from the data. We show how a likelihood based method can be used to overcome these problems, and use profile likelihoods to construct likelihood based confidence intervals. This approach yields an appropriately widened confidence interval compared with the standard random effects method. Examples of application to a published meta-analysis and a multicentre clinical trial are discussed. It is concluded that likelihood based methods are preferred to the standard method in undertaking random effects meta-analysis when the value of σB² has an important effect on the overall estimated treatment effect.
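The DerSimonian-Laird moment estimator that the abstract contrasts against is simple to compute; a minimal sketch of the standard formulas (this is the method being critiqued, not the paper's profile-likelihood approach):

```python
import math

def dersimonian_laird(y, v, z=1.96):
    """Random-effects pooled estimate with the DL moment estimator.

    y: per-study effect estimates; v: their within-study variances.
    Returns (pooled estimate, tau^2, (ci_lower, ci_upper))."""
    w = [1 / vi for vi in v]                       # fixed-effect weights
    sw = sum(w)
    yfix = sum(wi * yi for wi, yi in zip(w, y)) / sw
    q = sum(wi * (yi - yfix) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (len(y) - 1)) / c)        # moment estimate, truncated at 0
    wstar = [1 / (vi + tau2) for vi in v]          # random-effects weights
    theta = sum(wi * yi for wi, yi in zip(wstar, y)) / sum(wstar)
    se = math.sqrt(1 / sum(wstar))
    return theta, tau2, (theta - z * se, theta + z * se)
```

The confidence interval treats tau² as known once estimated, which is exactly the deficiency the abstract's likelihood-based approach addresses: the DL interval does not widen to reflect the uncertainty in tau² itself.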

7.
This article presents a generalization of the Score method of constructing confidence intervals for the population proportion (E. B. Wilson, 1927) to the case of the population mean of a rating scale item. A simulation study was conducted to assess the properties of the Score confidence interval in relation to the traditional Wald (A. Wald, 1943) confidence interval under a variety of conditions, including sample size, number of response options, extremeness of the population mean, and kurtosis of the response distribution. The results of the simulation study indicated that the Score interval usually outperformed the Wald interval, suggesting that the Score interval is a viable method of constructing confidence intervals for the population mean of a rating scale item.

8.
Various methods exist for assessing the safety of a structural or mechanical system that has uncertain parameters. These methods are either statistical (probabilistic), in which case the probability of failure is sought, or deterministic (possibilistic), in which case bounds on the response are sought. Well-known statistical methods include the first-order reliability method (FORM) and the second-order reliability method (SORM), while deterministic methods include interval analysis, convex modeling, and fuzzy set theory (although the categorization of the latter approach as a deterministic method is debatable). The development of probabilistic and possibilistic methods has tended to occur independently, with specialized algorithms being developed for the implementation of each technique. It is shown here that a wide range of probabilistic and possibilistic methods can be encompassed by a single mathematical algorithm, so that, for example, existing codes for FORM and SORM can potentially be employed for other methods, thus allowing the designer to readily choose the method most suited to the available data. A second common algorithm is also derived, and the analysis is illustrated by application to a simple system composed of N structural components in series.

9.
Multistage models are frequently applied in carcinogenic risk assessment. In their simplest form, these models relate the probability of tumor presence to some measure of dose. These models are then used to project the excess risk of tumor occurrence at doses frequently well below the lowest experimental dose. Upper confidence limits on the excess risk associated with exposures at these doses are then determined. A likelihood-based method is commonly used to determine these limits. We compare this method to two computationally intensive "bootstrap" methods for determining the 95% upper confidence limit on extra risk. The coverage probabilities and bias of likelihood-based and bootstrap estimates are examined in a simulation study of carcinogenicity experiments. The coverage probabilities of the nonparametric bootstrap method fell below 95% more frequently and by wider margins than the better-performing parametric bootstrap and likelihood-based methods. The relative bias of all estimators is seen to be affected by the amount of curvature in the true underlying dose-response function. In general, the likelihood-based method has the best coverage probability properties while the parametric bootstrap is less biased and less variable than the likelihood-based method. Ultimately, neither method is entirely satisfactory for highly curved dose-response patterns.

10.
An experiment to assess the efficacy of a particular treatment or process often produces dichotomous responses, either favourable or unfavourable. When we administer the treatment on two occasions to the same subjects, we often use McNemar's test to investigate the hypothesis of no difference in the proportions on the two occasions, that is, the hypothesis of marginal homogeneity. A disadvantage in using McNemar's statistic is that we estimate the variance of the sample difference under the restriction that the marginal proportions are equal. A competitor to McNemar's statistic is a Wald statistic that uses an unrestricted estimator of the variance. Because the Wald statistic tends to reject too often in small samples, we investigate an adjusted form that is useful for constructing confidence intervals. Quesenberry and Hurst, and Goodman, discussed methods of construction that we adapt for constructing confidence intervals for the differences in correlated proportions. We empirically compare the coverage probabilities and average interval lengths for the competing methods through simulation and give recommendations based on the simulation results.
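As a concrete illustration of the quantities involved, here is a minimal sketch of McNemar's statistic and an unrestricted-variance Wald interval for the difference in correlated proportions; the optional cell adjustment `adj` is our illustrative stand-in for the adjusted forms the abstract investigates, not the authors' exact formula:

```python
import math

def mcnemar_stat(b, c):
    """McNemar's chi-square from the discordant counts b and c."""
    return (b - c) ** 2 / (b + c)

def paired_diff_wald(table, z=1.96, adj=0.0):
    """Wald interval for p1 - p2 from a paired 2x2 table [[a, b], [c, d]].

    Uses the unrestricted variance estimate (no marginal-homogeneity
    constraint). adj optionally adds a constant to every cell as a
    simple small-sample adjustment (illustrative choice)."""
    (a, b), (c, d) = table
    a, b, c, d = a + adj, b + adj, c + adj, d + adj
    n = a + b + c + d
    diff = (b - c) / n                 # difference of marginal proportions
    var = (b + c - (b - c) ** 2 / n) / n ** 2
    half = z * math.sqrt(var)
    return diff - half, diff + half
```

Only the discordant cells b and c drive both the test statistic and the point estimate; the concordant cells enter the Wald interval only through the sample size n.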

11.
A short-time Fourier transform (STFT) order-analysis algorithm based on frequency shifting is proposed. Exploiting the convolution property of the Fourier transform in the frequency domain, the original signal is multiplied in the time domain by e^(−j2πf_i t) so that the spectral energy at f_i is relocated to zero frequency; by stepping f_i at fixed frequency intervals, the spectral energy at other frequencies can likewise be obtained at zero frequency, thereby improving the frequency resolution of the STFT in time-frequency analysis. Local threshold denoising is then performed on the time-frequency plane while the variation in rotational speed is tracked, and the method is finally applied to order analysis of variable-speed machinery. Comparison with standard STFT results shows that the proposed method tracks the actual rotational speed more accurately. Applying the method to bearing signals from an actual run-down, the bearing fault characteristic frequencies were successfully extracted.
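The core frequency-shifting step (multiplying by e^(−j2πf_i t) to move the energy at f_i to 0 Hz) can be demonstrated in a few lines of Python; this is a toy illustration of that one step, not the paper's full order-analysis algorithm:

```python
import cmath
import math

def shift_to_zero(x, fs, fi):
    """Multiply x[n] by e^(-j*2*pi*fi*n/fs), moving the component at fi to 0 Hz."""
    return [xi * cmath.exp(-2j * math.pi * fi * n / fs) for n, xi in enumerate(x)]

fs = 1000                                    # sampling rate (Hz)
f0 = 50                                      # tone to relocate (Hz)
x = [math.cos(2 * math.pi * f0 * n / fs) for n in range(1000)]

y = shift_to_zero(x, fs, f0)
dc = sum(y) / len(y)                         # complex mean = value at 0 Hz
# half the cosine's energy now sits at 0 Hz, so the mean is ~0.5;
# the other half lands at -2*f0 and averages out over whole periods
```

Examining the spectrum near DC after the shift gives the fine frequency resolution the method relies on, since a narrowband analysis around 0 Hz is cheaper than a full high-resolution STFT.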

12.
Coal production inevitably affects the ecological environment, and ecological monitoring is an essential part of economically sustainable development in coal mining. The most common method for ecological monitoring of mining areas is currently vegetation-coverage calculation based on the normalized difference vegetation index (NDVI), but in vegetation-monitoring experiments at open-pit mines, NDVI-based coverage estimates showed errors. To provide a suitable method for ecological monitoring of grassland mining areas, this study computed NDVI for the study areas by band inversion of Sentinel-2 remote-sensing data and compared the NDVI distribution characteristics of the Shengli and Pingshuo mining areas empirically. The results show that NDVI reflects surface vegetation coverage well in areas with some vegetation cover, but errors can arise in coal-covered areas within the mines. The error appeared in both study areas and was more severe in the Shengli mining area. The likely cause is that, owing to a limitation of the NDVI normalization algorithm, NDVI alone cannot distinguish coal-covered areas from low-to-medium-coverage grassland with similar spectral curves; it is therefore recommended that such areas be masked out, or that a different vegetation index be used, to avoid this effect in mine vegetation monitoring.

13.
We developed a method for the estimation of three-dimensional acetabular coverage of the femoral head with use of only an anteroposterior radiograph of the hip. This technique also allows recalculation of the corrected value for coverage at neutral pelvic tilt. Provided that the acetabulum and femoral head are spherical and congruent, the results are as accurate as those obtained with computerized tomographic reconstruction, the dose of radiation is much lower, and much less time is required for calculation. The hips of 286 normal subjects showed increases with age in both anterior and posterior coverage and backward tilt of the pelvis, along with a decrease in the anterior-posterior ratio of coverage. The proportion of anterior acetabular coverage in female subjects was smaller than that in male subjects. There was more backward tilt of the pelvis and the anterior-posterior ratio of coverage was smaller when the subjects were standing than when they were supine.

14.
In survival analysis, estimates of median survival times in homogeneous samples are often based on the Kaplan-Meier estimator of the survivor function. Confidence intervals for quantiles, such as median survival, are typically constructed via large sample theory or the bootstrap. The former has suspect accuracy for small sample sizes under moderate censoring and the latter is computationally intensive. In this paper, improvements on so-called test-based intervals and reflected intervals (cf., Slud, Byar, and Green, 1984, Biometrics 40, 587-600) are sought. Using the Edgeworth expansion for the distribution of the studentized Nelson-Aalen estimator derived in Strawderman and Wells (1997, Journal of the American Statistical Association 92), we propose a method for producing more accurate confidence intervals for quantiles with randomly censored data. The intervals are very simple to compute, and numerical results using simulated data show that our new test-based interval outperforms commonly used methods for computing confidence intervals for small sample sizes and/or heavy censoring, especially with regard to maintaining specified coverage.
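For context, the Kaplan-Meier estimator underlying these quantile intervals is straightforward to compute; a minimal sketch of the estimator and the median it yields (the improved interval constructions themselves are more involved and are not reproduced here):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survivor curve as a list of (t, S(t)) at each event time.

    times: observed follow-up times; events: 1 = event, 0 = censored."""
    data = sorted(zip(times, events))
    s = 1.0
    curve = []
    for t in sorted({tt for tt, e in data if e == 1}):
        at_risk = sum(1 for tt, _ in data if tt >= t)
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        s *= 1 - deaths / at_risk          # product-limit update
        curve.append((t, s))
    return curve

def km_median(curve):
    """Smallest event time with S(t) <= 0.5 (the KM median survival)."""
    for t, s in curve:
        if s <= 0.5:
            return t
    return None  # median not reached within follow-up
```

Censored observations contribute to the risk sets before their censoring times but never trigger a drop in S(t), which is why the curve steps down only at event times.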

15.
The relationship between the concentration of the analyte and the imprecision of an analytical method can be displayed by the precision profile, in which the coefficient of variation (relative standard deviation) is plotted against the concentration of the analyte. The function of the curve of the profile and its confidence limits can easily be assessed by a computer program developed by W.A. Sadler & M.H. Smith (Clin. Chem. 36 (1990), 1346-1350). For the assessment of limits of detection and of quantification the following procedure is proposed: The lower (and upper) limit of the measuring interval is defined by the point at which an acceptable CV-line intersects the confidence limit. If, in the variance function, one sets the concentration to zero, the normal distribution of the random errors of the blank will result. The mean of the next adjacent normal distribution, following the variance formula and overlapping the "zero-distribution" by a defined amount, represents the limit of detection. Within the described measuring interval, or within a fraction of it, one might construct overlapping normal distributions in an analogous manner. Their number represents the "power of definition" (PD) (instead of the "analytical sensitivity"), which also depends on the concentration of the determinand according to the variance function. We tested these hypotheses by a comparison of two methods for the determination of cyclosporin A (ciclosporin, INN). Our results demonstrate that the data of the lower limits of the measuring interval and of the limit of detection agree well with data from the literature obtained in extensive interlaboratory surveys.
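A precision profile of the kind described is easy to compute from replicate measurements at each concentration; a minimal sketch (illustrative only, not the Sadler & Smith program, which also fits a variance function and confidence limits to the curve):

```python
def precision_profile(replicates_by_conc):
    """CV (%) at each analyte concentration, from replicate measurements.

    replicates_by_conc: dict mapping concentration -> list of replicate values.
    Returns [(concentration, cv_percent), ...] sorted by concentration."""
    profile = []
    for conc in sorted(replicates_by_conc):
        reps = replicates_by_conc[conc]
        mean = sum(reps) / len(reps)
        sd = (sum((r - mean) ** 2 for r in reps) / (len(reps) - 1)) ** 0.5
        profile.append((conc, 100 * sd / mean))
    return profile
```

Plotting these points and the fitted variance function against concentration gives the profile from which the measuring-interval limits described above are read off.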

16.
The high-temperature mechanical properties of an automotive C-Mn-Al TRIP steel were studied with a Gleeble 3500 testing machine. The zero-ductility and zero-strength temperatures were measured, the phase-transformation interval was determined by differential scanning calorimetry, and the macroscopic fracture morphology and the microstructure near the fracture at different tensile temperatures were analyzed by scanning electron microscopy and optical microscopy. The zero-ductility and zero-strength temperatures of this steel are 1425 °C and 1430 °C, respectively; the first brittle region extends from 1400 °C to the melting point, and the third brittle region spans 800-925 °C. Embrittlement in the third brittle region is caused by α-ferrite precipitating from the γ grain boundaries. As the specimens were cooled from 975 °C to 700 °C, the reduction of area first decreased and then increased with the increasing fraction of precipitated α-ferrite: at an α-ferrite fraction of 8.1% in the matrix (850 °C), the reduction of area fell to 28.9%, whereas at tensile temperatures below 800 °C, with the α-ferrite fraction exceeding 16.7%, it recovered to above 38.5%. A small amount of coarse AlN particles began to precipitate in this steel at 1275.6 °C but had no effect on its hot ductility.

17.
Surface segregation and surface tension of liquid mixtures
A model has been developed in which surfaces are treated as separate phases with a thickness corresponding to a monolayer. It is argued that the surface tension of liquids is a measure of the excess surface chemical potential of the surface atoms relative to the bulk atoms. Equations for the calculation of the surface composition and surface tension of liquid mixtures are developed. Using only the surface tension and molar volume data of the pure components, excellent correspondence between calculated and experimental surface tension values was obtained. The method was also tested on liquid systems showing immiscibility. The surface coverage calculated from the present model is compared with that calculated using the Gibbs adsorption equation. The surface coverage of the solute species increases with increasing solute concentration. However, depending on the surface properties of the system, the excess surface coverage may pass through a maximum value and then decrease with increasing solute concentration.

18.
Recent development in three-dimensional (3-D) imaging of cancellous bone has made possible true 3-D quantification of trabecular architecture. This provides a significant improvement of the tools available for studying and understanding the mechanical functions of cancellous bone. This article reviews the different techniques for 3-D imaging, which include serial sectioning, X-ray tomographic methods, and NMR scanning. Basic architectural features of cancellous bone are discussed, and it is argued that connectivity and architectural anisotropy (fabric) are of special interest in mechanics-architecture relations. A full characterization of elastic mechanical properties is, with traditional mechanical testing, virtually impossible, but 3-D reconstruction in combination with newly developed methods for large-scale finite element analysis allow calculations of all elastic properties at the cancellous bone continuum level. Connectivity has traditionally been approached by various 2-D methods, but none of these methods have any known relation to 3-D connectivity. A topological approach allows unbiased quantification of connectivity, and this further allows expressions of the mean size of individual trabeculae, which has previously also been approached by a number of uncertain 2-D methods. Anisotropy may be quantified by fundamentally different methods. The well-known mean intercept length method is an interface-based method, whereas the volume orientation method is representative of volume-based methods. Recent studies indicate that volume-based methods are at least as good as interface-based methods in predicting mechanical anisotropy. Any other architectural property may be quantified from 3-D reconstructions of cancellous bone specimens as long as an explicit definition of the property can be given. This challenges intuitive and vaguely defined architectural properties and forces bone scientists toward 3-D thinking.

19.
To study the effect of the lump-ore ratio on the softening-melting behaviour of blast-furnace ferrous burden under different blast conditions, the temperature and gas composition under three blast conditions were simulated, the effect of the lump-ore ratio on the softening-melting behaviour of the ferrous burden was studied in a high-temperature melting-dripping furnace, and an optimization analysis of the composite burden structure was then carried out. The results show that under oxygen-enriched, humidified blast, the increased hydrogen content effectively reduces the maximum pressure drop in the furnace, narrows the melting interval, and improves in-furnace permeability. Under oxygen-enriched, humidified blast, increasing the lump-ore ratio does increase the maximum pressure drop and widen the softening-melting interval, but the absolute value of the maximum pressure drop remains far below that of the baseline condition, and the cohesive-zone width is also smaller than under the baseline. It can therefore be concluded that, with an appropriately increased lump-ore ratio under oxygen-enriched, humidified blast, the softening-melting behaviour of the composite burden remains better than under the baseline condition while blast-furnace production costs are reduced, making this an effective optimization measure for the burden structure.

20.
The Internet of Things (IoT) is an important supporting platform for future cyber-enabled services, and cellular networks are regarded as the main channel for data access by IoT terminals widely distributed over a deployment area, with a value that is hard to replace especially for wide-area coverage. Reducing the downlink transmit power of cellular base stations while satisfying coverage requirements is therefore of significant interest for green communications. A gradient-descent algorithm based on smooth approximation of the optimization objective and the root-mean-square propagation (RMSProp) strategy is proposed to minimize the total downlink transmit power of base stations subject to an IoT service coverage-rate constraint. First, a penalty-function method transforms the heterogeneous cellular network optimization problem with complex constraints into one with simple constraints; second, the non-differentiable objective function is converted into a differentiable form by smooth approximation, and analytic gradients with respect to the antenna downtilt and downlink power parameters are given; finally, the transformed objective is optimized by RMSProp gradient descent. Simulation results show that the algorithm minimizes the total downlink transmit power of the base stations while meeting the coverage-rate target and, compared with existing metaheuristics and plain gradient descent, converges quickly and better suppresses oscillation during optimization.
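The RMSProp update at the heart of the proposed algorithm scales each gradient component by a running root-mean-square of its history, which is what damps oscillation; a generic sketch on a toy quadratic (the network objective, penalty terms, and antenna parameters from the paper are not reproduced):

```python
import math

def rmsprop(grad, x0, lr=0.01, beta=0.9, eps=1e-8, steps=500):
    """RMSProp gradient descent: divide each gradient component by the
    square root of an exponential moving average of its squared history."""
    x = list(x0)
    v = [0.0] * len(x)
    for _ in range(steps):
        g = grad(x)
        for i in range(len(x)):
            v[i] = beta * v[i] + (1 - beta) * g[i] ** 2
            x[i] -= lr * g[i] / (math.sqrt(v[i]) + eps)
    return x

# toy objective f(x, y) = (x - 3)^2 + 10 * (y + 1)^2, minimized at (3, -1);
# the factor 10 makes the problem ill-conditioned, where plain gradient
# descent with one global step size tends to oscillate
def grad_f(p):
    return [2 * (p[0] - 3), 20 * (p[1] + 1)]

xmin = rmsprop(grad_f, [0.0, 0.0])
```

Because each coordinate is normalized by its own gradient scale, the steep and shallow directions of the objective advance at comparable rates, which mirrors the oscillation-suppression behaviour reported in the simulation results.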


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号