Similar Literature
20 similar documents found.
1.
This paper presents a bootstrap approach for integrating parametric and probabilistic cost estimation techniques. In the proposed method, a combination of regression analysis and bootstrap resampling is used to develop range estimates for construction costs. The method is applied to parametric range estimation of building projects as an example. The bootstrap approach combines the advantages of probabilistic and parametric estimation methods while requiring fewer assumptions than classical statistical techniques. This study is relevant to practitioners and researchers, as it provides a robust method for conceptual estimation of construction costs.
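A minimal Python sketch of the general idea — case resampling around a regression cost model to obtain a percentile range estimate — using hypothetical data and variable names, not the authors' exact procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical historical data: project size (m^2) vs. construction cost.
X = np.column_stack([np.ones(30), rng.uniform(500, 5000, 30)])
cost = X @ np.array([2.0e5, 900.0]) + rng.normal(0, 1.5e5, 30)

x_new = np.array([1.0, 2500.0])            # project to be estimated
preds = []
for _ in range(2000):                      # bootstrap: resample cases with replacement
    idx = rng.integers(0, len(cost), len(cost))
    beta, *_ = np.linalg.lstsq(X[idx], cost[idx], rcond=None)
    preds.append(x_new @ beta)

# 90% interval for the predicted (mean) cost of the new project
low, high = np.percentile(preds, [5, 95])
print(f"cost range: {low:,.0f} to {high:,.0f}")
```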

2.
Survival tree methods are nonparametric alternatives to semiparametric Cox regression in survival analysis. In this paper, a tree-based method for censored survival data with time-dependent covariates is proposed. The proposed method assumes a very general model for the hazard function and is fully nonparametric. The recursive partitioning algorithm uses likelihood estimation to grow trees under a piecewise exponential structure that handles time-dependent covariates in the same way as time-independent covariates. In general, the estimated hazard at a node gives the risk for a group of individuals during a specific time period. Both cross-validation and bootstrap resampling techniques are implemented in the tree selection procedure. The performance of the proposed survival tree method is shown to be good through simulation and application to real data.
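To make the piecewise exponential idea concrete, a small Python sketch of the hazard estimate at a single node — events divided by person-time within each interval — on made-up follow-up data; the tree-growing and pruning machinery of the paper is not shown:

```python
import numpy as np

# Piecewise exponential hazard at one node: events divided by person-time
# within each time interval (hypothetical follow-up data).
time = np.array([2.0, 5.0, 7.5, 1.0, 9.0, 3.5])   # follow-up times
event = np.array([1, 0, 1, 1, 0, 1])              # 1 = event, 0 = censored
cuts = np.array([0.0, 4.0, 8.0, 12.0])            # interval boundaries

for lo, hi in zip(cuts[:-1], cuts[1:]):
    exposure = np.clip(time, lo, hi) - lo          # person-time spent in [lo, hi)
    deaths = np.sum(event[(time >= lo) & (time < hi)])
    rate = deaths / exposure.sum() if exposure.sum() > 0 else np.nan
    print(f"[{lo}, {hi}): hazard = {rate:.3f}")
```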

3.
The methods most commonly used for analyzing receiver operating characteristic (ROC) data incorporate "binormal" assumptions about the latent frequency distributions of test results. Although these assumptions have proved robust to a wide variety of actual frequency distributions, some data sets do not "fit" the binormal model. In such cases, resampling techniques such as the jackknife and the bootstrap provide versatile, distribution-independent, and more appropriate methods for hypothesis testing. This article describes the application of resampling techniques to ROC data for which the binormal assumptions are not appropriate, and suggests that the bootstrap may be especially helpful in determining confidence intervals from small data samples. The widespread availability of ever-faster computers has made resampling methods increasingly accessible and convenient tools for data analysis.
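As an illustration of the kind of distribution-free bootstrap interval the abstract refers to, a hypothetical Python sketch that resamples cases to put a confidence interval on the area under the ROC curve (test scores and class labels are simulated):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

# Hypothetical test results for diseased (1) and non-diseased (0) cases.
y = np.r_[np.ones(40), np.zeros(60)]
score = np.r_[rng.normal(1.2, 1.0, 40), rng.normal(0.0, 1.0, 60)]

aucs = []
for _ in range(5000):                       # nonparametric bootstrap over cases
    idx = rng.integers(0, len(y), len(y))
    if len(np.unique(y[idx])) < 2:          # resample must contain both classes
        continue
    aucs.append(roc_auc_score(y[idx], score[idx]))

print("AUC 95% CI:", np.percentile(aucs, [2.5, 97.5]))
```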

4.
Advances in testing the statistical significance of mediation effects.
P. A. Frazier, A. P. Tix, and K. E. Barron (2004) highlighted a normal theory method popularized by R. M. Baron and D. A. Kenny (1986) for testing the statistical significance of indirect effects (i.e., mediator variables) in multiple regression contexts. However, simulation studies suggest that this method lacks statistical power relative to some other approaches. The authors describe an alternative developed by P. E. Shrout and N. Bolger (2002) based on bootstrap resampling methods. An example and step-by-step guide for performing bootstrap mediation analyses are provided. The test of joint significance is also briefly described as an alternative to both the normal theory and bootstrap methods. The relative advantages and disadvantages of each approach in terms of precision in estimating confidence intervals of indirect effects, Type I error, and Type II error are discussed.
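A minimal Python sketch of a percentile-bootstrap test of an indirect effect a*b in the spirit of Shrout and Bolger (2002), using simulated X, M, and Y (variable names and data are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200

# Simulated data with a true indirect effect X -> M -> Y.
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)
y = 0.4 * m + 0.1 * x + rng.normal(size=n)

def indirect(x, m, y):
    a = np.polyfit(x, m, 1)[0]                        # path a: X -> M
    design = np.column_stack([np.ones_like(x), m, x])
    b = np.linalg.lstsq(design, y, rcond=None)[0][1]  # path b: M -> Y controlling for X
    return a * b

boot = []
for _ in range(2000):                                 # percentile bootstrap of a*b
    idx = rng.integers(0, n, n)
    boot.append(indirect(x[idx], m[idx], y[idx]))

# The indirect effect is deemed significant if this interval excludes zero.
print("95% CI for the indirect effect:", np.percentile(boot, [2.5, 97.5]))
```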

5.
To improve classification accuracy on imbalanced data sets, a resampling algorithm based on the spatial nearest-neighbor relationships of the samples is proposed. The method first assesses the safety level of each minority-class sample from its nearest-neighbor relationships and then, guided by these safety levels, oversamples the minority class with the synthetic minority oversampling technique (SMOTE). Next, a local density is computed for each majority-class sample from its nearest-neighbor relationships, and dense regions of the majority class are undersampled. These two steps balance the data set and control its size to prevent overfitting, equalizing the two classes. Training and test sets are generated by ten-fold cross-validation; after the training set is resampled, a kernel extreme learning machine is trained as the classifier and validated on the test set. Experiments on UCI imbalanced data sets and measured circuit fault-diagnosis data show that the proposed method outperforms other resampling algorithms overall.
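A rough Python sketch of the two ingredients — SMOTE-style interpolation for the minority class and density-guided undersampling of the majority class — on synthetic 2-D data; the paper's safety-level guidance and kernel extreme learning machine classifier are not reproduced:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(3)

# Hypothetical imbalanced 2-D data: 200 majority vs. 20 minority samples.
X_maj = rng.normal(0, 1, (200, 2))
X_min = rng.normal(2, 0.5, (20, 2))

# SMOTE-style oversampling: interpolate between a minority point and one
# of its k nearest minority neighbours.
nn_min = NearestNeighbors(n_neighbors=5).fit(X_min)
_, nbrs = nn_min.kneighbors(X_min)
synth = []
for _ in range(180):                               # roughly balance the classes
    i = rng.integers(len(X_min))
    j = nbrs[i][rng.integers(1, 5)]                # skip self (index 0)
    lam = rng.random()
    synth.append(X_min[i] + lam * (X_min[j] - X_min[i]))

# Density-guided undersampling: drop majority points from the densest regions,
# where density is proxied by the mean distance to the nearest neighbours.
nn_maj = NearestNeighbors(n_neighbors=6).fit(X_maj)
dist, _ = nn_maj.kneighbors(X_maj)
keep = np.argsort(dist[:, 1:].mean(axis=1))[-120:]  # keep the 120 sparsest points

X_balanced = np.vstack([X_maj[keep], X_min, np.array(synth)])
print(X_balanced.shape)
```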

6.
The identification of the input to, or kernels of, a system using nonparametric representations and least-squares estimation is becoming increasingly popular. Nonparametric representations avoid making a priori assumptions about the input or requiring detailed knowledge of the system; they only need to guarantee known general characteristics (for example, positivity), which are obtained by imposing constraints on the estimates. An often overlooked problem is how to characterize the variability of the estimates so obtained. The difficulty is caused by the presence of constraints and/or by the nonlinearities of the estimates or the complexity of the (regression-based) estimation algorithms used, which make standard methods of estimating variability incorrect. In this article we investigate the use of a resampling technique called the "bootstrap" to obtain the desired estimates of variability. We present a real data analysis demonstrating the approach, and through simulations we test the performance of a novel bootstrap technique for obtaining confidence bands for the estimated functions.
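A small Python sketch of one way to combine a positivity constraint with bootstrap confidence bands — non-negative least squares for a hypothetical deconvolution problem plus a residual bootstrap — offered as an illustration of the general idea rather than the authors' algorithm:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(4)

# Hypothetical deconvolution: y = A @ h + noise, with the kernel h constrained to be >= 0.
t = np.arange(60)
h_true = np.exp(-t / 10.0)
h_true[:3] = 0.0
u = rng.random(60)                                    # known input
A = np.array([[u[i - k] if i >= k else 0.0 for k in range(60)] for i in range(60)])
y = A @ h_true + rng.normal(0, 0.05, 60)

h_hat, _ = nnls(A, y)                                 # constrained point estimate
resid = y - A @ h_hat

boot = np.empty((500, 60))
for b in range(500):                                  # residual bootstrap
    y_b = A @ h_hat + rng.choice(resid, 60, replace=True)
    boot[b], _ = nnls(A, y_b)

# Pointwise 95% confidence band for the estimated kernel.
lower, upper = np.percentile(boot, [2.5, 97.5], axis=0)
```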

7.
The process of model building involved in the analysis of many medical studies may lead to a considerable amount of over-optimism with respect to the predictive ability of the 'final' regression model. In this paper we illustrate this phenomenon in a simple cutpoint model and explore the extent to which the bias can be reduced by using cross-validation and bootstrap resampling. These computer-intensive methods are compared with an ad hoc approach and with a heuristic method. Besides illustrating all proposals with data from a breast cancer study, we perform a simulation study to assess the quality of the methods.
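As a hypothetical illustration of how bootstrap resampling can quantify this over-optimism, a short Python sketch of an Efron-style optimism correction applied to a logistic model's apparent AUC; the cutpoint model studied in the paper is not reproduced:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n = 150
X = rng.normal(size=(n, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n) > 0).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])   # optimistic in-sample AUC

optimism = []
for _ in range(200):                            # optimism bootstrap
    idx = rng.integers(0, n, n)
    m_b = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
    perf_boot = roc_auc_score(y[idx], m_b.predict_proba(X[idx])[:, 1])
    perf_orig = roc_auc_score(y, m_b.predict_proba(X)[:, 1])
    optimism.append(perf_boot - perf_orig)      # how much the refit flatters itself

print("apparent AUC:", round(apparent, 3),
      "optimism-corrected:", round(apparent - np.mean(optimism), 3))
```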

8.
Determination of impurity elements in electroplating-grade copper sulfate by ICP-AES
黄晓芳  万银兰 《铜业工程》2005,(4):57-58,31
The determination of seven impurity elements in electroplating-grade copper sulfate by inductively coupled plasma atomic emission spectrometry (ICP-AES) was studied. The spectral and non-spectral interferences of the copper matrix on the impurity elements were examined, along with the effect of matrix concentration on the signal-to-background ratios and detection limits of the impurity lines. The relative standard deviations were less than 5%, and spiked recoveries ranged from 96% to 106%.

9.
Compilation of Q-data and Simulation of Q-spectra in Rare Earth Elements Analysis by Using ICP-AES. Huo Deng-Wei (霍登伟), Yin Xiang-Lian (尹香莲), Zhao G...

10.
Bootstrapping is introduced as a method for approximating the standard errors of validity generalization (VG) estimates. A Monte Carlo study was conducted to evaluate the accuracy of bootstrap validity-distribution parameter estimates, bootstrap standard error estimates, and nonparametric bootstrap confidence intervals. In the simulation study the authors manipulated the sample sizes per correlation coefficient, the number of coefficients per VG analysis, and the variance of the distribution of true correlation coefficients. The results indicate that the standard error estimates produced by the bootstrapping procedure were very accurate. It is recommended that the bootstrap standard-error estimates and confidence intervals be used in the interpretation of the results of VG analyses.
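A minimal Python sketch of a bootstrap standard error for a sample-size-weighted mean validity coefficient, with entirely hypothetical study data rather than the VG estimators used in the article:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical validity coefficients from k = 25 studies, with their sample sizes.
r = rng.normal(0.25, 0.10, 25)
n = rng.integers(50, 300, 25)

def vg_mean(r, n):
    return np.average(r, weights=n)                # sample-size-weighted mean validity

boot = []
for _ in range(5000):                              # resample studies with replacement
    idx = rng.integers(0, len(r), len(r))
    boot.append(vg_mean(r[idx], n[idx]))

print("bootstrap SE of mean validity:", np.std(boot, ddof=1))
print("95% CI:", np.percentile(boot, [2.5, 97.5]))
```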

11.
Dutch listeners were exposed to the English theta sound (as in bath), which replaced [f] in /f/-final Dutch words or, for another group, [s] in /s/-final words. A subsequent identity-priming task showed that participants had learned to interpret theta as, respectively, /f/ or /s/. Priming effects were equally strong when the exposure sound was an ambiguous [fs]-mixture and when primes contained unambiguous fricatives. When the exposure sound was signal-correlated noise, listeners interpreted it as the spectrally similar /f/, irrespective of lexical bias during exposure. Perceptual learning about speech is thus constrained by spectral similarity between the input and established phonological categories, but within those limits, adjustments are thorough enough that even nonnative sounds can be treated fully as native sounds.

12.
Preliminary data analysis in the identification of multidimensional discrete-continuous processes is considered. A method is proposed for generating a working sample from an initial training sample consisting of normal operating data; the method somewhat resembles the bootstrap. The process begins with a training sample that reflects the properties of the object to be identified, and the proposed method automatically derives the unknown stochastic dependence over the range of definition of the corresponding input-output variables of the object. The identification of the oxygen-converter process in converter shop 2 at OAO EVRAZ ZSMK is considered in a case with insufficient available information and gaps in the observation sample. The model is based on a new working sample containing both the measurements and data generated by the proposed method. By using this working sample as the training sample, the precision of identification is doubled.

13.
Using fusion sample preparation, an X-ray fluorescence (XRF) method was established for determining fluorine, aluminum, potassium, sodium, iron oxide, titanium oxide, magnesium oxide, calcium oxide, and sulfate in the chemical product potassium cryolite. Fusion tests showed that the best specimens were obtained with a mixed lithium tetraborate-lithium metaborate flux [m(lithium tetraborate):m(lithium metaborate) = 67:33], a sample-to-flux dilution ratio of 1.5:10, one drop of saturated LiBr solution as the release agent, and fusion at 1 000 °C for 10 min. Spectral line overlap and inter-element absorption-enhancement effects were corrected by combining the theoretical alpha-coefficient method with the empirical coefficient method. In the absence of national standard reference samples, calibration samples prepared by mixing high-purity chemicals in different proportions were used to construct calibration curves with a wide linear range. Precision tests gave relative standard deviations (RSD, n = 11) of 0.53% to 9.8% for the individual components. The method was applied to determine the nine components in production samples of potassium cryolite, and the results agreed with those obtained by other methods.

14.
Spectral analysis is a general modelling approach that enables calculation of parametric images from reconstructed tracer kinetic data independent of an assumed compartmental structure. We investigated the validity of applying spectral analysis directly to projection data, motivated by the advantages that (i) the number of reconstructions is reduced by an order of magnitude and (ii) iterative reconstruction becomes practical, which may improve the signal-to-noise ratio (SNR). A dynamic software phantom with typical 2-[11C]thymidine kinetics was used to compare projection-based and image-based methods and to assess bias-variance trade-offs using iterative expectation maximization (EM) reconstruction. We found that the two approaches are not exactly equivalent owing to properties of the non-negative least-squares algorithm. However, the differences are small (<5%) and mainly affect parameters related to early and late time points on the impulse response function (K1 and, to a lesser extent, VD). The optimal number of EM iterations was 15-30, with up to a two-fold improvement in SNR over filtered back-projection. We conclude that projection-based spectral analysis with EM reconstruction yields accurate parametric images with high SNR and has potential application to a wide range of positron emission tomography ligands.

15.
Alteration of ligand binding to dopamine D2 receptors through activation of adenosine A2A receptors in rat striatal membranes has been studied by means of kinetic analysis. The binding of the dopaminergic agonist [3H]quinpirole to rat striatal membranes was characterized by the constants Kd = 1.50 ± 0.09 nM and Bmax = 115 ± 2 fmol/mg of protein. The kinetic analyses revealed that the binding had at least two consecutive and kinetically distinguishable steps: a fast equilibrium of complex formation between receptor and agonist (KA = 5.9 ± 1.7 nM), followed by a slow isomerization equilibrium (Ki = 0.06). Activation of adenosine A2A receptors by CGS 21680 enhanced the rate of [3H]quinpirole binding, mainly altering the formation of the receptor-ligand complexes (KA) as well as the isomerization rate of these complexes (ki), while the deisomerization rate (k[-i]) and the apparent dissociation rate remained unchanged.

16.
This work studies the frequency behavior of a least-squares method for estimating the power spectral density (PSD) of unevenly sampled signals. When the uneven sampling can be modeled as uniform sampling plus a stationary random deviation, the spectrum is a periodic repetition of the original continuous-time spectrum at the mean Nyquist frequency, with a low-pass effect on the upper frequency bands that depends on the sampling dispersion. If the dispersion is small compared with the mean sampling period, the estimate in the base band is unbiased with practically no dispersion. When the uneven sampling is modeled as a deterministic sinusoidal variation about uniform sampling, the results agree with those obtained for small random deviations. This approximation is usually well satisfied in signals such as heart rate (HR) series. The theoretically predicted performance has been tested and corroborated with simulated and real HR signals. The Lomb method has been compared with classical PSD estimators that include resampling to obtain uniform sampling. We find that the Lomb method avoids the major problem of the classical methods, namely the low-pass effect of the resampling. Also, only frequencies up to the mean Nyquist frequency should be considered (lower than 0.5 Hz if the HR is lower than 60 bpm). We conclude that for PSD estimation of unevenly sampled signals the Lomb method is more suitable than the fast Fourier transform or autoregressive estimates with linear or cubic interpolation. In extreme situations (low HR or high-frequency components) the Lomb estimate still introduces high-frequency contamination, which suggests further study of better-performing interpolators. For HR signals we also note the importance of selecting a stationary heart rate segment for heart rate variability analysis.
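A short Python sketch of a Lomb periodogram for an unevenly sampled heart-rate-like series, evaluated only up to the mean Nyquist frequency as recommended above; the beat times and rates are simulated, not taken from the study:

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(7)

# Hypothetical series: beats arrive unevenly in time (~1 s mean period).
t = np.cumsum(rng.uniform(0.8, 1.2, 300))          # beat times (s)
hr = 60 + 3 * np.sin(2 * np.pi * 0.1 * t) + rng.normal(0, 0.5, 300)

mean_fs = 1.0 / np.mean(np.diff(t))                # mean sampling rate
freqs = np.linspace(0.01, mean_fs / 2, 500)        # Hz, up to the mean Nyquist frequency
pgram = lombscargle(t, hr - hr.mean(), 2 * np.pi * freqs)  # lombscargle expects angular frequencies

print("spectral peak near %.2f Hz" % freqs[np.argmax(pgram)])
```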

17.
The jackknife and the bootstrap are two nonparametric methods that provide estimates of the bias and the variance of an estimator without any assumption about its statistical distribution. The jackknife is based on recomputing the estimator on subsamples, generally of size n-1, obtained from the original sample. The bootstrap is based on recomputing the estimator on samples of size n drawn from the original sample. The two methods are presented, and their principle is illustrated through application to simple examples and to more complex epidemiological problems.
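A short Python sketch of the leave-one-out jackknife estimates of bias and variance for a simple estimator (the plug-in variance of a hypothetical sample):

```python
import numpy as np

rng = np.random.default_rng(8)
x = rng.exponential(2.0, 40)                 # hypothetical sample
theta = np.var(x)                            # estimator of interest (plug-in variance, biased)

# Jackknife: recompute the estimator on the n leave-one-out subsamples.
loo = np.array([np.var(np.delete(x, i)) for i in range(len(x))])
n = len(x)
bias = (n - 1) * (loo.mean() - theta)
var = (n - 1) / n * np.sum((loo - loo.mean()) ** 2)

print("jackknife bias estimate:", bias)
print("jackknife variance estimate:", var)
```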

18.
As a technique for simultaneous multi-element analysis, atomic emission spectrometry has great potential for on-line analysis. However, the large volume of spectral data, in which interference coexists with useful information, complicates qualitative and quantitative analysis. Wavelet analysis offers fine time-frequency representation and multi-scale, multi-resolution analysis. This paper introduces the principle of wavelet-transform denoising and, by denoising a set of spectral data, shows that wavelet analysis can effectively suppress interference in the spectra.
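To illustrate wavelet-transform denoising of a spectrum, a minimal sketch using the PyWavelets package with soft thresholding of the detail coefficients; the spectrum is simulated and the universal-threshold rule is a common default, not necessarily the one used in the paper:

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(9)

# Hypothetical emission-spectrum segment: two peaks plus noise.
x = np.linspace(0, 1, 1024)
clean = np.exp(-((x - 0.3) / 0.01) ** 2) + 0.6 * np.exp(-((x - 0.7) / 0.02) ** 2)
noisy = clean + rng.normal(0, 0.05, x.size)

coeffs = pywt.wavedec(noisy, 'db4', level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate from the finest scale
thr = sigma * np.sqrt(2 * np.log(noisy.size))           # universal threshold
den_coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
denoised = pywt.waverec(den_coeffs, 'db4')              # reconstructed, denoised spectrum
```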

19.
A new procedure using stable-isotope-labelled serine (L-[2,3,3-d3]serine) and cysteine (L-[3-13C]cysteine) and analysis by gas chromatography/mass spectrometry (GC/MS) has been developed to measure transsulphuration in sheep. The enrichments of the tracers in plasma and skin biopsy samples were measured by GC/electron impact MS analysis of the t-butyldimethylsilyl derivatives. The measured recoveries of standards enriched with [3-13C]cysteine from 0.1% to 8%, or with [2,3,3-d3]serine from 0.14% to 14%, were greater than 99% of the theoretical values, and the coefficients of variation were less than 3% when the enrichment was higher than 0.5%. Using dithiothreitol (DTT) as a reducing agent before deproteinization of the sample and during derivatization successfully increased the cysteine peak area and significantly improved the reproducibility of the analysis. The cysteine residues in protein from the skin biopsy were also protected during protein hydrolysis with DTT in 6 N HCl. The method was applied to measure transsulphuration of methionine in young sheep. The amount of cysteine derived from transsulphuration accounted for 17% to 21% of the irreversible loss rate of cysteine, depending on the substrate supplies. The results are consistent with other reports. Compared with conventional methods of measuring transsulphuration using radioactive isotopes, the animal experimentation and sample analysis were simple, and there were no radiation hazards. The method should prove useful in studies of methionine and cysteine metabolism in humans and animals.

20.
Open-channel flow simulations require values of friction parameters that are determined by formulating their inverse problems using a sample of historic events. However, there is a risk that the parameter values will be biased toward the events used because of the limited size of the historic sample. Traditional sample-size determination approaches are known to suffer from a conflict between reliability and costs, and there is no objective approach for determining a minimum size. These problems are solved by formulating the new “confidence calculation method,” which builds on the following: (1) the data points of a “parent sample” of friction parameters are resampled into “subsample populations,” whose “subsample means” contain latent information governed by the central limit theorem; (2) a rigorously derived formula makes it possible to calculate the standard deviations of the subsample populations, replacing direct resampling operations with mathematical ones; and (3) the method is parameterized for quantitatively trading off sample size against reliability, minimizing the risk of the dependence of parameter values on sample size. This paper illustrates the application of the method using synthetically generated samples of open-channel friction parameters.
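A hypothetical Python sketch of the core observation behind point (2): the spread of subsample means obtained by resampling a parent sample of friction parameters matches the closed-form value sigma/sqrt(m) given by the central limit theorem, so the resampling step can be replaced by a formula when trading subsample size against reliability (the parent sample and parameter values below are invented):

```python
import numpy as np

rng = np.random.default_rng(10)

# Hypothetical "parent sample" of calibrated Manning friction coefficients.
parent = rng.normal(0.030, 0.004, 60)

m = 12                                         # candidate subsample size
means = [rng.choice(parent, m, replace=True).mean() for _ in range(20000)]

# Central limit theorem: the spread of subsample means shrinks as 1/sqrt(m),
# so it can be computed directly instead of by resampling.
print("resampled SD of subsample means:", np.std(means))
print("closed-form sigma/sqrt(m)      :", parent.std(ddof=0) / np.sqrt(m))
```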
