72 results in total, search time 15 ms
1.
When the substrate of a 2^(p-q) factorial or fractional factorial experiment may be expected to show trends representable by linear and quadratic terms in time, certain orderings spaced at equal time intervals permit better estimation of the effects than others. Some of these ordered plans are given for p - q = 2, 3, 4, 5. Simple methods are given for computing effects, trends, and efficiencies.
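The effect computations the abstract refers to are the standard factorial contrasts. A minimal sketch (data and names are hypothetical, and the trend-robust run ordering that is the paper's actual contribution is deliberately omitted):

```python
# Estimating main effects in a 2^2 factorial experiment via the usual
# contrast formula: effect_j = mean(y at level +1) - mean(y at level -1).

def factorial_effects(y):
    """y maps factor-level tuples in {-1, +1}^k to observed responses."""
    k = len(next(iter(y)))
    effects = {}
    for j in range(k):
        plus = [v for lv, v in y.items() if lv[j] == +1]
        minus = [v for lv, v in y.items() if lv[j] == -1]
        effects[j] = sum(plus) / len(plus) - sum(minus) / len(minus)
    return effects

# Responses constructed so factor 0 has effect 10 and factor 1 has effect 2
obs = {(-1, -1): 0.0, (+1, -1): 10.0, (-1, +1): 2.0, (+1, +1): 12.0}
print(factorial_effects(obs))  # {0: 10.0, 1: 2.0}
```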
2.
I extend the concept of partial least squares (PLS) into the framework of generalized linear models. A spectroscopy example in a logistic regression framework illustrates the developments. These models form a sequence of rank 1 approximations useful for predicting the response variable when the explanatory information is severely ill-conditioned. Iteratively reweighted PLS algorithms are presented with various theoretical properties. Connections to principal-component and maximum likelihood estimation are made, as well as suggestions for rules to choose the proper rank of the final model.
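A single rank-1 PLS step can be sketched as follows, assuming the NIPALS-style construction where the weight vector is proportional to X^T y. This is only an illustration of the building block; the paper embeds such steps inside an iteratively reweighted (IRLS) loop for generalized linear models, which is not shown here:

```python
# One rank-1 PLS component: weight direction w proportional to X^T y,
# latent score t = X w, then a univariate regression of y on t.

def pls_rank1(X, y):
    n, p = len(X), len(X[0])
    # Weight direction w proportional to X^T y, normalized to unit length.
    w = [sum(X[i][j] * y[i] for i in range(n)) for j in range(p)]
    norm = sum(wj * wj for wj in w) ** 0.5
    w = [wj / norm for wj in w]
    # Latent scores t = X w.
    t = [sum(X[i][j] * w[j] for j in range(p)) for i in range(n)]
    # Regress y on t to obtain the rank-1 fitted values.
    b = sum(ti * yi for ti, yi in zip(t, y)) / sum(ti * ti for ti in t)
    return w, [b * ti for ti in t]

X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
y = [1.0, 0.0, 1.0]
w, fit = pls_rank1(X, y)
```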
5.
Network anomaly detection is an indispensable part of intrusion detection systems. However, current intrusion detection systems generally suffer from low detection rates and excessive false-alarm rates, which makes large-scale deployment in real enterprises difficult. To address the poor performance of earlier detection techniques, an intrusion detection method based on SVM regression and an improved D-S (Dempster-Shafer) evidence theory is proposed. The method applies classification fusion based on support vector machine regression to the analysis of anomalous network behavior; the SVM parameters are selected using cross-validation combined with a depth-first search algorithm, and a network anomaly detection model is built by fusing evidence theory. Simulation experiments show that the model effectively improves intrusion detection performance and shortens detection time.
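The evidence-fusion step rests on Dempster's rule of combination. A minimal sketch of that rule, with two hypothetical detector outputs over the frame {attack, normal} (the abstract's improved variant and the SVM components are not reproduced here):

```python
# Dempster's rule of combination: multiply masses over all pairs of
# focal sets, keep the mass of non-empty intersections, and renormalize
# by 1 - K, where K is the mass assigned to conflicting (empty) pairs.

def dempster_combine(m1, m2):
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass lost to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Two detectors' belief masses over the hypotheses {attack, normal}
m1 = {frozenset({"attack"}): 0.7, frozenset({"attack", "normal"}): 0.3}
m2 = {frozenset({"attack"}): 0.6, frozenset({"normal"}): 0.4}
fused = dempster_combine(m1, m2)
```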
6.
The k-nearest-neighbour (kNN) algorithm is widely applied for the estimation of forest attributes using remote sensing data. It requires a large amount of reference data to achieve satisfactory results. Usually, the number of available reference plots for the kNN-prediction is limited by the size of the area covered by a terrestrial reference inventory and remotely sensed imagery collected from one overflight. The applicability of kNN could be enhanced if adjacent images of different acquisition dates could be used in the same estimation procedure. Relative radiometric calibration is a prerequisite for this. This study focuses on two empirical calibration methods. They are tested on adjacent LANDSAT TM scenes in Austria. The first, quite conventional one is based on radiometric control points in the overlap area of two images and on the determination of transformation parameters by linear regression. The other, recently developed method exploits the kNN-cross-validation procedure. Performance and applicability of both methods as well as the impact of phenology are discussed.
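The first, regression-based calibration method can be sketched as an ordinary least-squares fit of a gain and offset over radiometric control points shared by the two overlapping scenes. The digital numbers below are invented for illustration:

```python
# Relative radiometric calibration by linear regression: fit
# dn_ref = gain * dn_adj + offset over control points in the overlap
# area, then map the adjacent scene onto the reference scene's scale.

def linear_calibration(dn_ref, dn_adj):
    n = len(dn_ref)
    mx = sum(dn_adj) / n
    my = sum(dn_ref) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(dn_adj, dn_ref))
    sxx = sum((x - mx) ** 2 for x in dn_adj)
    gain = sxy / sxx
    offset = my - gain * mx
    return gain, offset

# Control-point digital numbers from the overlap area of two scenes
ref = [40.0, 60.0, 80.0, 100.0]
adj = [35.0, 55.0, 75.0, 95.0]   # adjacent scene, systematically offset
gain, offset = linear_calibration(ref, adj)
calibrated = [gain * x + offset for x in adj]
```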
7.
Technical Note: Selecting a Classification Method by Cross-Validation
Schaffer, Cullen. Machine Learning, 1993, 13(1): 135-143
If we lack relevant problem-specific knowledge, cross-validation methods may be used to select a classification method empirically. We examine this idea here to show in what senses cross-validation does and does not solve the selection problem. As illustrated empirically, cross-validation may lead to higher average performance than application of any single classification strategy, and it also cuts the risk of poor performance. On the other hand, cross-validation is no more or less a form of bias than simpler strategies, and applying it appropriately ultimately depends in the same way on prior knowledge. In fact, cross-validation may be seen as a way of applying partial information about the applicability of alternative classification strategies.
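The selection procedure the abstract examines can be sketched in a few lines: estimate each candidate's error by k-fold cross-validation and pick the candidate with the lowest estimate. The two toy "classifiers" here are hypothetical stand-ins, not methods from the paper:

```python
# Classifier selection by k-fold cross-validation: train each candidate
# on k-1 folds, count errors on the held-out fold, and choose the
# candidate with the smallest estimated error rate.

def cross_val_error(fit, data, k=3):
    folds = [data[i::k] for i in range(k)]
    errors = 0
    for i in range(k):
        train = [d for j, f in enumerate(folds) if j != i for d in f]
        predict = fit(train)
        errors += sum(predict(x) != y for x, y in folds[i])
    return errors / len(data)

def majority_fit(train):
    # Always predict the most frequent training label.
    labels = [y for _, y in train]
    major = max(set(labels), key=labels.count)
    return lambda x: major

def threshold_fit(train):
    # Fixed decision threshold at 0.5 (ignores the training data).
    return lambda x: int(x >= 0.5)

data = [(0.1, 0), (0.2, 0), (0.4, 0), (0.6, 1), (0.8, 1), (0.9, 1)]
candidates = {"majority": majority_fit, "threshold": threshold_fit}
best = min(candidates, key=lambda n: cross_val_error(candidates[n], data))
```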
8.
The performance improvements that can be achieved by classifier selection and by integrating terrain attributes into land cover classification are investigated in the context of rock glacier detection. While exposed glacier ice can easily be mapped from multispectral remote-sensing data, the detection of rock glaciers and debris-covered glaciers is a challenge for multispectral remote sensing. Motivated by the successful use of digital terrain analysis in rock glacier distribution models, the predictive performance of a combination of terrain attributes derived from SRTM (Shuttle Radar Topography Mission) digital elevation models and Landsat ETM+ data for detecting rock glaciers in the San Juan Mountains, Colorado, USA, is assessed. Eleven statistical and machine-learning techniques are compared in a benchmarking exercise, including logistic regression, generalized additive models (GAM), linear discriminant techniques, the support vector machine, and bootstrap-aggregated tree-based classifiers such as random forests. Penalized linear discriminant analysis (PLDA) yields mapping results that are significantly better than all other classifiers, achieving a median false-positive rate (mFPR, estimated by cross-validation) of 8.2% at a sensitivity of 70%, i.e. when 70% of all true rock glacier points are detected. The GAM and standard linear discriminant analysis were second best (mFPR: 8.8%), followed by polyclass. For comparison, the predictive performance of the best three techniques is also evaluated using (1) only terrain attributes as predictors (mFPR: 13.1-14.5% for the best three techniques) and (2) only Landsat ETM+ data (mFPR: 19.4-22.7%), yielding significantly higher mFPR estimates at 70% sensitivity. The mFPR of the worst three classifiers was about one-quarter higher than that of the best three, and the combination of terrain attributes and multispectral data reduced the mFPR by more than one-half compared to remote sensing alone.
These results highlight the importance of combining remote-sensing and terrain data for mapping rock glaciers and other debris-covered ice, and of choosing the optimal classifier based on unbiased error estimators. The proposed benchmarking methodology is suitable more generally for comparing the utility of remote-sensing algorithms and sensors.
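The evaluation criterion used above, the false-positive rate at a fixed sensitivity, can be sketched as follows. The scores are invented; the study's own estimates additionally come from cross-validation, which this fragment does not reproduce:

```python
# False-positive rate at a given sensitivity: choose the score threshold
# that detects the required share of true positives (score >= threshold
# counts as "detected"), then count negatives exceeding that threshold.

def fpr_at_sensitivity(scores_pos, scores_neg, sensitivity):
    ranked = sorted(scores_pos, reverse=True)
    k = int(round(sensitivity * len(ranked)))
    threshold = ranked[k - 1]
    false_pos = sum(s >= threshold for s in scores_neg)
    return false_pos / len(scores_neg)

pos = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1, 0.05]
neg = [0.65, 0.45, 0.35, 0.25, 0.15, 0.1, 0.08, 0.05, 0.02, 0.01]
rate = fpr_at_sensitivity(pos, neg, 0.7)  # FPR when 70% of positives are caught
```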
9.
Cross-validation (CV) is a very popular technique for model selection and model validation. The general procedure of leave-one-out CV (LOO-CV) is to exclude one observation from the data set, to construct the fit of the remaining observations and to evaluate that fit on the item that was left out. In classical procedures such as least-squares regression or kernel density estimation, easy formulas can be derived to compute this CV fit or the residuals of the removed observations. However, when high-breakdown resampling algorithms are used, it is no longer possible to derive such closed-form expressions. High-breakdown methods are developed to obtain estimates that can withstand the effects of outlying observations. Fast algorithms are presented for LOO-CV when using a high-breakdown method based on resampling, in the context of robust covariance estimation by means of the MCD estimator and robust principal component analysis. A robust PRESS curve is introduced as an exploratory tool to select the number of principal components. Simulation results and applications on real data show the accuracy and the gain in computation time of these fast CV algorithms.
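The naive LOO-CV baseline the abstract starts from can be sketched for simple least-squares regression: refit n times, each time leaving one observation out, and accumulate the squared prediction errors (the PRESS statistic). For least squares a closed-form shortcut exists; high-breakdown estimators lack one, which is what motivates the paper's fast algorithms. This sketch shows only the naive loop:

```python
# Naive leave-one-out PRESS for simple linear regression: for each i,
# refit on the data without observation i and square the error of the
# refitted line's prediction at x_i.

def ols(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return intercept, slope

def loo_press(xs, ys):
    press = 0.0
    for i in range(len(xs)):
        a, b = ols(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        press += (ys[i] - (a + b * xs[i])) ** 2
    return press

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 0.9, 2.2, 2.8, 4.1]
press = loo_press(xs, ys)
```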
Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司) | ICP license: 京ICP备09084417号