Trimmed bagging
Authors: Christophe Croux (a), Kristel Joossens (b)
Affiliations: (a) Faculty of Economics and Management & University Center of Statistics, K.U. Leuven, Naamsestraat 69, B3000 Leuven, Belgium
(b) School of Economics, Erasmus University Rotterdam, Burgemeester Oudlaan 50, P.O. Box 1738, 3000 DR Rotterdam, The Netherlands
Abstract: Bagging has been found to be successful in increasing the predictive performance of unstable classifiers. Bagging draws bootstrap samples from the training sample, applies the classifier to each bootstrap sample, and then averages over all obtained classification rules. The idea of trimmed bagging is to exclude the bootstrapped classification rules that yield the highest error rates, as estimated by the out-of-bag error rate, and to aggregate over the remaining ones. In this note we explore the potential benefits of trimmed bagging. On the basis of numerical experiments, we conclude that trimmed bagging performs comparably to standard bagging when applied to unstable classifiers such as decision trees, but yields better results when applied to more stable base classifiers, such as support vector machines.
Keywords: Aggregation, Bagging, Decision trees, Error rate, Support vector machine, Trimmed means
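The trimmed-bagging procedure described in the abstract can be sketched in a few lines: draw bootstrap samples, fit the base classifier to each, estimate each fitted rule's out-of-bag error on the training points it did not see, discard the worst-performing fraction, and aggregate the rest by majority vote. The following is a minimal illustrative sketch, not the authors' code; the toy one-dimensional "stump" base learner and all function names are assumptions made for the example.

```python
import random
from collections import Counter

def trimmed_bagging(X, y, fit, predict, n_boot=25, trim=0.2, seed=0):
    """Trimmed bagging (illustrative sketch).

    fit(X, y) -> model and predict(model, x) -> label are caller-supplied.
    The worst `trim` fraction of bootstrap models, ranked by out-of-bag
    error rate, is excluded before aggregating by majority vote.
    """
    rng = random.Random(seed)
    n = len(X)
    scored = []
    for _ in range(n_boot):
        # Bootstrap sample: n draws with replacement from the training set.
        idx = [rng.randrange(n) for _ in range(n)]
        oob = [i for i in range(n) if i not in set(idx)]
        model = fit([X[i] for i in idx], [y[i] for i in idx])
        # Out-of-bag error rate: misclassification on points not drawn.
        errs = [predict(model, X[i]) != y[i] for i in oob]
        oob_err = sum(errs) / len(errs) if errs else 0.0
        scored.append((oob_err, model))
    # Keep the (1 - trim) fraction of models with the lowest OOB error.
    scored.sort(key=lambda t: t[0])
    kept = [m for _, m in scored[: max(1, round((1 - trim) * n_boot))]]

    def aggregate(x):
        votes = Counter(predict(m, x) for m in kept)
        return votes.most_common(1)[0][0]

    return aggregate

# Toy base learner: a threshold halfway between the two class means.
def fit_stump(X, y):
    m0 = sum(x for x, lab in zip(X, y) if lab == 0) / max(1, y.count(0))
    m1 = sum(x for x, lab in zip(X, y) if lab == 1) / max(1, y.count(1))
    return (m0 + m1) / 2.0

def predict_stump(thr, x):
    return 1 if x > thr else 0

X = [0.0, 1.0, 2.0, 3.0, 10.0, 11.0, 12.0, 13.0]
y = [0, 0, 0, 0, 1, 1, 1, 1]
clf = trimmed_bagging(X, y, fit_stump, predict_stump)
```

Setting `trim=0.0` recovers ordinary bagging, so the trimming fraction can be treated as a tuning parameter when comparing the two procedures.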
This article is indexed in ScienceDirect and other databases.