Accurate ensemble pruning with PL-bagging
Affiliation: 1. Faculty of Science and Engineering, Åbo Akademi University, Turku, Finland; 2. Faculty of Mathematics and Computer Science, FernUniversität in Hagen, Hagen, Germany; 1. Multimedia, InfoRmation systems and Advanced Computing Laboratory, Sfax University, Tunis road Km 10, 3021 Sfax, Tunisia; 2. Saudi Electronic University, Abu Bakr Street, PO Box 93499, 11673 Riyadh, Kingdom of Saudi Arabia
Abstract: Ensemble pruning deals with the selection of base learners prior to combination in order to improve prediction accuracy and efficiency. The ensemble literature has pointed out that, for an ensemble classifier to achieve high prediction accuracy, it is critical that the ensemble consist of accurate classifiers that are, at the same time, as diverse as possible. In this paper, a novel ensemble pruning method, called PL-bagging, is proposed. To attain a balance between the diversity and the accuracy of base learners, PL-bagging employs the positive Lasso to assign weights to base learners in the combination step. Simulation studies and a theoretical investigation show that PL-bagging filters out redundant base learners while assigning higher weights to more accurate ones. This improved weighting scheme leads to higher classification accuracy, and the improvement becomes more pronounced as the ensemble size increases. The performance of PL-bagging was compared with state-of-the-art ensemble pruning methods for aggregating bootstrapped base learners on 22 real and 4 synthetic datasets. The results indicate that PL-bagging significantly outperforms state-of-the-art ensemble pruning methods such as Boosting-based pruning and Trimmed bagging.
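The sketch below illustrates the general idea described in the abstract, not the authors' actual algorithm or code: bootstrap base learners are combined using nonnegative ("positive") Lasso weights, so that redundant learners receive a weight of exactly zero and are thereby pruned. The tree depth, the penalty level `alpha`, the 0.5 decision threshold, and all variable names are illustrative assumptions; scikit-learn's `Lasso(positive=True)` is used as a stand-in for the positive Lasso solver.

```python
# Minimal sketch of positive-Lasso weighting of a bagged ensemble (assumptions
# noted in the lead-in; this is not the PL-bagging reference implementation).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import Lasso
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

rng = np.random.default_rng(0)
n_learners = 50

# 1) Grow a bagged ensemble of shallow trees on bootstrap samples.
trees = []
for _ in range(n_learners):
    idx = rng.integers(0, len(X_tr), len(X_tr))
    trees.append(DecisionTreeClassifier(max_depth=3).fit(X_tr[idx], y_tr[idx]))

# 2) Stack each learner's class-1 probability and fit a Lasso constrained to
#    nonnegative coefficients; the L1 penalty drives the weights of redundant
#    base learners to exactly zero, which prunes them from the ensemble.
P_tr = np.column_stack([t.predict_proba(X_tr)[:, 1] for t in trees])
pl = Lasso(alpha=0.01, positive=True, max_iter=10000).fit(P_tr, y_tr)
weights = pl.coef_
kept = np.flatnonzero(weights)  # indices of retained base learners
print(f"kept {kept.size} of {n_learners} base learners")

# 3) Predict with the weighted, pruned ensemble.
P_te = np.column_stack([t.predict_proba(X_te)[:, 1] for t in trees])
scores = P_te[:, kept] @ weights[kept] + pl.intercept_
y_pred = (scores >= 0.5).astype(int)
print("test accuracy:", (y_pred == y_te).mean())
```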
Keywords: Ensemble; Classification; Lasso; Bagging; Boosting; Random forest
|