Fast Ensemble Pruning Algorithm Based on FP-Tree
Cite this article: ZHAO Qiang-Li, JIANG Yan-Huang, XU Ming. Fast Ensemble Pruning Algorithm Based on FP-Tree[J]. Journal of Software, 2011, 22(4): 709-721. DOI: 10.3724/SP.J.1001.2011.03752
Authors: ZHAO Qiang-Li, JIANG Yan-Huang, XU Ming
Affiliation: School of Computer, National University of Defense Technology, Changsha, Hunan 410073, China
Funding: National Natural Science Foundation of China (60773017, 60905032)
Abstract: Ensemble pruning selects a subset of the base classifiers to participate in the ensemble, improving the generalization ability of the ensemble classifier while reducing prediction overhead. However, existing ensemble pruning algorithms are generally time-consuming. Applying data-mining techniques to ensemble pruning, this paper proposes a fast ensemble pruning algorithm based on the FP-Tree (frequent pattern tree): CPM-EP (coverage based pattern mining for ensemble pruning). The algorithm organizes the predictions of the base classifiers on a validation set into a transaction database, so that the ensemble pruning problem can be transformed into a transaction-database processing problem. For each possible ensemble size, CPM-EP first derives a refined transaction database and builds an FP-Tree to store its contents; it then obtains an ensemble of that size from the FP-Tree. Among all the ensembles obtained, the one with the highest predictive accuracy on the validation set is the algorithm's output. Experimental results show that CPM-EP achieves excellent generalization at very low computational cost: its classifier-selection time is about 1/19 that of GASEN and 1/8 that of Forward Selection, its generalization ability is significantly better than that of the other compared methods, and the resulting ensembles contain fewer base classifiers.

Keywords: ensemble learning; ensemble pruning; frequent pattern tree; Bagging; back-propagation neural network
Received: 2009-02-14
Revised: 2009-10-19

Fast Ensemble Pruning Algorithm Based on FP-Tree
ZHAO Qiang-Li, JIANG Yan-Huang, XU Ming. Fast Ensemble Pruning Algorithm Based on FP-Tree[J]. Journal of Software, 2011, 22(4): 709-721. DOI: 10.3724/SP.J.1001.2011.03752
Authors: ZHAO Qiang-Li, JIANG Yan-Huang, XU Ming
Affiliation: School of Computer, National University of Defense Technology, Changsha 410073, China
Abstract: By selecting a subset of the base classifiers to combine, ensemble pruning aims to achieve better generalization and lower prediction cost than the ensemble of all base classifiers. However, most ensemble pruning algorithms in the literature consume considerable time on classifier selection. This paper presents a fast ensemble pruning approach: CPM-EP (coverage based pattern mining for ensemble pruning). The algorithm converts the ensemble pruning task into a transaction-database processing problem, where the predictions of all base classifiers on the validation set are organized as a transaction database. For each possible ensemble size k, CPM-EP obtains a refined transaction database and builds an FP-Tree to compact it; it then selects an ensemble of size k from the FP-Tree. Among the ensembles obtained for all sizes, the one with the best predictive accuracy on the validation set is output. Experimental results show that CPM-EP reduces computational overhead considerably: its selection time is about 1/19 that of GASEN and 1/8 that of Forward Selection. Additionally, the approach achieves the best generalization among the compared methods, and the pruned ensembles are small.
Keywords: ensemble learning; ensemble pruning; frequent pattern tree (FP-Tree); bootstrap aggregation (Bagging); back-propagation neural network (BPNN)
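The transaction-database view described in the abstract can be sketched as follows. This is not the authors' CPM-EP implementation: the FP-Tree mining step is replaced here by a brute-force search over subsets, and all names (`build_transactions`, `prune_by_exhaustive_search`, the toy data) are illustrative assumptions. The sketch only shows how validation-set predictions become transactions and how a best subset per size is chosen.

```python
from itertools import combinations

# Sketch (assumption, not the paper's exact algorithm): each validation
# example becomes a transaction listing the base classifiers that
# predict it correctly; pruning then searches for a small subset whose
# majority vote is most accurate on the validation set.

def build_transactions(predictions, labels):
    """predictions[i][j]: prediction of classifier j on example i."""
    return [
        frozenset(j for j, p in enumerate(row) if p == y)
        for row, y in zip(predictions, labels)
    ]

def majority_vote_accuracy(subset, transactions):
    """Fraction of examples where a strict majority of the chosen
    classifiers is correct (ties count as errors)."""
    k = len(subset)
    correct = sum(1 for t in transactions if len(subset & t) * 2 > k)
    return correct / len(transactions)

def prune_by_exhaustive_search(n_classifiers, transactions, max_size):
    """Brute-force stand-in for the FP-Tree mining step: for each size
    k, try every subset and keep the most accurate one overall."""
    best, best_acc = None, -1.0
    for k in range(1, max_size + 1):
        for subset in combinations(range(n_classifiers), k):
            acc = majority_vote_accuracy(frozenset(subset), transactions)
            if acc > best_acc:
                best, best_acc = frozenset(subset), acc
    return best, best_acc

# Toy example: 4 base classifiers, 5 validation examples, binary labels.
preds = [
    [1, 1, 0, 1],
    [0, 0, 0, 1],
    [1, 0, 1, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 0],
]
labels = [1, 0, 1, 0, 1]
txns = build_transactions(preds, labels)
subset, acc = prune_by_exhaustive_search(4, txns, max_size=3)
print(sorted(subset), acc)
```

The exhaustive search is exponential in the number of classifiers; the point of the paper's FP-Tree construction is precisely to avoid this by compacting the transaction database and mining it efficiently for each ensemble size.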
This article is indexed by CNKI, Wanfang Data, and other databases.