A Self-Adaption Ensemble Algorithm Based on Random Subspace and AdaBoost
Cite this article: YAO Xu, WANG Xiao-dan, ZHANG Yu-xi, XING Ya-qiong. A Self-Adaption Ensemble Algorithm Based on Random Subspace and AdaBoost[J]. Acta Electronica Sinica, 2013, 41(4): 810-814.
Authors: YAO Xu  WANG Xiao-dan  ZHANG Yu-xi  XING Ya-qiong
Affiliation: School of Air and Missile Defense, Air Force Engineering University, Xi'an, Shaanxi 710051, China
Abstract: Constructing base classifiers that are both highly diverse and highly accurate is a central issue in ensemble learning. To this end, a new ensemble learning method is proposed: particle swarm optimization (PSO) is used to search for the optimal feature weight distribution that minimizes the classification error rate of AdaBoost on the data sets drawn according to its sample weights; features are then randomly sampled according to this optimal weight distribution to generate random subspaces, which are used in the training process of AdaBoost. This increases the diversity among classifiers while preserving the accuracy of the base classifiers. Finally, the decisions of the base classifiers are fused by majority voting, and simulation experiments verify the effectiveness of the method.
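For reference, the sample-weight distribution mentioned above is the one maintained by standard AdaBoost (textbook notation, not necessarily the paper's own): given a base classifier h_t with weighted error epsilon_t, AdaBoost sets

  \epsilon_t = \sum_i w_t(i)\,\mathbb{1}[h_t(x_i) \neq y_i], \qquad
  \alpha_t = \tfrac{1}{2}\ln\frac{1-\epsilon_t}{\epsilon_t}, \qquad
  w_{t+1}(i) = \frac{w_t(i)\,e^{-\alpha_t y_i h_t(x_i)}}{Z_t},

where Z_t normalizes w_{t+1} to a distribution. It is this distribution w_t that drives the resampling of the training data whose classification error rate the PSO search minimizes.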

Keywords: ensemble learning  random subspace  AdaBoost algorithm  particle swarm optimization
Received: 2012-05-28

A Self-Adaption Ensemble Algorithm Based on Random Subspace and AdaBoost
YAO Xu, WANG Xiao-dan, ZHANG Yu-xi, XING Ya-qiong. A Self-Adaption Ensemble Algorithm Based on Random Subspace and AdaBoost[J]. Acta Electronica Sinica, 2013, 41(4): 810-814.
Authors: YAO Xu  WANG Xiao-dan  ZHANG Yu-xi  XING Ya-qiong
Affiliation: School of Air and Missile Defense, Air Force Engineering University, Xi'an, Shaanxi 710051, China
Abstract: How to generate base classifiers with high diversity and high accuracy is an open issue in ensemble learning. In this paper, a novel algorithm is proposed to address this problem: particle swarm optimization is used to search for the optimal feature weight distribution that minimizes the classification error rate of AdaBoost on the training data sampled according to its sample-weight distribution. Feature subspaces are then constructed according to this optimal feature weight distribution and applied in the training process of AdaBoost. In this way the accuracy of the base classifiers is maintained while the diversity between classifiers is improved. Finally, majority voting is used to fuse the base classifiers' results, and experiments have been carried out to confirm the validity of the proposed algorithm.
Keywords: ensemble learning  random subspace  AdaBoost algorithm  particle swarm optimization
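The following is a minimal, runnable Python sketch of the procedure as described in the abstract, under assumptions the abstract does not fix: binary labels in {-1, +1}, decision stumps as base learners, a plain hand-rolled PSO, and simple majority voting at the end. All names (adaboost_on_subspaces, pso_search) and parameter values are hypothetical illustrations, not taken from the paper.

# Sketch of PSO-driven random-subspace AdaBoost (illustrative only).
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.datasets import make_classification

def adaboost_on_subspaces(feat_weights, X, y, T=11, k=5, seed=0):
    """AdaBoost in which each round draws k features according to feat_weights."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.full(n, 1.0 / n)                    # AdaBoost sample weights
    p = np.asarray(feat_weights, dtype=float)
    p = p / p.sum()                            # feature sampling distribution
    clfs, feats = [], []
    for _ in range(T):
        idx = rng.choice(d, size=k, replace=False, p=p)   # random feature subspace
        clf = DecisionTreeClassifier(max_depth=1, random_state=0)
        clf.fit(X[:, idx], y, sample_weight=w)
        pred = clf.predict(X[:, idx])
        err = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)         # increase weight of misclassified samples
        w /= w.sum()
        clfs.append(clf)
        feats.append(idx)
    # fuse the base classifiers by simple majority voting, as in the abstract
    votes = sum(c.predict(X[:, f]) for c, f in zip(clfs, feats))
    train_err = float(np.mean(np.sign(votes) != y))
    return clfs, feats, train_err

def pso_search(X, y, n_particles=10, iters=20, seed=0):
    """Plain PSO over feature-weight vectors; fitness = AdaBoost training error."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    fitness = lambda q: adaboost_on_subspaces(q, X, y, seed=seed)[2]
    pos = rng.random((n_particles, d)) + 1e-6
    vel = np.zeros((n_particles, d))
    pbest = pos.copy()
    pbest_val = np.array([fitness(q) for q in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, d))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 1e-6, None)   # keep every feature weight positive
        vals = np.array([fitness(q) for q in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest

if __name__ == "__main__":
    X, y = make_classification(n_samples=200, n_features=20, random_state=1)
    y = 2 * y - 1                              # map labels from {0, 1} to {-1, +1}
    best_w = pso_search(X, y)
    _, _, err = adaboost_on_subspaces(best_w, X, y)
    print("training error with PSO-optimised feature weights:", err)

In this sketch the PSO fitness is the ensemble's training error, following the abstract's wording; a practical implementation would likely evaluate candidate weight vectors on held-out data to reduce overfitting.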