Non-sparse multiple kernel learning method based on Boosting framework
Cite this article: Hu Qinghui, Li Zhiyuan. Non-sparse multiple kernel learning method based on Boosting framework[J]. Application Research of Computers, 2016, 33(11).
Authors: Hu Qinghui, Li Zhiyuan
Affiliation: Department of Information Engineering, Guilin University of Aerospace Technology, Guilin, Guangxi, China
Funding: National Natural Science Foundation of China (11301106); Guangxi Natural Science Foundation (2014GXNSFAA1183105); Guangxi University Scientific Research Projects (ZD2014147, YB2014431)
Abstract: In traditional classifier ensembles, each iteration usually integrates only the single best individual classifier into the strong classifier, while other individual classifiers that might play an auxiliary role are simply discarded. To address this problem, this paper proposed MKL-Boost, a non-sparse multiple kernel learning method based on the Boosting framework that draws on the idea of classifier ensemble learning. At each iteration, it first selects a training subset from the training set, and then trains an optimal individual classifier on that subset with a regularized non-sparse multiple kernel learning method; the resulting classifier uses the optimal non-sparse linear convex combination of M base kernels. By imposing an Lp-norm constraint on the kernel combination coefficients, good kernels are retained, so more useful feature information is preserved, while poor kernels are removed, ensuring selective kernel fusion; the optimal individual classifier based on this kernel combination is then integrated into the strong classifier. The proposed algorithm thus has both the advantages of Boosting ensemble learning and those of regularized non-sparse multiple kernel learning. Experiments show that, compared with other Boosting algorithms, MKL-Boost achieves higher classification accuracy within fewer iterations.

Keywords: ensemble learning; non-sparse multiple kernel learning; weak classifier; base kernel
Received: 2015-07-25
Revised: 2016-09-21

Non-sparse multiple kernel learning method based on Boosting framework
Hu Qinghui, Li Zhiyuan. Non-sparse multiple kernel learning method based on Boosting framework[J]. Application Research of Computers, 2016, 33(11).
Authors:Hu Qinghui and Li Zhiyuan
Affiliation: Department of Information Engineering, Guilin University of Aerospace Technology, Guilin, Guangxi, China
Abstract: Traditional classifier ensemble methods integrate only a single optimal classifier into the strong classifier at each boosting iteration, while other classifiers that might be useful are simply discarded. This paper proposed a non-sparse multiple kernel learning method based on the boosting framework. At each iteration, it first selects a subset of the training dataset and then trains an optimal individual classifier on it by regularized non-sparse multiple kernel learning, which optimizes the non-sparse combination of M base kernels. Imposing an Lp-norm constraint on the combination coefficients retains the good kernels and discards the bad ones, leading to selective kernel fusion and preserving more useful feature information. Finally, these individual classifiers are combined into the strong classifier. The proposed method has the advantages of both ensemble learning and regularized non-sparse multiple kernel learning. Experiments show that it gains higher classification accuracy with fewer iterations than other boosting methods.
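The per-iteration kernel combination step described above can be sketched as follows. This is a minimal illustration, not the paper's actual optimization: the alignment-based weighting heuristic, the Lp projection, and all names (`rbf_kernel`, `lp_normalize`, `alignment`) are assumptions introduced for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(X, gamma):
    """Gram matrix of a Gaussian (RBF) base kernel on the rows of X."""
    sq = np.sum(X**2, axis=1)
    d = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d)

def lp_normalize(beta, p):
    """Clip to non-negative weights, then scale so that ||beta||_p = 1."""
    beta = np.maximum(beta, 0.0)
    return beta / np.sum(beta**p) ** (1.0 / p)

def alignment(K, y):
    """Kernel-target alignment: a cheap proxy for how well K fits labels y."""
    yy = np.outer(y, y)
    return (K * yy).sum() / (np.linalg.norm(K) * np.linalg.norm(yy))

# Toy data: two Gaussian blobs with labels in {-1, +1}.
X = np.vstack([rng.normal(-1.0, 1.0, (40, 2)), rng.normal(+1.0, 1.0, (40, 2))])
y = np.hstack([-np.ones(40), np.ones(40)])

# M = 3 base kernels at different bandwidths.
gammas = [0.1, 1.0, 10.0]
kernels = [rbf_kernel(X, g) for g in gammas]

# Weight each base kernel by its alignment, then enforce ||beta||_p = 1.
# p > 1 gives a NON-sparse solution: every useful kernel keeps some weight,
# rather than all mass collapsing onto the single best kernel (p = 1).
p = 2.0
beta = lp_normalize(np.array([alignment(K, y) for K in kernels]), p)

# The combined kernel a weak learner would be trained on in this round.
K_comb = sum(b * K for b, K in zip(beta, kernels))
```

In MKL-Boost proper, the coefficients and the classifier would instead be obtained jointly by solving the regularized non-sparse MKL problem on a freshly resampled training subset each round, and the resulting learner would be added to the boosted ensemble with a weight derived from its training error.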
Keywords: ensemble learning; non-sparse multiple kernel learning; weak classifier; base kernel