Improved SMO algorithm with different error costs
TIAN Da-dong, DENG Wei. Improved SMO algorithm with different error costs[J]. Journal of Computer Applications, 2008, 28(9): 2369-2370.
Authors: TIAN Da-dong  DENG Wei
Affiliation: TIAN Da-dong, DENG Wei (School of Computer Science and Technology, Soochow University, Suzhou, Jiangsu 215006, China)
Abstract: When Keerthi's modification of the Sequential Minimal Optimization (SMO) algorithm is applied to the classification of unbalanced datasets, it not only yields poor overall classification performance but also produces unstable results. To overcome this difficulty, the algorithm was further improved by imposing different error costs (penalty coefficients) on the two classes, and the corresponding formulas and algorithm steps were given. Experimental results show that the improved algorithm not only strengthens the ability to handle unbalanced datasets but also further improves its stability.
Keywords: unbalanced datasets  error costs  Sequential Minimal Optimization (SMO)
Received: 2008-03-11
Revised: 2008-04-24
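The core idea described in the abstract, giving the minority class a larger penalty coefficient (error cost) than the majority class, can be illustrated with a minimal sketch. This is not the authors' SMO implementation; it uses scikit-learn's SVC, whose class_weight parameter scales the penalty C per class and therefore has the same effect as using different error costs C+ and C- for the two classes. The dataset and the weight values below are hypothetical.

# Minimal sketch of per-class error costs in an SVM (not the paper's own SMO code).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import classification_report

# Hypothetical unbalanced dataset: roughly 90% negative (class 0), 10% positive (class 1).
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.9, 0.1], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Baseline: one penalty C shared by both classes.
# Weighted: the minority class gets an effective penalty of C * 9, i.e. C+ > C-.
for weights in (None, {0: 1.0, 1: 9.0}):
    clf = SVC(kernel="rbf", C=1.0, class_weight=weights)  # class_weight scales C per class
    clf.fit(X_train, y_train)
    print("class_weight =", weights)
    print(classification_report(y_test, clf.predict(X_test), digits=3))

A common heuristic is to set the ratio of the two penalty coefficients roughly proportional to the inverse class frequencies, which is what scikit-learn's class_weight='balanced' option does; the coefficients used in the paper's experiments may differ.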
This article is indexed in CNKI, VIP (Weipu), Wanfang Data, and other databases.