Feature Weighted Support Vector Machine
Cite this article: Wang Ting-hua, Tian Sheng-feng, Huang Hou-kuan. Feature Weighted Support Vector Machine[J]. Journal of Electronics & Information Technology, 2009, 31(3): 514-518.
Authors: Wang Ting-hua  Tian Sheng-feng  Huang Hou-kuan
Affiliation: School of Computer and Information Technology, Beijing Jiaotong University, Beijing 100044, China
Funding: National Basic Research Program of China (973 Program); Science and Technology Foundation of Beijing Jiaotong University
Abstract: Existing weighted support vector machines (WSVM) and fuzzy support vector machines (FSVM) account for the importance of individual samples but ignore the influence of feature importance on the classification result. To address this shortcoming, this paper proposes an SVM approach based on feature weighting, the Feature Weighted SVM (FWSVM). The method first uses information gain to estimate the importance of each feature for the classification task, and then weights the inner products and Euclidean distances in the kernel function with these feature weights, so that the kernel computation is not dominated by weakly relevant or irrelevant features. Both theoretical analysis and numerical experiments show that the method is more robust and achieves better classification performance than the conventional SVM.

Keywords: Support vector machine  Feature weighting  Information gain  Machine learning
Received: 2007-10-31
Revised: 2008-04-07

Feature Weighted Support Vector Machine
Wang Ting-hua, Tian Sheng-feng, Huang Hou-kuan. Feature Weighted Support Vector Machine[J]. Journal of Electronics & Information Technology, 2009, 31(3): 514-518.
Authors:Wang Ting-hua  Tian Sheng-feng  Huang Hou-kuan
Affiliation:School of Computer and Information Technology, Beijing Jiaotong University, Beijing 100044, China
Abstract: Support vector machines have been applied in many research fields, such as pattern recognition and function estimation. Weighted SVM and Fuzzy SVM share a shortcoming: they take the importance of samples into account but neglect the relative importance of each feature with respect to the classification task. In this paper, an SVM approach based on feature weighting, the Feature Weighted SVM (FWSVM), is proposed. The method first estimates the relative importance (weight) of each feature by computing its information gain, and then uses these weights when computing the inner products and Euclidean distances in the kernel function. In this way, the kernel computation avoids being dominated by weakly relevant or irrelevant features. Theoretical analysis and experimental results show that the FWSVM is more robust and generalizes better than the traditional SVM.
Keywords:Support Vector Machine (SVM)  Feature weighting  Information gain  Machine learning
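
The sketch below illustrates the idea described in the abstract: weight each feature by an estimate of its relevance and fold the weights into the kernel's Euclidean distance. It is an assumption-laden reconstruction, not the authors' implementation: mutual_info_classif stands in for the information-gain computation, and the dataset, gamma value, and weight normalization are choices made here for demonstration only. For an inner-product kernel (linear or polynomial), the analogous change would be to use the weighted product x^T diag(w) z.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Toy data for illustration; the paper's experiments use different benchmark sets.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Per-feature relevance weights; mutual_info_classif is used here as a
# stand-in for the information gain described in the abstract.
w = mutual_info_classif(X_train, y_train, random_state=0)
w = w / w.sum()  # normalize so the weights sum to 1

def weighted_rbf(A, B, weights=w, gamma=0.1):
    # RBF kernel on the weighted squared Euclidean distance
    # d_w(x, z)^2 = sum_i weights[i] * (x_i - z_i)^2
    Aw = A * np.sqrt(weights)
    Bw = B * np.sqrt(weights)
    d2 = (np.square(Aw).sum(axis=1)[:, None]
          + np.square(Bw).sum(axis=1)[None, :]
          - 2.0 * Aw @ Bw.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

clf = SVC(kernel=weighted_rbf, C=1.0).fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))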