Support Vector Machine Model Based on Grey Wolf Optimization Fused Asymptotic
Citation: WU Yu-kun, XIAO Jie, Wei William LEE, LOU Ji-lin. Support Vector Machine Model Based on Grey Wolf Optimization Fused Asymptotic[J]. Computer Science, 2020, 47(2): 37-43.
Authors: WU Yu-kun  XIAO Jie  Wei William LEE  LOU Ji-lin
Affiliation: College of Computer Science & Technology, Zhejiang University of Technology, Hangzhou 310023, China
Funding: Zhejiang Provincial Natural Science Foundation; National Natural Science Foundation of China
Abstract: The development of big data places higher demands on the accuracy of data classification, and the wide application of the support vector machine (SVM) calls for an efficient method of constructing an SVM classifier with strong classification ability. The kernel parameter, the penalty parameter, and the selected feature subset all have an important impact on the complexity and prediction accuracy of the model. To improve the classification performance of SVM, the asymptotic property of SVM was integrated into the grey wolf optimization (GWO) algorithm, and a new SVM classifier model was proposed. The model optimizes the SVM parameters and the feature subset of the data simultaneously. The new grey wolf individual, which incorporates the asymptotic property of SVM, directs the search of the grey wolf optimization algorithm toward the optimal region of the hyperparameter space, so the optimal solution can be obtained faster. In addition, a new fitness function was proposed that combines the obtained classification accuracy with the number of selected features and the number of support vectors; together with the asymptotic-fused GWO, it leads the search to the optimal solution. Several classical UCI datasets were used to verify the proposed model. Compared with the grid search algorithm, the grey wolf optimization algorithm without the asymptotic property, and other methods in the literature, the classification accuracy of the proposed algorithm improves to varying degrees on different datasets. The experimental results show that the proposed algorithm can find the optimal SVM parameters and the smallest feature subset, achieving higher classification accuracy and shorter average processing time.

Keywords: Grey wolf optimization algorithm  Parameter optimization  Feature selection  Asymptotic property  Support vector machine

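The abstract describes jointly optimizing the SVM hyperparameters (the penalty parameter C and the kernel parameter gamma) and the feature subset with GWO, scored by a fitness that combines classification accuracy, the number of selected features, and the number of support vectors. Below is a minimal sketch of that idea, not the paper's exact method: it uses plain GWO (the asymptotic-guided grey wolf individual, the paper's contribution, is omitted), and the fitness weights, parameter bounds, and the scikit-learn wine dataset are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def fitness(position, X, y, w_acc=0.95, w_feat=0.025, w_sv=0.025):
    """Score a candidate encoded as [log10(C), log10(gamma), feature mask...]."""
    C, gamma = 10.0 ** position[0], 10.0 ** position[1]
    mask = position[2:] > 0.5          # mask dimensions above 0.5 select a feature
    if not mask.any():                 # at least one feature is required
        return 0.0
    clf = make_pipeline(StandardScaler(), SVC(C=C, gamma=gamma))
    acc = cross_val_score(clf, X[:, mask], y, cv=3).mean()
    clf.fit(X[:, mask], y)
    sv_ratio = clf[-1].n_support_.sum() / len(y)
    # reward accuracy; penalize large feature subsets and many support vectors
    return w_acc * acc + w_feat * (1 - mask.mean()) + w_sv * (1 - sv_ratio)

def gwo(obj, dim, lo, hi, n_wolves=6, n_iter=10, seed=0):
    """Plain grey wolf optimizer (maximization), without the asymptotic guidance."""
    rng = np.random.default_rng(seed)
    wolves = rng.uniform(lo, hi, size=(n_wolves, dim))
    scores = np.array([obj(w) for w in wolves])
    best_i = int(np.argmax(scores))
    best_pos, best_score = wolves[best_i].copy(), scores[best_i]
    for t in range(n_iter):
        a = 2.0 - 2.0 * t / n_iter                            # a decreases from 2 to 0
        leaders = wolves[np.argsort(scores)[::-1][:3]].copy()  # alpha, beta, delta
        for i in range(n_wolves):
            new = np.zeros(dim)
            for leader in leaders:
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C_ = 2 * a * r1 - a, 2 * r2
                new += leader - A * np.abs(C_ * leader - wolves[i])
            wolves[i] = np.clip(new / 3.0, lo, hi)             # average of the three pulls
            scores[i] = obj(wolves[i])
            if scores[i] > best_score:
                best_pos, best_score = wolves[i].copy(), scores[i]
    return best_pos, best_score

# illustrative run on a UCI-style dataset (wine)
X, y = load_wine(return_X_y=True)
n_feat = X.shape[1]
lo = np.array([-2.0, -4.0] + [0.0] * n_feat)  # bounds for log10(C), log10(gamma), mask
hi = np.array([4.0, 1.0] + [1.0] * n_feat)
best_pos, best_score = gwo(lambda p: fitness(p, X, y), 2 + n_feat, lo, hi)
```

The encoding puts the two SVM hyperparameters and the per-feature selection mask into a single real-valued position vector, so one GWO run searches both spaces at once, mirroring the simultaneous parameter-and-feature optimization described above.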
This article is indexed by databases including VIP (维普) and Wanfang Data (万方数据).