Benchmarking Least Squares Support Vector Machine Classifiers
Authors: Tony van Gestel, Johan A.K. Suykens, Bart Baesens, Stijn Viaene, Jan Vanthienen, Guido Dedene, Bart de Moor, Joos Vandewalle
Affiliation: (1) Department of Electrical Engineering, ESAT/SISTA, Katholieke Universiteit Leuven, Belgium; (2) Leuven Institute for Research on Information Systems, Katholieke Universiteit Leuven, Belgium; (3) Department of Electrical Engineering, ESAT/SISTA, Katholieke Universiteit Leuven, Belgium
Abstract:In Support Vector Machines (SVMs), the solution of the classification problem is characterized by a (convex) quadratic programming (QP) problem. In a modified version of SVMs, called Least Squares SVM classifiers (LS-SVMs), a least squares cost function is proposed so as to obtain a linear set of equations in the dual space. While the SVM classifier has a large margin interpretation, the LS-SVM formulation is related in this paper to a ridge regression approach for classification with binary targets and to Fisher's linear discriminant analysis in the feature space. Multiclass categorization problems are represented by a set of binary classifiers using different output coding schemes. While regularization is used to control the effective number of parameters of the LS-SVM classifier, the sparseness property of SVMs is lost due to the choice of the 2-norm. Sparseness can be imposed in a second stage by gradually pruning the support value spectrum and optimizing the hyperparameters during the sparse approximation procedure. In this paper, twenty public domain benchmark datasets are used to evaluate the test set performance of LS-SVM classifiers with linear, polynomial and radial basis function (RBF) kernels. Both the SVM and LS-SVM classifier with RBF kernel in combination with standard cross-validation procedures for hyperparameter selection achieve comparable test set performances. These SVM and LS-SVM performances are consistently very good when compared to a variety of methods described in the literature including decision tree based algorithms, statistical algorithms and instance based learning methods. We show on ten UCI datasets that the LS-SVM sparse approximation procedure can be successfully applied.
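To make the linear-system formulation concrete, the following is a minimal NumPy sketch (not taken from the paper) of an LS-SVM classifier with an RBF kernel: training reduces to solving a single linear system in (b, alpha) instead of a QP, and a second-stage sparse approximation gradually prunes the smallest support values, as outlined in the abstract. The values of gamma (regularization) and sigma (RBF bandwidth), the pruning fraction, and the number of pruning steps are illustrative assumptions, not the tuned settings used in the benchmarks.

    # Minimal LS-SVM classifier sketch with an RBF kernel (NumPy only).
    import numpy as np

    def rbf_kernel(A, B, sigma=1.0):
        """K(x, z) = exp(-||x - z||^2 / sigma^2)."""
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / sigma**2)

    def lssvm_train(X, y, gamma=10.0, sigma=1.0):
        """Solve the LS-SVM dual, a linear system rather than a QP:

            [ 0        y^T           ] [  b   ]   [ 0 ]
            [ y   Omega + I / gamma  ] [alpha ] = [ 1 ]

        with Omega_ij = y_i y_j K(x_i, x_j) and binary targets y in {-1, +1}.
        """
        n = len(y)
        Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = y
        A[1:, 0] = y
        A[1:, 1:] = Omega + np.eye(n) / gamma
        rhs = np.concatenate(([0.0], np.ones(n)))
        sol = np.linalg.solve(A, rhs)
        return sol[1:], sol[0]          # alpha, b

    def lssvm_predict(X_train, y_train, alpha, b, X_test, sigma=1.0):
        """y(x) = sign( sum_i alpha_i y_i K(x, x_i) + b )."""
        K = rbf_kernel(X_test, X_train, sigma)
        return np.sign(K @ (alpha * y_train) + b)

    def lssvm_sparse_prune(X, y, gamma=10.0, sigma=1.0, drop_frac=0.05, steps=5):
        """Second-stage sparse approximation: repeatedly drop the training
        points with the smallest |alpha_i| (the low end of the support value
        spectrum) and re-solve the linear system on the remaining points.
        Re-optimizing the hyperparameters at each step is omitted here."""
        idx = np.arange(len(y))
        for _ in range(steps):
            alpha, b = lssvm_train(X[idx], y[idx], gamma, sigma)
            keep = np.argsort(np.abs(alpha))[int(drop_frac * len(idx)):]
            idx = idx[np.sort(keep)]
        alpha, b = lssvm_train(X[idx], y[idx], gamma, sigma)
        return idx, alpha, b

    # Toy usage: two Gaussian clouds with labels in {-1, +1}.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-2.0, 1.0, (25, 2)), rng.normal(2.0, 1.0, (25, 2))])
    y = np.concatenate([-np.ones(25), np.ones(25)])
    alpha, b = lssvm_train(X, y)
    pred = lssvm_predict(X, y, alpha, b, X)
    print("training accuracy:", (pred == y).mean())

In the paper, the hyperparameters (gamma, sigma) are selected with standard cross-validation and re-optimized during the sparse approximation procedure; the sketch above leaves that selection step out and uses fixed illustrative values.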
Keywords: least squares support vector machines, multiclass support vector machines, sparse approximation
Indexed by: SpringerLink and other databases.