A simplified multi-class support vector machine with reduced dual optimization
Authors: Xisheng He, Cheng Jin, Yingbin Zheng, Xiangyang Xue
Affiliations: a) Shanghai Key Laboratory of Intelligent Information Processing, School of Computer Science, Fudan University, Shanghai 200433, China; b) Department of Computer Science and Engineering, East China University of Science and Technology, Shanghai 200237, China
Abstract: Support vector machine (SVM) was initially designed for binary classification. To extend SVM to the multi-class scenario, a number of classification models have been proposed, such as the one by Crammer and Singer (2001). However, the number of variables in Crammer and Singer's dual problem is the product of the number of samples (l) and the number of classes (k), which leads to a large computational cost. This paper presents a simplified multi-class SVM (SimMSVM) that reduces the size of the resulting dual problem from l × k to l by introducing a relaxed classification error bound. The experimental results demonstrate that the proposed SimMSVM approach can greatly speed up the training process while maintaining competitive classification accuracy.
Keywords: Multi-class classification; Support vector machine; Kernel-based methods; Pattern classification
Indexed in ScienceDirect and other databases.
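The abstract's central claim is about problem size: Crammer and Singer's multi-class formulation carries one dual variable per (sample, class) pair, i.e. l × k in total, while SimMSVM needs only one per sample. The sketch below is not the authors' code; it uses placeholder random weights purely to illustrate the shared argmax decision rule of multi-class linear SVMs and to make the dual-size comparison concrete.

```python
import numpy as np

# Illustrative only: W holds random placeholder weights, not a trained model.
rng = np.random.default_rng(0)
l, k, d = 100, 5, 10                 # samples, classes, feature dimension
W = rng.standard_normal((k, d))      # one weight vector per class
x = rng.standard_normal(d)           # a single test point

# Multi-class SVM decision rule: predict the class whose score w_m . x is largest.
pred = int(np.argmax(W @ x))

# Dual problem sizes discussed in the abstract:
crammer_singer_dual = l * k          # one variable per (sample, class) pair -> 500
simmsvm_dual = l                     # one variable per sample -> 100
print("Crammer-Singer dual variables:", crammer_singer_dual)
print("SimMSVM dual variables:", simmsvm_dual)
```

For fixed k this is only a constant-factor reduction in variable count, but since the cost of solving the dual grows superlinearly in the number of variables, shrinking it by a factor of k can translate into the large training speed-ups the abstract reports.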
|