Model selection for least squares support vector regressions based on small-world strategy
Authors: Wentao Mao, Guirong Yan, Longlei Dong, Dike Hu
Affiliation: 1. Department of Science and Technology Teaching, China University of Political Science and Law, Beijing 100088, China; 2. College of Information Science and Engineering, Guangxi University for Nationalities, Nanning 530006, China; 3. Key Laboratory of Guangxi High Schools Complex System and Computational Intelligence, Nanning 530006, China
Abstract: Model selection plays a key role in the application of support vector machines (SVMs). In this paper, a model selection method based on the small-world strategy is proposed for least squares support vector regression (LS-SVR). Model selection is treated as a single-objective global optimization problem in which a generalization performance measure serves as the fitness function. To improve optimization performance, the small-world idea of relying more heavily on dense local connections is exploited, and a new small-world optimization algorithm, tabu-based small-world optimization (TSWO), is proposed by employing tabu search as the local search operator. The hyper-parameters with the best generalization performance can then be chosen as the global optimum thanks to the strong search ability of TSWO. Experiments on six complex multimodal functions demonstrate that TSWO avoids premature convergence of the population better than the genetic algorithm (GA) and particle swarm optimization (PSO). Moreover, the effectiveness of the leave-one-out bound of LS-SVM on regression problems is tested on a noisy sinc function and benchmark data sets; the numerical results show that model selection with TSWO almost always yields smaller generalization errors than GA and PSO under the three generalization performance measures adopted.
Keywords:
This article is indexed in ScienceDirect and other databases.
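
The abstract describes the model-selection loop only at a high level, so the following is a minimal Python sketch of the idea, not the authors' TSWO implementation: an RBF-kernel LS-SVR solved as a linear system, a plain leave-one-out mean squared error standing in for the paper's leave-one-out bound as the fitness function, and a simple tabu-filtered search that mixes dense local moves with occasional long-range jumps, echoing the small-world strategy. All function names (lssvr_fit, loo_error, tabu_search) and the search parameters are illustrative assumptions.

    # Minimal sketch (assumptions, not the paper's exact algorithm):
    # LS-SVR hyper-parameter selection with a LOO-MSE fitness and a
    # tabu-filtered search over (gamma, sigma) in log space.
    import numpy as np

    def rbf_kernel(X, Z, sigma):
        d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))

    def lssvr_fit(X, y, gamma, sigma):
        # Solve the LS-SVR linear system [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y].
        n = len(y)
        K = rbf_kernel(X, X, sigma)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = K + np.eye(n) / gamma
        rhs = np.concatenate(([0.0], y))
        sol = np.linalg.solve(A, rhs)
        return sol[0], sol[1:]  # bias b, dual coefficients alpha

    def lssvr_predict(X_train, alpha, b, sigma, X_test):
        return rbf_kernel(X_test, X_train, sigma) @ alpha + b

    def loo_error(X, y, gamma, sigma):
        # Plain leave-one-out MSE used as the generalization measure (fitness).
        errs = []
        for i in range(len(y)):
            mask = np.arange(len(y)) != i
            b, alpha = lssvr_fit(X[mask], y[mask], gamma, sigma)
            pred = lssvr_predict(X[mask], alpha, b, sigma, X[i:i + 1])
            errs.append((pred[0] - y[i]) ** 2)
        return float(np.mean(errs))

    def tabu_search(X, y, n_iter=50, seed=0):
        # Crude stand-in for TSWO: dense local moves plus occasional long-range
        # jumps in log10(gamma), log10(sigma) space, filtered by a bounded tabu list.
        rng = np.random.default_rng(seed)
        best = rng.uniform(-2, 2, size=2)
        best_err = loo_error(X, y, 10 ** best[0], 10 ** best[1])
        tabu = [tuple(np.round(best, 1))]
        for _ in range(n_iter):
            cand = best + rng.normal(scale=0.3, size=2)   # dense local move
            if rng.random() < 0.2:                        # occasional long-range jump
                cand = rng.uniform(-2, 2, size=2)
            key = tuple(np.round(cand, 1))
            if key in tabu:
                continue
            tabu = (tabu + [key])[-20:]                   # bounded tabu list
            err = loo_error(X, y, 10 ** cand[0], 10 ** cand[1])
            if err < best_err:
                best, best_err = cand, err
        return 10 ** best[0], 10 ** best[1], best_err

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        X = rng.uniform(-3, 3, size=(60, 1))
        y = np.sinc(X[:, 0]) + rng.normal(scale=0.1, size=60)  # noisy sinc, as in the paper
        gamma, sigma, err = tabu_search(X, y)
        print(f"gamma={gamma:.3g}, sigma={sigma:.3g}, LOO-MSE={err:.4f}")

The separation into a cheap local neighbourhood search and rare global jumps mirrors the small-world intuition of favouring dense local connections; the paper's TSWO instead builds the local operator from tabu search proper and uses the leave-one-out bound rather than the brute-force LOO error computed here.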