Fast Learning Algorithms for Feedforward Neural Networks

Authors: Minghu Jiang, Georges Gielen, Bo Zhang, Zhensheng Luo
Affiliations: (1) Department of Electrical Engineering, MICAS, Catholic University of Leuven, Kasteelpark Arenberg 10, B-3001 Heverlee, Belgium; (2) Lab of Computational Linguistics, Department of Chinese Language, Tsinghua University, Beijing 100084, People's Republic of China; (3) State Key Lab of Intelligent Technology & Systems, Department of Computer Science and Technology, Tsinghua University, Beijing 100084, People's Republic of China
Abstract: To improve the training speed of multilayer feedforward neural networks (MLFNN), we propose and explore two new fast backpropagation (BP) algorithms, obtained (1) by changing the error function, using the exponential attenuation (or bell impulse) function and the Fourier kernel function as alternatives, and (2) by introducing a hybrid conjugate-gradient algorithm with global optimization and a dynamic learning rate, which overcomes the conventional BP learning problems of getting stuck in local minima and converging slowly. Our experimental results demonstrate the effectiveness of the modified error functions: training is faster than with existing fast methods. In addition, on real speech data our hybrid algorithm achieves a higher recognition rate than the Polak-Ribière conjugate-gradient and conventional BP algorithms, and requires less training time, is less complicated, and is more robust than the Fletcher-Reeves conjugate-gradient and conventional BP algorithms.
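As a rough illustration of approach (1), the sketch below contrasts the conventional squared-error output delta of BP with the delta obtained from an assumed exponential-attenuation error function. The abstract does not give the exact functional forms used in the paper, so the form E = 1 - exp(-beta * (y - t)^2), the parameter beta, and all function names here are illustrative assumptions only.

```python
import numpy as np

# Conventional BP: dE/dy for the squared error E = 0.5 * (y - t)^2.
def squared_error_delta(y, t):
    return y - t

# Assumed exponential-attenuation error E = 1 - exp(-beta * (y - t)^2);
# beta and this functional form are illustrative guesses, not the paper's
# exact definition. Its derivative reshapes the output error term that
# BP propagates backwards through the network.
def exp_attenuation_delta(y, t, beta=1.0):
    e = y - t
    return 2.0 * beta * e * np.exp(-beta * e * e)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One BP-style gradient step on an output weight vector w for input x:
# the modified delta is combined with the sigmoid derivative exactly as
# the squared-error delta would be in standard BP.
def output_weight_step(w, x, t, lr=0.1, beta=1.0):
    y = sigmoid(np.dot(w, x))
    delta = exp_attenuation_delta(y, t, beta) * y * (1.0 - y)
    return w - lr * delta * x
```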
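Approach (2) combines conjugate-gradient search directions with a dynamic step size. The abstract does not specify the paper's exact hybridization or learning-rate rule, so the sketch below uses a standard globally convergent hybrid, clipping the coefficient beta between the Polak-Ribière and Fletcher-Reeves values, with a simple backtracking line search standing in for the dynamic learning rate; the function name hybrid_cg_minimize and all constants are assumptions.

```python
import numpy as np

def hybrid_cg_minimize(f, grad, w, iters=200, tol=1e-8):
    # Hedged sketch: hybrid conjugate gradient with beta clipped between
    # the Polak-Ribiere and Fletcher-Reeves values, plus a backtracking
    # line search as a stand-in for the paper's dynamic learning rate.
    g = grad(w)
    d = -g
    for _ in range(iters):
        # Backtracking (Armijo) line search: shrink the step until the
        # objective decreases sufficiently along direction d.
        step, fw, slope = 1.0, f(w), g.dot(d)
        while f(w + step * d) > fw + 1e-4 * step * slope:
            step *= 0.5
            if step < 1e-12:
                break
        w = w + step * d
        g_new = grad(w)
        if np.linalg.norm(g_new) < tol:
            break
        beta_pr = g_new.dot(g_new - g) / g.dot(g)  # Polak-Ribiere coefficient
        beta_fr = g_new.dot(g_new) / g.dot(g)      # Fletcher-Reeves coefficient
        beta = max(0.0, min(beta_pr, beta_fr))     # hybrid: restarts when beta_pr < 0
        d = -g_new + beta * d
        g = g_new
    return w

# Example: minimizing f(w) = w.w drives w to the zero vector.
w_opt = hybrid_cg_minimize(lambda w: w.dot(w), lambda w: 2.0 * w, np.ones(3))
```

Clipping beta at zero restarts the search along the steepest-descent direction whenever the Polak-Ribière coefficient turns negative, which is one common way such hybrids retain global convergence.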
Keywords: fast algorithm; error function; conjugate gradient; global convergence; feedforward neural networks
|