A Lamarckian Hybrid of Differential Evolution and Conjugate Gradients for Neural Network Training |
| |
Authors: | Krzysztof Bandurski, Wojciech Kwedlo |
| |
Abstract: | The paper describes two schemes that follow the model of Lamarckian evolution and combine differential evolution (DE), a population-based stochastic global search method, with the local optimization algorithm of conjugate gradients (CG). In the first scheme, each offspring is fine-tuned by CG before competing with its parent. In the second, CG is used to improve both parents and offspring, in a manner that is completely seamless for individuals that survive more than one generation. Experiments involved training the weights of feed-forward neural networks on three synthetic and four real-life problems. In six out of seven cases the DE–CG hybrid, which preserves and reuses information on each solution’s local optimization process, outperformed two recent variants of DE. |
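The first scheme described in the abstract can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: it uses standard DE/rand/1/bin on a simple quadratic test objective (a stand-in for network training error), and replaces the conjugate-gradient fine-tuning with a few numerical gradient-descent steps. The key Lamarckian element is that the locally refined genotype itself, not just its fitness, is written back into the population.

```python
import random

def sphere(x):
    # Simple test objective; a stand-in for a network's training error.
    return sum(v * v for v in x)

def num_grad(f, x, h=1e-6):
    # Central-difference numerical gradient (illustrative only).
    g = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h
        xm = list(x); xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

def refine(f, x, steps=5, lr=0.1):
    # Stand-in for the CG fine-tuning step (the paper uses conjugate
    # gradients; here, a few plain gradient-descent updates).
    for _ in range(steps):
        g = num_grad(f, x)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

def lamarckian_de(f, dim=3, pop_size=10, gens=30, F=0.5, CR=0.9, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # DE/rand/1/bin: mutate three distinct random individuals,
            # then binomial crossover with the current parent.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)
            trial = [pop[a][j] + F * (pop[b][j] - pop[c][j])
                     if (rng.random() < CR or j == jrand) else pop[i][j]
                     for j in range(dim)]
            # Lamarckian step: fine-tune the offspring before selection,
            # and inherit the refined genotype if it wins.
            trial = refine(f, trial)
            ft = f(trial)
            if ft <= fit[i]:  # offspring competes with its parent
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]
```

On a smooth objective like this, the local refinement lets the population converge far faster than plain DE, which is the effect the abstract's experiments measure for neural network training.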
| |
Keywords: | |