Structure minimization using the impact factor in neural networks
Authors: Kap-Ho Seo, Jae-Su Song, Ju-Jang Lee
Affiliation: (1) Department of Electrical Engineering and Computer Science, Korea Advanced Institute of Science and Technology, 373-1 Guseong-Dong Yuseong-Gu, 305-701 Daejon, Korea
Abstract: Despite many advances, determining the proper size of a neural network remains an important problem, especially for its practical implications for learning and generalization. Unfortunately, the best size is not usually obvious: a network that is too small will not be able to learn the data, while one that is just big enough may learn very slowly and be very sensitive to initial conditions and learning parameters. There are two types of approaches to determining network size: pruning and growing. Pruning consists of training a network that is larger than necessary and then removing unnecessary weights/nodes. Here, a new pruning method is developed, based on the penalty-term method. The method improves the generalization ability of the network and reduces the retraining time needed after weights/nodes are pruned.

This work was presented, in part, at the 6th International Symposium on Artificial Life and Robotics, Tokyo, Japan, January 15–17, 2001.
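To make the pruning approach described above concrete, the following is a minimal sketch in Python/NumPy. It trains a deliberately oversized one-hidden-layer network with an L2 penalty term added to the loss, then removes hidden nodes whose outgoing-weight magnitude falls below a threshold. The magnitude score is only a stand-in for the paper's impact factor, whose exact definition is not given in this abstract; the toy data, network size, penalty coefficient lam, and pruning threshold are likewise illustrative assumptions.

# Sketch of penalty-term pruning; the "impact" score below is a
# magnitude-based stand-in, NOT the paper's actual impact factor.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (hypothetical): y = sin(x) on [-pi, pi].
X = rng.uniform(-np.pi, np.pi, size=(200, 1))
y = np.sin(X)

# Deliberately oversized single-hidden-layer network.
H = 32
W1 = rng.normal(0, 0.5, size=(1, H))
b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, size=(H, 1))
b2 = np.zeros(1)

lam = 1e-3  # penalty coefficient (assumed value)
lr = 0.05   # learning rate (assumed value)

for step in range(2000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    out = h @ W2 + b2
    err = out - y

    # Backward pass for MSE + L2 penalty; the penalty term drives
    # unneeded weights toward zero so they can be pruned afterwards.
    g_out = 2 * err / len(X)
    gW2 = h.T @ g_out + 2 * lam * W2
    gb2 = g_out.sum(axis=0)
    g_h = g_out @ W2.T * (1 - h**2)
    gW1 = X.T @ g_h + 2 * lam * W1
    gb1 = g_h.sum(axis=0)

    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Illustrative per-node "impact": magnitude of each hidden node's
# outgoing weights. Prune nodes whose impact is relatively small.
impact = np.abs(W2).sum(axis=1)
keep = impact > 0.1 * impact.max()  # assumed threshold

W1, b1, W2 = W1[:, keep], b1[keep], W2[keep, :]
print(f"kept {keep.sum()} of {H} hidden nodes")

Because the penalty term pushes redundant weights toward zero during training, the surviving subnetwork tends to need little retraining after pruning, which is consistent with the reduced-retraining-time claim in the abstract.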
Keywords: Neural networks, Structure optimization, Pruning, Impact factor
This article is indexed in SpringerLink and other databases.