Efficient gradient descent method of RBF neural networks with adaptive learning rate
Authors:Jiayu Lin  Ying Liu
Affiliation:School of Electro. Sci. and Tech., National Univ. of Defence Technology, Changsha 410073
Abstract:A new algorithm for adapting the learning rate of the gradient descent method is presented. It is based on a second-order Taylor expansion of the error energy function with respect to the learning rate, evaluated at values determined by an "award-punish" strategy. A detailed derivation of the algorithm as applied to RBF networks is given. Simulation studies show that the algorithm increases the rate of convergence and improves the performance of the gradient descent method.
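The idea in the abstract can be sketched as follows. Along the descent direction, the error as a function of the learning rate eta can be approximated by a second-order Taylor expansion, E(eta) ≈ E(0) + E'(0)·eta + ½E''(0)·eta², which is minimized at eta* = -E'(0)/E''(0). The sketch below is an illustrative reconstruction, not the authors' exact derivation: it trains only the output weights of a Gaussian RBF network and estimates the Taylor coefficients by finite differences rather than analytically, and it omits the paper's "award-punish" strategy. All function and variable names are our own.

```python
import numpy as np

def rbf_forward(x, centers, widths, weights):
    """Gaussian RBF hidden layer followed by a linear output layer."""
    phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * widths ** 2))
    return phi @ weights, phi

def train_rbf_adaptive(x, y, centers, widths, epochs=300, h=1e-4):
    """Gradient descent on the output weights, with the learning rate
    chosen each step from a second-order Taylor expansion of the error
    with respect to eta (coefficients estimated by finite differences)."""
    rng = np.random.default_rng(0)
    weights = rng.normal(scale=0.1, size=centers.shape[0])

    def error(w):
        pred, _ = rbf_forward(x, centers, widths, w)
        return 0.5 * np.mean((pred - y) ** 2)

    for _ in range(epochs):
        pred, phi = rbf_forward(x, centers, widths, weights)
        grad = phi.T @ (pred - y) / len(x)   # dE/dw for the MSE loss
        d = -grad                            # descent direction
        # E(eta) ~ E(0) + E'(0) eta + 0.5 E''(0) eta^2, minimized at
        # eta* = -E'(0) / E''(0); central finite differences along d:
        e0 = error(weights)
        ep = error(weights + h * d)
        em = error(weights - h * d)
        d1 = (ep - em) / (2.0 * h)           # E'(0) along d (negative)
        d2 = (ep - 2.0 * e0 + em) / h ** 2   # E''(0) along d
        eta = -d1 / d2 if d2 > 1e-12 else 0.01  # small fixed fallback step
        weights = weights + eta * d
    return weights, error(weights)
```

Because the error is quadratic in the output weights, the second-order expansion is exact there, so eta* performs an exact line search each step; for the nonlinear center and width parameters the expansion would only be a local approximation.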
Keywords:Gradient descent method  Learning rate  RBF neural networks
This article is indexed in CNKI, VIP (维普), Wanfang Data, SpringerLink, and other databases.
The original abstract and full text are available from 《电子科学学刊(英文版)》 (Journal of Electronics (China)).
