A BP Algorithm with Magnified Error Signals
Cite this article: CHENG Yue, LIU Qiong-sun. A BP algorithm with magnified error signals [J]. Application Research of Computers, 2011, 28(2): 528-530.
Authors: CHENG Yue, LIU Qiong-sun
Affiliation: College of Mathematics and Statistics, Chongqing University, Chongqing 400030, China
Abstract: To address the slow convergence of the BP neural network learning algorithm and its tendency to fall into a false-saturation state, this paper proposes a fast-converging BP algorithm. The algorithm improves convergence by modifying the derivative of the activation function so as to magnify the error signals. A convergence analysis of the improved algorithm is given, and in simulation experiments the improved algorithm is compared with both the standard BP algorithm and the improved algorithm of Ng et al. The simulation results show that the proposed algorithm converges much faster than the other two and effectively improves the global convergence ability of the BP algorithm.

Keywords: back-propagation algorithm; error signal; convergence

Back-propagation algorithm with magnified error signals
CHENG Yue,LIU Qiong-sun.Back-propagation algorithm with magnified error signals[J].Application Research of Computers,2011,28(2):528-530.
Authors:CHENG Yue  LIU Qiong-sun
Abstract: To address the slow convergence of BP neural networks and their tendency to fall into a false-saturation state, this paper presents a back-propagation (BP) algorithm with fast convergence. The new algorithm magnifies the error signals by modifying the derivative of the activation function. The paper gives a convergence analysis of the algorithm and compares the magnified algorithm with the standard BP algorithm and the improved algorithm of Ng et al. in simulation experiments. The results show that the magnified algorithm outperforms the other two: it speeds up the convergence rate and enhances the global convergence capability.
Keywords: back-propagation algorithm; error signal; convergence
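The abstract does not state the paper's exact modified derivative. As a rough illustration of the general idea of magnifying the backpropagated error signal, the sketch below adds a constant (here `magnify`, a hypothetical parameter) to the sigmoid derivative s(1-s), so the error signal does not vanish on the flat regions of the activation; the network size, learning rate, and XOR task are assumptions for demonstration only, not the paper's experimental setup.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_xor(magnify=0.0, epochs=2000, lr=0.5, seed=0):
    """Train a 2-2-1 network on XOR with plain gradient descent.

    `magnify` is added to the sigmoid derivative s*(1-s) in the
    backward pass, one simple way to 'magnify' the error signal so
    it does not vanish where the activation saturates.
    """
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
    y = np.array([[0], [1], [1], [0]], float)
    W1 = rng.normal(0, 1, (2, 2)); b1 = np.zeros(2)
    W2 = rng.normal(0, 1, (2, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)                    # hidden layer
        o = sigmoid(h @ W2 + b2)                    # output layer
        err = o - y                                 # dE/do for squared error
        # Modified derivative: s*(1-s) + magnify instead of plain s*(1-s)
        do = err * (o * (1 - o) + magnify)
        dh = (do @ W2.T) * (h * (1 - h) + magnify)
        W2 -= lr * h.T @ do; b2 -= lr * do.sum(0)
        W1 -= lr * X.T @ dh; b1 -= lr * dh.sum(0)
    # recompute outputs with the final weights and report the MSE
    h = sigmoid(X @ W1 + b1)
    o = sigmoid(h @ W2 + b2)
    return float(np.mean((o - y) ** 2))
```

Setting `magnify=0.0` recovers ordinary BP; a small positive value keeps the error signal alive in saturated regions, which is the mechanism the paper's convergence analysis targets.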
This article has been indexed by Wanfang Data and other databases.