An Effective Online Sparse Learning Algorithm for Echo State Networks
Citation: HAN Min, WANG Xin-Ying. An Effective Online Sparse Learning Algorithm for Echo State Networks [J]. Acta Automatica Sinica, 2011, 37(12): 1536-1540.
Authors: HAN Min, WANG Xin-Ying
Affiliation: 1. Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian 116023
Funding: Supported by the National Natural Science Foundation of China (61074096)
Abstract: To overcome the lack of an effective online learning method for echo state networks (ESNs), while also addressing the ill-posed problem inherent in the reservoir, this paper proposes an online sparse learning algorithm for reservoirs. An L1 regularization constraint is imposed on the reservoir's objective function, and a truncated gradient algorithm is used to solve it approximately online. While adjusting the reservoir output weights online, the proposed algorithm effectively controls their sparsity, thereby ensuring the generalization performance of the network. Theoretical analysis and simulation examples demonstrate the effectiveness of the proposed algorithm.

Keywords: Recurrent neural networks; echo state networks (ESNs); sparse; online; optimization
Received: 2011-02-24
Revised: 2011-07-07
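The abstract describes training ESN output weights online with L1-regularized least squares, solved approximately by a truncated gradient method. The sketch below is a minimal illustration of that idea, not the authors' exact implementation: it combines a standard random tanh reservoir with the T1 truncation operator of Langford et al.'s truncated gradient (shrink only coordinates within a threshold theta toward zero, every K steps). All network sizes, parameter values, and the toy sine-prediction task are illustrative assumptions.

```python
import numpy as np

def truncate(w, alpha, theta):
    # T1 operator of the truncated gradient method: coordinates with
    # |w_i| <= theta are shrunk toward zero by alpha (clipped at zero);
    # larger coordinates are left untouched.
    out = w.copy()
    small_pos = (w >= 0) & (w <= theta)
    small_neg = (w < 0) & (w >= -theta)
    out[small_pos] = np.maximum(0.0, w[small_pos] - alpha)
    out[small_neg] = np.minimum(0.0, w[small_neg] + alpha)
    return out

rng = np.random.default_rng(0)
n_res, n_in = 50, 1
# Random reservoir, rescaled to spectral radius 0.9 (a common echo
# state property heuristic); random input weights.
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))

# Toy task (assumption): one-step-ahead prediction of a sine wave.
u = np.sin(0.2 * np.arange(300)).reshape(-1, 1)
x = np.zeros(n_res)
w_out = np.zeros(n_res)
eta, g, theta, K = 0.01, 0.05, 0.1, 10  # step size, gravity, threshold, period

for t in range(len(u) - 1):
    x = np.tanh(W @ x + W_in @ u[t])   # reservoir state update
    y = w_out @ x                      # linear readout
    err = y - u[t + 1, 0]
    w_out -= eta * err * x             # online gradient step on squared error
    if (t + 1) % K == 0:               # truncate every K steps
        w_out = truncate(w_out, K * eta * g, theta)

print("nonzero output weights:", np.count_nonzero(w_out), "of", n_res)
```

The gravity parameter g and period K trade off sparsity against accuracy: larger K * eta * g drives more output weights exactly to zero, which is the sparsity control the abstract refers to.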

This article has been indexed by CNKI and other databases.