RNN algorithm optimization based on extended unsaturated region
Cite this article: ZHANG Yao, SHEN Hai-bin. RNN algorithm optimization based on extended unsaturated region[J]. Transducer and Microsystem Technology, 2018(3): 41-43.
Authors: ZHANG Yao, SHEN Hai-bin
Affiliation: Institute of VLSI Design, Zhejiang University, Hangzhou 310027, China
Funding: National High Technology Research and Development Program of China (863 Program)
Abstract: To address the slow convergence of long short-term memory (LSTM) recurrent neural networks (RNNs), an RNN algorithm optimization based on extending the unsaturated region of the activation function is proposed. Building on an analysis of why LSTM RNN training converges slowly and on the properties of the activation function, a method to accelerate convergence of the RNN training process is presented. The optimization is verified on a character-level language model; the results show that extending the unsaturated region effectively accelerates the convergence of RNN training.

Keywords: recurrent neural network; long short-term memory; activation function

RNN algorithm optimization based on extended unsaturated region
ZHANG Yao,SHEN Hai-bin.RNN algorithm optimization based on extended unsaturated region[J].Transducer and Microsystem Technology,2018(3):41-43.
Authors:ZHANG Yao  SHEN Hai-bin
Abstract: Recurrent neural networks (RNNs), and in particular the long short-term memory (LSTM) variant, are efficient models for processing sequential data. To address the slow convergence of LSTM RNN training, an RNN algorithm optimization based on extending the unsaturated region of the activation function is proposed. Drawing on an analysis of why the LSTM RNN training process converges slowly and on the properties of the activation function, a method to speed up convergence of RNN training is presented. The optimization is verified on character-level language models, and the experimental results show that it effectively accelerates the training process.
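One common way to extend an activation function's unsaturated region (a minimal illustrative sketch, not necessarily the exact transformation used in the paper) is to scale tanh as f(x) = a·tanh(b·x) with b < 1, so the derivative stays usefully large over a wider input range; the constants a = 1.7159, b = 2/3 below follow LeCun's classic recommendation and are an assumption here:

```python
import numpy as np

# Standard tanh saturates quickly: its derivative 1 - tanh(x)^2
# is close to zero for |x| > ~2, which slows gradient-based training.
def tanh_grad(x):
    return 1.0 - np.tanh(x) ** 2

# Scaled tanh f(x) = a * tanh(b * x) with b < 1 stretches the input
# range over which the gradient a*b*(1 - tanh(b*x)^2) remains large,
# i.e. it extends the unsaturated region of the activation.
def scaled_tanh_grad(x, a=1.7159, b=2.0 / 3.0):
    return a * b * (1.0 - np.tanh(b * x) ** 2)

x = 2.5  # an input already inside standard tanh's saturated zone
print(tanh_grad(x))         # small gradient: training stalls here
print(scaled_tanh_grad(x))  # noticeably larger gradient at the same input
```

At x = 2.5 the standard tanh gradient is under 0.03, while the scaled variant's is several times larger, which is the mechanism by which extending the unsaturated region speeds up convergence.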
Keywords: recurrent neural network; long short-term memory; activation function
This article is indexed by Wanfang Data and other databases.