Gradient descent learning rule for complex‐valued associative memories with large constant terms
Authors: Masaki Kobayashi
Abstract: Complex-valued associative memories (CAMs) are among the most promising neural-network associative memory models. However, their low noise tolerance is often a serious problem. A projection learning rule with large constant terms improves the noise tolerance of CAMs, but the projection learning rule can be applied only to fully connected CAMs. In this paper, we propose a gradient descent learning rule with large constant terms that is not restricted by network topology. We realize the large constant terms by regularizing the connection weights. Computer simulations demonstrate that the proposed learning algorithm improves noise tolerance. © 2016 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc.
Keywords: complex-valued neural networks; associative memory; noise tolerance; learning algorithm
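To make the abstract's idea concrete, below is a minimal sketch of a gradient descent learning rule for a K-state complex-valued associative memory in which the diagonal (self-connection) weights are pulled toward a large constant by a regularization term. All names and hyperparameters (`K`, `N`, `P`, `c`, `lam`, `eta`) are illustrative assumptions, not values from the paper, and the exact update rule here is a plausible reconstruction rather than the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

K = 8       # number of phase states per neuron (assumed)
N = 16      # number of neurons (assumed)
P = 3       # number of stored patterns (assumed)
c = 5.0     # large constant term targeted on the diagonal (assumed)
lam = 0.1   # regularization strength (assumed)
eta = 0.01  # learning rate (assumed)

# Random K-state complex patterns on the unit circle
phases = rng.integers(0, K, size=(P, N))
X = np.exp(2j * np.pi * phases / K)        # shape (P, N)

W = np.zeros((N, N), dtype=complex)        # complex connection weights

for epoch in range(500):
    for p in range(P):
        x = X[p]
        u = W @ x                          # local field (weighted sum)
        e = u - c * x                      # error: want u ≈ c·x, a stable state with margin
        # gradient of |e|^2 with respect to W is the outer product e · conj(x)
        W -= eta * np.outer(e, x.conj())
    # regularization pulling diagonal weights toward the large constant c
    d = np.diag(W).copy()
    W[np.diag_indices(N)] = d - eta * lam * (d - c)

def activate(u):
    """Complex-signum activation: quantize each phase to the nearest of K states."""
    idx = np.round(np.angle(u) * K / (2 * np.pi)) % K
    return np.exp(2j * np.pi * idx / K)

# Stored patterns should now be fixed points of one recall step
recalled = activate(W @ X[0])
```

Because the update only needs the local gradient of the per-pattern error, it applies to any connection topology (one can simply mask `W` to zero out absent connections), which is the flexibility the abstract contrasts with the fully connected projection rule.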