A Winner-Take-All Neural Network of N Linear Threshold Neurons without Self-Excitatory Connections
Authors:Hong Qu  Zhang Yi  Xiaobin Wang
Affiliation:(1) School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu, 610054, People’s Republic of China
Abstract:Multistable neural networks have attracted much interest in recent years, since monostable networks are computationally restricted. This paper studies a recurrent network of N linear threshold neurons without self-excitatory connections. Our analysis shows that this network exhibits Winner-Take-All (WTA) behavior, which has been recognized as a basic computational operation performed in the brain. The contributions of this paper are: (1) it is proved mathematically that the proposed model is non-divergent; (2) an important implication (Winner-Take-All) of the proposed network model is studied; (3) digital computer simulations are carried out to validate the theoretical findings. This work was supported by the Specialized Research Fund for the Doctoral Program of Higher Education of China under Grant 200806141049.
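The WTA behavior summarized above can be illustrated with a small simulation. The following is a minimal sketch assuming a common continuous-time linear-threshold formulation with lateral inhibition and no self-excitatory term; the dynamics, the inhibition strength beta, the step size, and the input vector are illustrative assumptions, not the paper's exact model or parameters.

```python
import numpy as np

def simulate_wta(h, beta=1.2, dt=0.01, steps=20000):
    """Euler simulation of an assumed linear-threshold WTA network.

    Assumed dynamics (not necessarily the authors' exact formulation):
        dx_i/dt = -x_i + max(0, h_i - beta * sum_{j != i} sigma(x_j))
    where sigma(s) = max(s, 0) is the linear threshold activation and
    there is no self-excitatory connection (w_ii = 0).
    """
    h = np.asarray(h, dtype=float)
    x = np.zeros_like(h)
    for _ in range(steps):
        s = np.maximum(x, 0.0)             # linear-threshold activations
        inhibition = beta * (s.sum() - s)  # lateral inhibition, excludes self
        x += dt * (-x + np.maximum(h - inhibition, 0.0))
    return x

if __name__ == "__main__":
    inputs = [0.8, 1.0, 0.6, 0.9]          # the neuron with the largest input should win
    print(simulate_wta(inputs))
```

Under these assumed parameters, the neuron receiving the largest external input converges to a positive activity level while the activities of the other neurons decay to zero, which is the Winner-Take-All outcome the abstract describes.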
Keywords:Winner-Take-All  Recurrent neural networks  Multistable  Linear threshold neurons
This article is indexed by SpringerLink and other databases.