

A modified Hopfield auto-associative memory with improved capacity
Authors: Gimenez-Martinez, V.
Affiliation: Fac. de Inf., Univ. Politecnica de Madrid.
Abstract: This paper describes a new procedure for implementing a recurrent neural network (RNN), based on a new approach to the well-known Hopfield auto-associative memory. In our approach an RNN is viewed as a complete graph G, and the learning mechanism is again based on Hebb's law, but with one significant difference: the weights, which control the dynamics of the net, are obtained by coloring the graph G. Once training is complete, the synaptic matrix of the net is the weight matrix of the graph. Any such matrix satisfies certain spatial properties, and for this reason these matrices are referred to as tetrahedral matrices. The geometrical properties of tetrahedral matrices may be used to classify the n-dimensional state-vector space into n classes. In the recall stage, a parameter vector is introduced that is related to the capacity of the network: it may be shown that the larger the value of the ith component of the parameter vector, the lower the capacity of the ith class of the state-vector space. Once the capacity has been controlled, a new set of parameters is introduced that uses the statistical deviation of the prototypes to compare them with the states that appear as fixed points, thus eliminating a large number of parasitic fixed points.
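For context, the sketch below shows the baseline that the paper modifies: a standard Hopfield auto-associative memory with Hebbian (outer-product) learning and asynchronous recall to a fixed point. The abstract does not specify the graph-coloring weight construction, the tetrahedral matrices, or the capacity-control parameter vector, so none of those are reproduced here; this is only an assumed reference implementation of the classical scheme.

```python
import numpy as np

# Minimal sketch of a classical Hopfield auto-associative memory:
# Hebbian (outer-product) learning plus asynchronous recall. The paper's
# graph-coloring weights and parameter vector are NOT implemented here.

def train_hebbian(prototypes):
    """Build the synaptic matrix from bipolar (+1/-1) prototype vectors."""
    n = prototypes.shape[1]
    W = np.zeros((n, n))
    for p in prototypes:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0.0)          # no self-connections
    return W / prototypes.shape[0]

def recall(W, state, max_sweeps=50, rng=None):
    """Asynchronous updates until the state becomes a fixed point."""
    rng = np.random.default_rng() if rng is None else rng
    s = state.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(s.size):
            new = 1 if W[i] @ s >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:               # reached a fixed point of the dynamics
            break
    return s

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prototypes = rng.choice([-1, 1], size=(3, 64))
    W = train_hebbian(prototypes)
    noisy = prototypes[0].copy()
    noisy[rng.choice(64, size=6, replace=False)] *= -1   # flip a few bits
    print("recovered:", np.array_equal(recall(W, noisy, rng=rng), prototypes[0]))
```

In this classical setting, spurious (parasitic) fixed points limit the usable capacity; the paper's contribution, per the abstract, is to control per-class capacity and prune such fixed points using the statistical deviation of the prototypes.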