A general theory of a class of linear neural nets for principal and minor component analysis
Author: Kiyotoshi Matsuoka
Affiliation:(1) Department of Control Engineering, Kyushu Institute of Technology, Sensui 1-1, 804 Tobata, Kitakyushu, Japan
Abstract: This paper presents a unified theory of a class of learning neural nets for principal component analysis (PCA) and minor component analysis (MCA). First, some fundamental properties that all neural nets in the class share are addressed. Second, a subclass called the generalized asymmetric learning algorithm is investigated, clarifying the kind of asymmetric structure that is required in general to obtain the individual eigenvectors of the correlation matrix of a data sequence. Third, focusing on a single-neuron model, a systematic way of deriving both PCA and MCA learning algorithms is shown, through which a relation between the normalization in PCA algorithms and that in MCA algorithms is revealed. This work was presented, in part, at the Third International Symposium on Artificial Life and Robotics, Oita, Japan, January 19–21, 1998.
Keywords: Neural net; Principal component analysis; Minor component analysis
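The abstract's single-neuron setting can be illustrated with the classical Hebbian/anti-Hebbian learning rules. The sketch below is not the paper's generalized algorithm; it is a minimal, commonly known instance (an Oja-style Hebbian rule for the principal component and an anti-Hebbian rule with explicit renormalization for the minor component), using illustrative data and step sizes chosen for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic zero-mean data with a known correlation matrix (illustrative values).
C = np.array([[3.0, 1.0],
              [1.0, 2.0]])
L = np.linalg.cholesky(C)
X = (L @ rng.standard_normal((2, 5000))).T  # rows are samples

eta = 0.01  # learning rate (assumed, not from the paper)

# Hebbian (Oja-style) rule: converges toward the principal eigenvector of C.
w_pca = rng.standard_normal(2)
for x in X:
    y = w_pca @ x
    w_pca += eta * y * (x - y * w_pca)  # implicit normalization term -y^2 w

# Anti-Hebbian rule with explicit renormalization: converges toward the
# minor eigenvector (smallest eigenvalue direction) of C.
w_mca = rng.standard_normal(2)
for x in X:
    y = w_mca @ x
    w_mca -= eta * y * (x - y * w_mca)
    w_mca /= np.linalg.norm(w_mca)  # explicit unit-norm constraint

# Compare against the exact eigenvectors (eigh returns ascending eigenvalues).
eigvals, eigvecs = np.linalg.eigh(C)
print(abs(w_pca @ eigvecs[:, -1]) / np.linalg.norm(w_pca))  # alignment with principal
print(abs(w_mca @ eigvecs[:, 0]))                           # alignment with minor
```

Both alignment values should approach 1. Note the contrast the abstract alludes to: the Hebbian PCA rule is self-stabilizing through its built-in `-y^2 w` term, whereas the anti-Hebbian MCA rule needs an explicit normalization step to remain stable.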
This article is indexed in SpringerLink and other databases.