Learning linear PCA with convex semi-definite programming
Authors: Qing Tao, Gao-wei Wu
Affiliations:
a. New Star Research Institute of Applied Tech, Hefei 230031, PR China
b. Division of Intelligent Software Systems, Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100080, PR China
c. Laboratory of Complex Systems and Intelligence Science, Institute of Automation, Chinese Academy of Sciences, Beijing 100080, PR China
Abstract: The aim of this paper is to learn a linear principal component by exploiting the nature of support vector machines (SVMs). To this end, a complete SVM-like framework of linear PCA (SVPCA) for determining the projection direction is constructed, in which a new expected risk and a new margin are introduced. Within this framework, a semi-definite programming problem for maximizing the margin is formulated and a new definition of support vectors is established. As a weighted case of regular PCA, SVPCA coincides with regular PCA when all samples play the same part in data compression. Theoretical analysis indicates that SVPCA rests on a margin-based generalization bound, so good prediction ability is ensured. Furthermore, a robust form of SVPCA with an interpretable parameter is obtained using the soft-margin idea from SVMs. A great advantage is that SVPCA is a learning algorithm without local minima, owing to the convexity of the semi-definite optimization problems. To validate the performance of SVPCA, several experiments are conducted; the numerical results demonstrate that its generalization ability is better than that of regular PCA. Finally, some open problems are also discussed.
Keywords: Principal component analysis; Statistical learning theory; Support vector machines; Margin; Maximal margin algorithm; Semi-definite programming; Robustness
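Illustration: the abstract names two ingredients, (1) SVPCA as a weighted case of regular PCA that reduces to regular PCA under uniform weights, and (2) a convex semi-definite program for the projection direction, so no local minima. The sketch below is a minimal illustration of both under stated assumptions, not the paper's SVPCA: the abstract does not reproduce the margin-based SDP or the sample weights, so the SDP shown is the standard relaxation max tr(CW) s.t. tr(W) = 1, W >= 0, whose optimum is the top eigenvector of the covariance C. Function names and the use of numpy/cvxpy are illustrative choices.

# A minimal sketch, assuming numpy and cvxpy; not the paper's exact SVPCA.
import numpy as np
import cvxpy as cp

def weighted_pca_direction(X, w):
    """Leading principal direction of samples X (n x d) under sample weights w.

    With uniform weights this coincides with regular PCA, as the abstract
    notes for SVPCA; the paper's SDP determines non-uniform weights
    (not reproduced here).
    """
    w = np.asarray(w, dtype=float)
    w = w / w.sum()
    mu = w @ X                               # weighted mean
    Xc = X - mu
    C = Xc.T @ (w[:, None] * Xc)             # weighted covariance (d x d)
    eigvals, eigvecs = np.linalg.eigh(C)     # eigenvalues in ascending order
    return eigvecs[:, -1]                    # eigenvector of the largest eigenvalue

def pca_direction_via_sdp(X):
    """Leading principal direction via a convex SDP (hence no local minima).

    Relax the rank-one matrix v v^T to a PSD matrix W with unit trace:
        maximize tr(C W)  s.t.  tr(W) = 1,  W >= 0.
    The optimum is attained at W = v v^T for the top eigenvector v of C.
    """
    Xc = X - X.mean(axis=0)
    C = Xc.T @ Xc / len(X)
    d = C.shape[0]
    W = cp.Variable((d, d), symmetric=True)
    prob = cp.Problem(cp.Maximize(cp.trace(C @ W)),
                      [cp.trace(W) == 1, W >> 0])
    prob.solve()
    vals, vecs = np.linalg.eigh(W.value)     # recover the direction from W
    return vecs[:, -1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 5)) @ np.diag([3.0, 2.0, 1.0, 0.5, 0.1])
    v_pca = weighted_pca_direction(X, np.ones(len(X)))  # uniform weights = regular PCA
    v_sdp = pca_direction_via_sdp(X)
    print("agreement |<v_pca, v_sdp>| =", abs(v_pca @ v_sdp))  # close to 1.0

The equal-weight run and the SDP run agree up to sign, which mirrors the abstract's claim that the uniform-weight case of the framework coincides with regular PCA while the convex formulation guarantees a global optimum.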