An efficient algorithm for Kernel two-dimensional principal component analysis |
| |
Authors: | Ning Sun Hai-xian Wang Zhen-hai Ji Cai-rong Zou Li Zhao |
| |
Affiliation: | (1) Research Center of Learning Science, Southeast University, Nanjing, 210096, China; (2) Department of Radio Engineering, Southeast University, Nanjing, 210096, China |
| |
Abstract: | Recently, a new approach called two-dimensional principal component analysis (2DPCA) has been proposed for face representation
and recognition. The essence of 2DPCA is that it computes the eigenvectors of the so-called image covariance matrix without
matrix-to-vector conversion. Kernel principal component analysis (KPCA) is a non-linear generalization of the popular principal
component analysis via the kernel trick. Similarly, kernelizing 2DPCA (K2DPCA) can be beneficial for uncovering the non-linear structures
in the input data. However, the standard K2DPCA suffers from a heavy computational burden because it operates on the image matrices directly.
In this paper, we propose an efficient algorithm to speed up the training procedure of K2DPCA. The results of experiments
on face recognition show that the proposed algorithm achieves much higher computational efficiency and remarkably reduces
memory consumption compared to the standard K2DPCA. |
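To make the abstract's description of 2DPCA concrete, the following is a minimal sketch of the linear 2DPCA step it refers to: forming the image covariance matrix directly from image matrices (no matrix-to-vector conversion) and projecting onto its leading eigenvectors. This is a generic illustration on hypothetical toy data, not the authors' efficient K2DPCA algorithm; the array shapes and variable names are assumptions for the example.

```python
import numpy as np

# Hypothetical toy data: 20 "images" of size 8 x 6 (rows x columns).
rng = np.random.default_rng(0)
images = rng.standard_normal((20, 8, 6))

# 2DPCA forms the image covariance matrix G from the centered image
# matrices themselves -- no matrix-to-vector conversion is needed.
mean_image = images.mean(axis=0)
centered = images - mean_image
G = sum(A.T @ A for A in centered) / len(images)   # shape (6, 6), symmetric

# Project each image onto the top-k eigenvectors of G.
eigvals, eigvecs = np.linalg.eigh(G)               # eigenvalues in ascending order
k = 2
W = eigvecs[:, -k:]                                # top-k projection axes, shape (6, k)
features = centered @ W                            # per-image features, shape (20, 8, k)
```

In K2DPCA, the rows (or columns) of each image are first mapped into a feature space via a kernel, which is where the computational problem the abstract mentions arises.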
| |
Keywords: | Eigenvalue decomposition; Feature extraction; KPCA; K2DPCA |
This article is indexed in SpringerLink and other databases.
|