Learning eigenfunctions links spectral embedding and kernel PCA
Authors: Yoshua Bengio, Olivier Delalleau, Nicolas Le Roux, Jean-François Paiement, Pascal Vincent, Marie Ouimet
Affiliation:Département d'Informatique et Recherche Opérationnelle, Centre de Recherches Mathématiques, Université de Montréal, Montréal, Québec, H3C 3J7, Canada. bengioy@iro.umontreal.ca
Abstract: In this letter, we show a direct relation between spectral embedding methods and kernel principal components analysis and how both are special cases of a more general learning problem: learning the principal eigenfunctions of an operator defined from a kernel and the unknown data-generating density. Whereas spectral embedding methods provided only coordinates for the training points, the analysis justifies a simple extension to out-of-sample examples (the Nyström formula) for multidimensional scaling (MDS), spectral clustering, Laplacian eigenmaps, locally linear embedding (LLE), and Isomap. The analysis provides, for all such spectral embedding methods, the definition of a loss function, whose empirical average is minimized by the traditional algorithms. The asymptotic expected value of that loss defines a generalization performance and clarifies what these algorithms are trying to learn. Experiments with LLE, Isomap, spectral clustering, and MDS show that this out-of-sample embedding formula generalizes well, with a level of error comparable to the effect of small perturbations of the training set on the embedding.
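The Nyström out-of-sample extension mentioned in the abstract can be illustrated with kernel PCA: after eigendecomposing the training Gram matrix, a new point's embedding coordinates are obtained by projecting its kernel evaluations against the training set onto the top eigenvectors. The sketch below is a minimal illustration, not the paper's full method: it assumes a Gaussian (RBF) kernel and omits the data-dependent kernel centering and the per-algorithm kernel constructions (for LLE, Isomap, etc.) that the paper analyzes; the names `rbf_kernel` and `nystrom_embed` are hypothetical.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian kernel on all pairs of rows of A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))          # training set (centering of K omitted for brevity)
K = rbf_kernel(X, X)                  # n x n Gram matrix

# eigendecomposition of the Gram matrix; eigh returns ascending order
lam, V = np.linalg.eigh(K)
lam, V = lam[::-1], V[:, ::-1]        # sort eigenpairs in descending order

k = 2                                 # embed into the top-k components
train_embed = V[:, :k] * np.sqrt(lam[:k])   # kernel PCA coordinates of training points

def nystrom_embed(Xnew):
    # Nystrom formula: project k(x, .) evaluated on the training set
    # onto the top eigenvectors, scaled by 1/sqrt(eigenvalue)
    Kx = rbf_kernel(Xnew, X)          # (m, n) kernel values against training points
    return Kx @ V[:, :k] / np.sqrt(lam[:k])

# sanity check: on the training points themselves, the Nystrom formula
# reproduces the training embedding exactly (K v = lam v)
assert np.allclose(nystrom_embed(X), train_embed)
```

Applied to genuinely new points, the same formula gives the out-of-sample embedding whose generalization error the paper's experiments measure against perturbations of the training set.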
Indexed in PubMed and other databases.