Manifold Based Local Classifiers: Linear and Nonlinear Approaches
Authors: Hakan Cevikalp, Diane Larlus, Marian Neamtu, Bill Triggs, Frederic Jurie
Affiliation: (1) Electrical and Electronics Engineering Department, Eskisehir Osmangazi University, Meselik, 26480 Eskisehir, Turkey; (2) Learning and Recognition in Vision (LEAR), INRIA, Grenoble, France; (3) Department of Mathematics, Vanderbilt University, Nashville, TN, USA; (4) Laboratoire Jean Kuntzmann, Grenoble, France; (5) University of Caen, Caen, France
Abstract: When data samples are insufficient in high-dimensional classification problems, the sparse scatter of samples tends to have many ‘holes’: regions that have few or no nearby training samples from the class. When such regions lie close to inter-class boundaries, the nearest neighbors of a query may lie in the wrong class, leading to errors in the Nearest Neighbor classification rule. The K-local hyperplane distance nearest neighbor (HKNN) algorithm tackles this problem by approximating each class with a smooth nonlinear manifold that is treated as locally linear. The method exploits the local linearity assumption by using the distances from a query sample to the affine hulls of the query’s nearest neighbors for decision making. However, HKNN is restricted to the Euclidean distance metric, a significant limitation in practice. In this paper we reformulate HKNN in terms of subspaces and propose a variant, the Local Discriminative Common Vector (LDCV) method, that is better suited to classification tasks in which the classes have similar intra-class variations. We then extend both methods to the nonlinear case by mapping the nearest neighbors into a higher-dimensional space where the linear manifolds are constructed. This procedure allows a wide variety of distance functions to be used, while computing distances between the query sample and the nonlinear manifolds remains straightforward owing to the linear nature of the manifolds in the mapped space. We tested the proposed methods on several classification tasks, obtaining better results than both Support Vector Machines (SVMs) and their local counterpart SVM-KNN on the USPS and Image Segmentation databases, and outperforming the local SVM-KNN on the Caltech visual recognition database.
Keywords:
Indexed by: SpringerLink and other databases.
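
As a rough illustration of the local hyperplane idea summarized in the abstract, the NumPy sketch below classifies a query by its distance to the affine hull of its k nearest same-class neighbors. This is a minimal sketch assuming a plain Euclidean metric; it omits the regularization term of the published HKNN algorithm as well as the LDCV and kernelized variants proposed in the paper, and the function name and parameters are illustrative only.

```python
import numpy as np

def hknn_classify(x, X_train, y_train, k=5):
    """Toy sketch of K-local hyperplane distance nearest neighbor (HKNN).

    For each class, the query x is compared against the affine hull of its
    k nearest same-class training points; the class whose hull is closest
    wins. The regularization of the original algorithm is omitted here.
    """
    best_class, best_dist = None, np.inf
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]                        # samples of class c
        # k nearest neighbors of the query within class c (Euclidean)
        idx = np.argsort(np.linalg.norm(Xc - x, axis=1))[:k]
        N = Xc[idx]                                       # (k, d) neighbor matrix
        mu = N.mean(axis=0)                               # reference point on the hull
        D = (N - mu).T                                    # (d, k) direction matrix
        # least-squares projection of (x - mu) onto the span of the directions
        alpha, *_ = np.linalg.lstsq(D, x - mu, rcond=None)
        dist = np.linalg.norm(x - (mu + D @ alpha))       # distance to the affine hull
        if dist < best_dist:
            best_class, best_dist = c, dist
    return best_class
```

For example, hknn_classify(query, X, y, k=10) returns the predicted label. In the nonlinear extension described in the abstract, the Euclidean computations above would instead be carried out in the mapped (kernel-induced) space, where the manifolds remain linear.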