Improved k-nearest neighbor classification

Authors: Yingquan Wu, Krassimir Ianakiev, Venu Govindaraju

Affiliation: Center of Excellence for Pattern Analysis and Recognition, State University of New York at Buffalo, Buffalo, NY 14228, USA

Abstract: k-nearest neighbor (k-NN) classification is a well-known decision rule that is widely used in pattern classification. However, the traditional implementation of this method is computationally expensive. In this paper we develop two effective techniques, namely, template condensing and preprocessing, to significantly speed up k-NN classification while maintaining the level of accuracy. Our template condensing technique aims at "sparsifying" dense homogeneous clusters of prototypes of any single class. This is implemented by iteratively eliminating patterns which exhibit high attractive capacities. Our preprocessing technique filters out a large portion of prototypes that are unlikely to match the unknown pattern. This again accelerates the classification procedure considerably, especially when the dimensionality of the feature space is high. One of our case studies shows that incorporating these two techniques into the k-NN rule achieves a seven-fold speed-up without sacrificing accuracy.
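The two speed-up ideas summarized in the abstract can be sketched as follows. Note that this is a hedged illustration only: the paper's actual "attractive capacity" criterion and its preprocessing filter are not specified in the abstract, so the rules below (drop a prototype whose k nearest neighbors all share its class; skip prototypes that fail a cheap single-coordinate lower-bound test before computing the full distance) are illustrative stand-ins, not the authors' method.

```python
# Hedged sketch of the two ideas in the abstract: "condensing" thins
# dense homogeneous clusters of prototypes; "preprocessing" cheaply
# rejects prototypes before the full distance computation.  Both rules
# here are illustrative stand-ins for the paper's actual criteria.
import math
from collections import Counter

def condense(prototypes, labels, k=3):
    """Sparsify dense homogeneous clusters: repeatedly remove a
    prototype whose k nearest remaining neighbors all carry its own
    class, so only prototypes near class boundaries survive."""
    pts = list(zip(prototypes, labels))
    changed = True
    while changed:
        changed = False
        for i, (p, y) in enumerate(pts):
            others = sorted(
                (math.dist(p, q), lab)
                for j, (q, lab) in enumerate(pts) if j != i
            )
            if len(others) >= k and all(lab == y for _, lab in others[:k]):
                del pts[i]   # deep inside a homogeneous cluster: drop it
                changed = True
                break        # re-scan the shrunken prototype set
    return pts

def knn_classify(pts, x, k=3, reject_radius=None):
    """k-NN majority vote with an optional cheap pre-filter: the gap in
    a single coordinate lower-bounds the Euclidean distance, so clearly
    distant prototypes are skipped before the full distance is computed."""
    scored = []
    for p, y in pts:
        if reject_radius is not None and abs(x[0] - p[0]) > reject_radius:
            continue         # cannot lie within reject_radius of x
        scored.append((math.dist(x, p), y))
    scored.sort(key=lambda t: t[0])
    votes = Counter(y for _, y in scored[:k])
    return votes.most_common(1)[0][0] if votes else None
```

On a toy set with a dense five-point class-'a' cluster and a three-point class-'b' cluster, this condensing pass removes interior 'a' prototypes while keeping those whose neighborhoods mix classes, and the pre-filter then lets classification skip the surviving far-away prototypes entirely.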
Keywords: k-nearest neighbor classification; Pattern classification; Classifier; Template condensing; Preprocessing
|