A new k-harmonic nearest neighbor classifier based on the multi-local means
Affiliation:1. School of Computer Science and Engineering, Northeastern University, Liaoning, China;2. Department of Information Management, School of Management, Shanghai University, Shanghai, China
Abstract:The k-nearest neighbor (KNN) rule is a classical and still very effective nonparametric technique in pattern classification, but its classification performance is severely affected by outliers. The local mean-based k-nearest neighbor classifier (LMKNN) was introduced to achieve robustness against outliers by computing the local mean vector of the k nearest neighbors in each class. However, its performance suffers from the use of a single value of k for each class and the same value of k across different classes. In this paper, we propose a new KNN-based classifier, called the multi-local means-based k-harmonic nearest neighbor (MLM-KHNN) rule. In our method, the k nearest neighbors of the query sample are first found in each class and used to compute k different local mean vectors; the harmonic mean of the distances from the query sample to these local mean vectors is then computed. Finally, MLM-KHNN assigns the query sample to the class with the minimum harmonic mean distance. Experimental results on twenty real-world datasets from the UCI and KEEL repositories demonstrate that the proposed MLM-KHNN classifier achieves a lower classification error rate and is less sensitive to the parameter k than nine related competitive KNN-based classifiers, especially in small training sample size situations.
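The following is a minimal sketch of the decision rule as described in the abstract, not the authors' implementation. It assumes Euclidean distance and assumes that the k local mean vectors of a class are the cumulative means of its 1st, first 2, ..., first k nearest neighbors; the function name mlm_khnn_predict is hypothetical.

```python
import numpy as np

def mlm_khnn_predict(X_train, y_train, x_query, k=5):
    """Hypothetical sketch of the MLM-KHNN rule described in the abstract.

    For each class: find the k nearest training samples to the query,
    build k local mean vectors (assumed here to be the cumulative means
    of the sorted neighbors), and compute the harmonic mean of the
    query's distances to those local means. The query is assigned to
    the class with the smallest harmonic mean distance.
    """
    best_class, best_hmd = None, np.inf
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]
        # Euclidean distances from the query to all samples of class c
        d = np.linalg.norm(Xc - x_query, axis=1)
        nn = Xc[np.argsort(d)[:k]]  # k nearest neighbors within class c
        # k multi-local mean vectors: cumulative means of the sorted neighbors
        local_means = np.cumsum(nn, axis=0) / np.arange(1, len(nn) + 1)[:, None]
        dists = np.linalg.norm(local_means - x_query, axis=1)
        # harmonic mean of the distances (small epsilon avoids division by zero)
        hmd = len(dists) / np.sum(1.0 / (dists + 1e-12))
        if hmd < best_hmd:
            best_class, best_hmd = c, hmd
    return best_class
```

Because the harmonic mean is dominated by its smallest term, a single local mean vector lying close to the query pulls the class score down sharply, which is consistent with the abstract's claim of reduced sensitivity to the choice of k.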