Linear dimensionality reduction by maximizing the Chernoff distance in the transformed space
Authors: Luis Rueda, Myriam Herrera
Affiliation: 1. Department of Computer Science, University of Concepción, Edmundo Larenas 215, Concepción 4030000, Chile; 2. Institute of Informatics, National University of San Juan, Cereceto y Meglioli, San Juan 5400, Argentina
Abstract: Linear dimensionality reduction (LDR) techniques are important in pattern recognition because of their linear time complexity and simplicity. In this paper, we present a novel LDR technique that, although linear, maximizes the Chernoff distance in the transformed space, thereby increasing class separability in that space. We present the corresponding criterion, which is maximized via a gradient-based algorithm, and provide proofs of convergence and of the initialization. We performed a comprehensive performance analysis of our method combined with two well-known classifiers, linear and quadratic, on synthetic and real-life data, and compared it with other LDR techniques. The results on synthetic and standard real-life data sets show that the proposed criterion outperforms the other LDR techniques when combined with both linear and quadratic classifiers.
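As a rough illustration of the idea behind such a criterion (not the authors' exact formulation, initialization, or optimizer), the sketch below computes the Chernoff distance between two Gaussian classes after a linear map y = Ax and searches for a d x n matrix A that locally maximizes it with a generic gradient-based optimizer. The Chernoff parameter beta, the random initialization, the L-BFGS-B optimizer, and the names chernoff_distance_transformed and fit_ldr are all assumptions made for the example.

# Illustrative sketch only: maximizing the two-class Chernoff distance in a
# linearly transformed space; not the exact criterion or algorithm of the paper.
import numpy as np
from scipy.optimize import minimize

def chernoff_distance_transformed(A, m1, S1, m2, S2, beta=0.5):
    """Chernoff distance between N(m1, S1) and N(m2, S2) after the map y = A x."""
    dm = A @ (m1 - m2)
    S1t, S2t = A @ S1 @ A.T, A @ S2 @ A.T
    Sw = beta * S1t + (1.0 - beta) * S2t
    # Quadratic (mean-separation) term plus log-determinant (covariance) term.
    quad = 0.5 * beta * (1.0 - beta) * dm @ np.linalg.solve(Sw, dm)
    logdet = 0.5 * (np.linalg.slogdet(Sw)[1]
                    - beta * np.linalg.slogdet(S1t)[1]
                    - (1.0 - beta) * np.linalg.slogdet(S2t)[1])
    return quad + logdet

def fit_ldr(m1, S1, m2, S2, d, beta=0.5, seed=0):
    """Search for a d x n matrix A that locally maximizes the transformed-space Chernoff distance."""
    n = m1.shape[0]
    rng = np.random.default_rng(seed)
    A0 = rng.standard_normal((d, n))  # random initialization (an assumption for this sketch)
    objective = lambda a: -chernoff_distance_transformed(a.reshape(d, n), m1, S1, m2, S2, beta)
    res = minimize(objective, A0.ravel(), method="L-BFGS-B")  # generic gradient-based optimizer
    return res.x.reshape(d, n)

# Toy usage: two 5-D Gaussian classes reduced to d = 2 dimensions.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    m1, m2 = np.zeros(5), np.array([1.0, 0.5, 0.0, 0.0, 0.0])
    B1, B2 = rng.standard_normal((5, 5)), rng.standard_normal((5, 5))
    S1, S2 = B1 @ B1.T + np.eye(5), B2 @ B2.T + np.eye(5)
    A = fit_ldr(m1, S1, m2, S2, d=2)
    print("Chernoff distance in the reduced space:",
          chernoff_distance_transformed(A, m1, S1, m2, S2))

The objective is non-convex in A, so a gradient-based search of this kind only finds a local maximum; the quality of the result depends on the initialization, which is why the paper's treatment of initialization matters.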
Keywords:
This article is indexed by ScienceDirect and other databases.