Learning the kernel matrix by maximizing a KFD-based class separability criterion
Authors: Dit-Yan Yeung (a), Hong Chang (a,b)
Affiliations: (a) Department of Computer Science and Engineering, Hong Kong University of Science and Technology, Clear Water Bay, Kowloon, Hong Kong
(b) Xerox Research Centre Europe, 6 chemin de Maupertuis, 38240 Meylan, France
Abstract: The performance of a kernel method often depends critically on a proper choice of the kernel function. A promising approach is to learn the kernel from data automatically. In this paper, we propose a novel method for learning the kernel matrix based on maximizing a class separability criterion similar to those used by linear discriminant analysis (LDA) and kernel Fisher discriminant (KFD). Notably, optimizing this criterion does not require inverting the possibly singular within-class scatter matrix, a computational problem encountered by many LDA and KFD methods. Experiments on both synthetic data and real-world data from UCI and FERET show that our method consistently outperforms some previous kernel learning methods.
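To make the abstract's central idea concrete, the following sketch shows one common inversion-free way to evaluate a KFD-style class separability criterion directly from a kernel matrix, using the trace ratio tr(S_b)/tr(S_w) of the feature-space between-class and within-class scatter matrices. This is an illustrative assumption about the general technique, not a reproduction of the paper's specific criterion; the `rbf_kernel` helper and the trace-ratio form are choices made here for the example.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gaussian RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def separability(K, y):
    """Trace-ratio class separability tr(S_b) / tr(S_w), computed from K alone.

    Uses the standard kernel-trick identities for feature-space scatter traces:
      tr(S_t) = sum_i K_ii - (1/n)   * sum_{i,j}           K_ij
      tr(S_w) = sum_c [ sum_{i in c} K_ii - (1/n_c) * sum_{i,j in c} K_ij ]
      tr(S_b) = tr(S_t) - tr(S_w)
    No scatter matrix is ever formed or inverted.
    """
    n = len(y)
    tr_St = np.trace(K) - K.sum() / n
    tr_Sw = 0.0
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        Kc = K[np.ix_(idx, idx)]
        tr_Sw += np.trace(Kc) - Kc.sum() / len(idx)
    tr_Sb = tr_St - tr_Sw
    return tr_Sb / tr_Sw
```

A criterion of this form can then drive kernel learning: candidate kernels (or kernel parameters such as `gamma`) are scored by `separability`, and the one maximizing the criterion is selected, with well-separated class structure in feature space yielding a larger value than mixed labels.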
Keywords:Kernel learning  Fisher discriminant criterion  Kernel Fisher discriminant  Face recognition
This article is indexed in ScienceDirect and other databases.