Maxi-min margin machine: learning large margin classifiers locally and globally.
Authors: K. Huang, H. Yang, I. King, M. R. Lyu
Affiliation: Fujitsu Research and Development Center Co., Ltd., Beijing, China. kzhuang@cn.fujitsu.com
Abstract: In this paper, we propose a novel large margin classifier, called the maxi-min margin machine (M4). This model learns the decision boundary both locally and globally. In comparison, other large margin classifiers construct separating hyperplanes either only locally or only globally. For example, a state-of-the-art large margin classifier, the support vector machine (SVM), considers data only locally, while another significant model, the minimax probability machine (MPM), builds the decision hyperplane exclusively from global information. As a major contribution, we show that SVM yields the same solution as M4 when the data satisfy certain conditions, and that MPM can be regarded as a relaxation of M4. Moreover, based on our proposed local and global view of data, another popular model, linear discriminant analysis, can easily be interpreted and extended as well. We describe the M4 model definition, provide a geometrical interpretation, present theoretical justifications, and propose a practical sequential conic programming method to solve the optimization problem. We also show how to exploit Mercer kernels to extend M4 to nonlinear classification. Furthermore, we perform a series of evaluations on both synthetic data sets and real-world benchmark data sets. Comparison with SVM and MPM demonstrates the advantages of our new model.
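
For reference, the "maxi-min margin machine" described in the abstract can be sketched (with notation assumed here, not quoted from the paper) as maximizing a single margin level rho measured relative to each class's covariance:

    \max_{\rho,\;\mathbf{w}\neq\mathbf{0},\;b}\ \rho
    \quad\text{s.t.}\quad
    \mathbf{w}^\top \mathbf{x}_i + b \ \ge\ \rho\,\sqrt{\mathbf{w}^\top \Sigma_x \mathbf{w}}, \qquad i = 1,\dots,N_x,
    \qquad
    -(\mathbf{w}^\top \mathbf{y}_j + b) \ \ge\ \rho\,\sqrt{\mathbf{w}^\top \Sigma_y \mathbf{w}}, \qquad j = 1,\dots,N_y,

where x_i and y_j are the training points of the two classes and Sigma_x, Sigma_y are their covariance matrices: every point (local information) must clear the boundary by a margin scaled by its class's global spread. For a fixed rho the constraints are second-order cone constraints in (w, b), which is what makes a sequential conic programming solution natural: pick rho, test feasibility, and bisect. The Python sketch below illustrates that idea with cvxpy; the covariance regularization, the normalization ||w|| <= 1, the slack variable t, and the bisection bounds are illustrative assumptions, not the authors' exact procedure.

    # Illustrative sketch only: bisection over rho, each step solving a
    # second-order cone program with cvxpy (not the paper's exact algorithm).
    import numpy as np
    import cvxpy as cp

    def m4_fit(X, Y, rho_hi=10.0, tol=1e-3):
        """X, Y: (n_x, d) and (n_y, d) arrays holding the two classes."""
        d = X.shape[1]
        # Cholesky factors of (regularized) class covariances, so that
        # sqrt(w' Sigma w) = ||L' w||_2 becomes a second-order cone term.
        Lx = np.linalg.cholesky(np.cov(X, rowvar=False) + 1e-6 * np.eye(d))
        Ly = np.linalg.cholesky(np.cov(Y, rowvar=False) + 1e-6 * np.eye(d))

        def feasible(rho):
            # Maximize a common slack t under ||w|| <= 1; t > 0 means the
            # margin level rho is attainable (the constraints are homogeneous
            # in (w, b), so the normalization only fixes the scale).
            w, b, t = cp.Variable(d), cp.Variable(), cp.Variable()
            cons = [cp.norm(w, 2) <= 1,
                    X @ w + b >= rho * cp.norm(Lx.T @ w, 2) + t,
                    -(Y @ w + b) >= rho * cp.norm(Ly.T @ w, 2) + t]
            cp.Problem(cp.Maximize(t), cons).solve()
            return (t.value is not None and t.value > 1e-8), w.value, b.value

        lo, hi, best = 0.0, rho_hi, None
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            ok, w, b = feasible(mid)
            if ok:
                lo, best = mid, (w, b)
            else:
                hi = mid
        return best  # (w, b) for (approximately) the largest feasible rho

A kernelized variant would follow the same pattern after expressing w as a combination of mapped training points through a Mercer kernel, as the abstract notes.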