Functional gradient ascent for Probit regression
Authors: Songfeng Zheng, Weixiang Liu
Affiliation: 1. Department of Mathematics, Missouri State University, 901 S. National Ave., Springfield, MO 65897, USA; 2. Biomedical Engineering Lab, School of Medicine, Shenzhen University, Shenzhen, Guangdong 518060, China
Abstract: This paper proposes two gradient-based methods for fitting a Probit regression model by maximizing the sample log-likelihood function. Using a property of the Hessian of the objective function, the first method performs a weighted least squares regression in each iteration of the Newton–Raphson framework, resulting in ProbitBoost, a boosting-like algorithm. Motivated by the gradient boosting algorithm [10], the second approach maximizes the sample log-likelihood by updating the fitted function a small step in the gradient direction, performing gradient ascent in functional space, resulting in Gradient ProbitBoost. We also generalize the algorithms to multi-class problems via two strategies: the first uses gradient ascent to maximize the multi-class sample log-likelihood, fitting all the classifiers simultaneously; the second uses the one-versus-all scheme to reduce the multi-class problem to a series of binary classification problems. The proposed algorithms are tested on typical classification problems including face detection, cancer classification, and handwritten digit recognition. The results show that, compared to the alternative methods, the proposed algorithms perform similarly or better in terms of test error rates.
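To make the second method concrete, below is a minimal sketch of functional gradient ascent for Probit regression in Python. This is not the authors' implementation: the function names, the decision-stump base learner, and the hyperparameter values (number of rounds, step size) are illustrative assumptions; only the Probit model P(y = 1 | x) = Φ(F(x)) and the pointwise gradient of its log-likelihood follow from the abstract.

```python
import numpy as np
from scipy.stats import norm
from sklearn.tree import DecisionTreeRegressor

def gradient_probit_boost(X, y, n_rounds=100, step=0.1, depth=1):
    """Sketch of gradient ascent on the Probit sample log-likelihood.

    y is in {0, 1}; the fitted function F is initialized to zero and
    moved a small step along the pointwise gradient in each round.
    """
    F = np.zeros(len(y))                  # current fitted values F(x_i)
    learners = []
    for _ in range(n_rounds):
        phi = norm.pdf(F)                 # standard normal density phi(F_i)
        Phi = np.clip(norm.cdf(F), 1e-12, 1 - 1e-12)  # CDF, clipped for stability
        # Pointwise gradient of the sample log-likelihood w.r.t. F(x_i):
        # d/dF [ y log Phi(F) + (1 - y) log(1 - Phi(F)) ]
        grad = y * phi / Phi - (1 - y) * phi / (1 - Phi)
        # Fit a weak regressor (here a stump) to the gradient direction
        h = DecisionTreeRegressor(max_depth=depth).fit(X, grad)
        F += step * h.predict(X)          # small step in the gradient direction
        learners.append(h)
    return learners

def predict_proba(learners, X, step=0.1):
    """Evaluate the ensemble and map through the Probit link."""
    F = sum(step * h.predict(X) for h in learners)
    return norm.cdf(F)                    # P(y = 1 | x) = Phi(F(x))
```

The per-round update mirrors standard gradient boosting: a base learner is fit to the pointwise gradient of the log-likelihood, and the ensemble takes a small step in that direction, which is the "gradient ascent in functional space" the abstract describes.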