Boosting Methods for Regression
Authors: Nigel Duffy, David Helmbold
Affiliation: Computer Science Department, University of California, Santa Cruz, Santa Cruz, CA 95064, USA
Abstract: In this paper we examine ensemble methods for regression that leverage or "boost" base regressors by iteratively calling them on modified samples. The most successful leveraging algorithm for classification is AdaBoost, an algorithm that requires only modest assumptions on the base learning method for its strong theoretical guarantees. We present several gradient descent leveraging algorithms for regression and prove AdaBoost-style bounds on their sample errors using intuitive assumptions on the base learners. We bound the complexity of the regression functions produced in order to derive PAC-style bounds on their generalization errors. Experiments validate our theoretical results.
Keywords: learning, boosting, arcing, ensemble methods, regression, gradient descent
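
As a concrete illustration of the leveraging idea described in the abstract, the sketch below fits an additive regression ensemble by repeatedly training a base regressor on the residuals of the current prediction, i.e. taking gradient-descent steps in function space under squared loss. This is a generic gradient-boosting sketch under assumptions of my own (a decision-stump base learner from scikit-learn and a fixed step size), not the paper's specific algorithms or bounds.

# Minimal sketch of a gradient-descent leveraging (boosting) loop for
# regression with squared loss. Illustrative only; the stump base learner
# and fixed step size are assumptions, not the paper's algorithm.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def boost_regressor(X, y, n_rounds=50, step=0.1):
    """Fit an additive ensemble F(x) = sum_t step * h_t(x) by training each
    base regressor on the current residuals (the negative gradient of
    squared loss with respect to the ensemble's predictions)."""
    ensemble = []
    prediction = np.zeros_like(y, dtype=float)
    for _ in range(n_rounds):
        residuals = y - prediction              # negative gradient of (1/2)(y - F)^2
        h = DecisionTreeRegressor(max_depth=1)  # weak base learner (stump); an assumption
        h.fit(X, residuals)
        ensemble.append(h)
        prediction += step * h.predict(X)       # gradient-descent step in function space
    return ensemble

def predict(ensemble, X, step=0.1):
    """Evaluate the additive ensemble on new inputs."""
    return sum(step * h.predict(X) for h in ensemble)

# Usage on a toy 1-D regression problem.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
model = boost_regressor(X, y)
print("train MSE:", np.mean((y - predict(model, X)) ** 2))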