Quasi-Newton methods: superlinear convergence without line searches for self-concordant functions
Authors: Wenbo Gao, Donald Goldfarb
Affiliation: Department of Industrial Engineering and Operations Research, Columbia University, New York, NY, USA
Abstract: We consider the use of a curvature-adaptive step size in gradient-based iterative methods, including quasi-Newton methods, for minimizing self-concordant functions, extending an approach first proposed for Newton's method by Nesterov. This step size has a simple expression that can be computed analytically; hence, line searches are not needed. We show that using this step size in the BFGS method (and quasi-Newton methods in the Broyden convex class other than the DFP method) results in superlinear convergence for strongly convex self-concordant functions. We present numerical experiments comparing gradient descent and BFGS methods using the curvature-adaptive step size to traditional methods on deterministic logistic regression problems, and to versions of stochastic gradient descent on stochastic optimization problems.
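
The abstract does not reproduce the step-size formula itself. The following is a minimal sketch, assuming the step takes the damped-Newton form t = eta / (delta * (eta + delta)) with eta = -g^T d and delta = (d^T H d)^{1/2} the local Hessian norm of the direction d; this choice minimizes the standard self-concordance upper bound on f(x + t d) and reduces to Nesterov's step 1 / (1 + lambda(x)) for the exact Newton direction. The function names and the demo problem are illustrative, not from the paper, and the exact Hessian is used in place of the paper's quasi-Newton approximation for clarity.

import numpy as np

def adaptive_step(g, d, H):
    # Hypothetical analytic step length t = eta / (delta * (eta + delta)),
    # where eta = -g^T d and delta = sqrt(d^T H d) is the local norm ||d||_x.
    # For the exact Newton direction this equals Nesterov's 1 / (1 + lambda).
    eta = -g @ d
    delta = np.sqrt(d @ H @ d)
    return eta / (delta * (eta + delta))

# Demo on f(x) = c^T x - sum_i log(x_i), a standard self-concordant
# function on the positive orthant, with minimizer x_i = 1 / c_i.
c = np.array([1.0, 2.0, 4.0])
x = np.ones(3)
for _ in range(50):
    g = c - 1.0 / x                  # gradient of f
    if np.linalg.norm(g) < 1e-12:    # converged
        break
    H = np.diag(1.0 / x ** 2)        # Hessian of f
    d = -np.linalg.solve(H, g)       # Newton direction (a quasi-Newton d works too)
    t = adaptive_step(g, d, H)       # analytic step; no line search
    x = x + t * d                    # t * delta < 1, so x stays positive
print(x)                             # approx [1.0, 0.5, 0.25]

Note that t * delta = eta / (eta + delta) < 1 always holds, so the iterate stays strictly inside the Dikin ellipsoid and hence in the domain of f, which is why no line search or safeguard is needed.
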
Keywords: Quasi-Newton methods, BFGS, self-concordant functions, damped Newton method