Stochastic extra-gradient based alternating direction methods for graph-guided regularized minimization
Authors: Qiang Lan, Lin-bo Qiao, Yi-jie Wang
Affiliation: 1. College of Computer, National University of Defense Technology, Changsha, China; 2. National Laboratory for Parallel and Distributed Processing, National University of Defense Technology, Changsha, China
Abstract: In this study, we propose and compare stochastic variants of the extra-gradient alternating direction method, namely the stochastic extra-gradient alternating direction method with the Lagrangian function (SEGL) and the stochastic extra-gradient alternating direction method with the augmented Lagrangian function (SEGAL), to solve large-scale graph-guided optimization problems composed of two convex objective functions. Many important machine learning applications follow the graph-guided optimization formulation, such as linear regression, logistic regression, the Lasso, structured extensions of the Lasso, and structured regularized logistic regression. We conduct experiments on fused logistic regression and graph-guided regularized regression. Experimental results on several types of datasets demonstrate that the proposed algorithms outperform competing algorithms, and that SEGAL performs better than SEGL in practice.
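For context, a common way to write the graph-guided regularized problems targeted here is as a composite of a smooth empirical loss and a non-smooth structured penalty coupled by a linear map; the abstract does not spell out the formulation, so the notation below is an assumption:

\min_{x} \; \frac{1}{n}\sum_{i=1}^{n} \ell_i(x) \;+\; \lambda \,\| F x \|_1

Here each \ell_i is a convex loss (e.g., logistic or squared loss), F is a sparse matrix encoding the edges of a feature graph (F = I recovers the standard Lasso), and \lambda > 0 balances data fit against structured sparsity. Introducing an auxiliary variable y = F x splits the problem into two convex blocks, which is the structure that alternating direction methods, including stochastic extra-gradient variants such as those proposed in this paper, exploit by updating x and y separately.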
This article has been indexed by databases including SpringerLink.