A Parallel Learning Algorithm for Kernel Logistic Regression by Using Fenchel Duality
Cite this article: DING Peng, QING Xiang-Yun, WANG Xing-Yu. A Parallel Learning Algorithm for Kernel Logistic Regression by Using Fenchel Duality [J]. Acta Automatica Sinica, 2011, 37(9): 1145-1150.
Authors: DING Peng  QING Xiang-Yun  WANG Xing-Yu
Affiliation: 1. Department of Automation, East China University of Science and Technology, Shanghai 200237
Funding: Supported by National Natural Science Foundation of China (61074113)
Abstract: A parallel learning algorithm for large-scale kernel logistic regression is presented. Using the Fenchel duality theorem from convex optimization, the primal optimization problem of kernel logistic regression is transformed into an optimization problem in the dual space; a block-update iterative method then allows classifiers to be trained independently on subsets of the training data. A simple client-server parallel computing scheme is designed: each client optimizes a sub-problem on its portion of the data, and after each optimization round the server corrects the objective function of every sub-problem according to the information passed back by the clients. Compared with non-parallel learning algorithms, experimental results on standard datasets demonstrate the feasibility of the proposed parallel algorithm.

Keywords: kernel logistic regression (KLR)  Fenchel duality  large-scale machine learning  convex optimization
Received: 2010-12-06
Revised: 2011-03-02
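
The primal-to-dual transformation the abstract refers to can be written out for regularized KLR; the following is the standard Fenchel dual of the logistic loss (a reconstruction from textbook convex analysis, with our notation, not quoted from the paper):

```latex
% Primal: regularized kernel logistic regression over an RKHS H
\min_{f \in \mathcal{H}} \; \frac{\lambda}{2}\,\|f\|_{\mathcal{H}}^2
  + \sum_{i=1}^{n} \log\!\bigl(1 + e^{-y_i f(x_i)}\bigr)

% Fenchel dual: the conjugate of the logistic loss is a binary entropy,
% so the dual variables live in the box [0,1]^n
\max_{\alpha \in [0,1]^n} \;
  -\sum_{i=1}^{n} \bigl[\alpha_i \log \alpha_i + (1-\alpha_i)\log(1-\alpha_i)\bigr]
  \;-\; \frac{1}{2\lambda} \sum_{i,j=1}^{n} \alpha_i \alpha_j \, y_i y_j \, K(x_i, x_j)
```

Block updates are possible because the entropy term is separable across examples; only the quadratic kernel term couples the blocks, and that coupling is exactly the information a server would redistribute between rounds.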

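
The client-server loop described in the abstract can be sketched as follows. This is a minimal serial simulation (no real networking), not the paper's implementation: each "client" owns one block of dual variables of the KLR Fenchel dual, and the "server" recomputes the cross-block coupling term after every round. The RBF kernel, the damped fixed-point update rule, and all names and parameter values are our illustrative assumptions.

```python
# Hedged sketch: block-wise ascent on the Fenchel dual of kernel logistic
# regression, simulating a client-server scheme serially.
import numpy as np

def rbf_kernel(X, Z, gamma=0.3):
    # K(x, z) = exp(-gamma * ||x - z||^2)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def dual_objective(alpha, Q, lam):
    # D(a) = -sum_i [a_i log a_i + (1-a_i) log(1-a_i)] - a^T Q a / (2*lam),
    # where Q_ij = y_i y_j K(x_i, x_j).
    ent = -(alpha * np.log(alpha) + (1 - alpha) * np.log1p(-alpha)).sum()
    return ent - alpha @ Q @ alpha / (2 * lam)

def client_update(alpha_b, Q_bb, coupling, lam, steps=30, damping=0.7):
    # One client's sub-problem: damped fixed-point iteration on the
    # stationarity condition  a_i = sigmoid(-((Q_bb a)_i / lam + coupling_i)),
    # holding the other blocks (summarized in `coupling`) fixed.
    a = alpha_b.copy()
    for _ in range(steps):
        target = 1.0 / (1.0 + np.exp(Q_bb @ a / lam + coupling))
        a = (1 - damping) * a + damping * target
    return np.clip(a, 1e-9, 1 - 1e-9)

def parallel_klr(X, y, lam=2.0, n_blocks=2, rounds=20, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y)
    Q = (y[:, None] * y[None, :]) * rbf_kernel(X, X)
    alpha = np.full(n, 0.5)                  # dual variables, in (0, 1)
    blocks = np.array_split(rng.permutation(n), n_blocks)
    for _ in range(rounds):
        for b in blocks:                     # each client in turn (serial here)
            rest = np.setdiff1d(np.arange(n), b)
            # "server" message: cross-block part of the quadratic term
            coupling = Q[np.ix_(b, rest)] @ alpha[rest] / lam
            alpha[b] = client_update(alpha[b], Q[np.ix_(b, b)], coupling, lam)
    return alpha, Q

def predict(alpha, y_train, K_new_train, lam=2.0):
    # Dual-to-primal map: f(x) = (1/lam) * sum_i alpha_i y_i K(x_i, x)
    return np.sign(K_new_train @ (alpha * y_train) / lam)
```

In the paper's setting each `client_update` would run on its own machine; only the coupling vectors and the updated block of dual variables would cross the network, which is what makes the scheme attractive at large scale.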
This article is indexed in CNKI and other databases.