Support Vector Machine Theory and Programming-Based Learning Algorithms for Neural Networks
Citation: ZHANG Ling. The Theory of SVM and Programming Based Learning Algorithms in Neural Networks [J]. Chinese Journal of Computers, 2001, 24(2): 113-118.
Author: ZHANG Ling
Affiliation: Institute of Artificial Intelligence, Anhui University
Funding: Supported by the National "973" Program (Grant No. G1998030509)
Abstract: In recent years, support vector machine (SVM) theory has received close attention from researchers abroad, who widely regard it as a new research direction for neural network learning; it has recently begun to attract attention in China as well. This paper studies the relationship between SVM theory and programming-based algorithms for neural networks. First, it shows that Vapnik's SVM-based algorithm and the programming-based neural network algorithm proposed by the author in 1994 are equivalent: when the sample set is linearly separable, both obtain the maximal-margin solution. They differ in complexity, however: the former (usually solved by the Lagrange-multiplier method) has solution complexity that grows exponentially with problem size, whereas the latter's complexity is polynomial in the size. Second, the author reduces the programming algorithm to computing the projection of a point onto a convex set; exploiting this geometric picture, a constructive iterative solution, the "simplex iterative algorithm", is given. The new algorithm is strongly geometrically intuitive; this intuition deepens the understanding of neural network learning (in the linearly separable case) and leads to a necessary and sufficient condition for a sample set to be linearly separable. In addition, the new algorithm yields a very convenient incremental learning algorithm for the knowledge-extension problem. Finally, the paper points out that the principle of "turning the conditions that must be satisfied into constraints of the problem, taking some performance measure of the network as the objective function, and thus reducing network learning to a programming problem" is a highly effective approach to studying neural network learning.
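For the linearly separable case discussed in the abstract, the programming problem that both algorithms solve can be written in the standard maximal-margin form (a textbook formulation of the hard-margin problem, not quoted from the paper itself):

```latex
\min_{w,\,b}\; \tfrac{1}{2}\,\|w\|^{2}
\quad \text{s.t.} \quad y_i \, (w \cdot x_i + b) \ge 1, \qquad i = 1, \dots, m
```

Here $x_i$ are the training samples, $y_i \in \{+1, -1\}$ their labels, and the constraints encode correct classification with functional margin at least 1; minimizing $\|w\|$ then maximizes the geometric margin.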

Keywords: support vector machine; neural network; learning algorithm; simplex iterative algorithm
Revised: July 24, 2000

The Theory of SVM and Programming Based Learning Algorithms in Neural Networks
ZHANG Ling. The Theory of SVM and Programming Based Learning Algorithms in Neural Networks [J]. Chinese Journal of Computers, 2001, 24(2): 113-118.
Authors: ZHANG Ling
Abstract: In recent years, Support Vector Machine (SVM) theory has received wide attention from researchers abroad, who regard it as a new research direction for neural network learning; it has begun to be recognized in China as well. This paper addresses the relationship between SVM-based algorithms and programming-based learning algorithms for neural networks. First, we show that SVM-based algorithms (Vapnik) and the programming-based algorithms we presented in 1994 are equivalent: under the linear-separability condition, both obtain the maximal-margin (optimal) solution. To reach that optimum, however, the computational complexity of the SVM-based algorithms grows exponentially with the training-sample size, since the Lagrange-multiplier method is usually used; the latter has only polynomial complexity. Second, we show that the essence of the programming-based algorithms is to compute the projection of a point onto a convex set. Using this geometric intuition, an iterative algorithm, the simplex iterative algorithm, is presented. The same intuition gives a deeper understanding of neural network learning (under linear separability), from which a necessary and sufficient condition for the linear separability of the training samples is obtained. Finally, we show that the following principle can serve as a new way of solving neural network learning problems: "regard the conditions the network must satisfy as constraints and the performance the network should attain as the objective function; the learning of a neural network can then be treated as a programming problem."
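The "learning as a programming problem" idea can be illustrated with a minimal sketch. This is not the paper's simplex iterative algorithm; it is plain subgradient descent on a penalized hinge objective, which for separable data and a large penalty approximates the maximal-margin solution. The data set and all parameters below are hypothetical.

```python
import numpy as np

# Hypothetical toy data: two linearly separable classes (not from the paper).
X = np.array([[2.0, 0.0], [3.0, 1.0], [-2.0, 0.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

# Approximate the hard-margin programming problem with a penalized hinge loss:
#   minimize 0.5*||w||^2 + C * sum_i max(0, 1 - y_i*(w . x_i + b))
# For separable data and large C, the minimizer approaches the
# maximal-margin hyperplane.
w, b = np.zeros(2), 0.0
C, lr = 10.0, 0.01
for _ in range(2000):
    margins = y * (X @ w + b)
    viol = margins < 1.0                       # points violating the margin
    grad_w = w - C * (y[viol, None] * X[viol]).sum(axis=0)
    grad_b = -C * y[viol].sum()
    w -= lr * grad_w
    b -= lr * grad_b

print("w =", w, "b =", b)
print("functional margins:", y * (X @ w + b))  # all near or above 1
```

By symmetry of this toy set, the learned hyperplane passes near the origin (b stays at 0) and all points end up correctly classified with functional margins close to 1, mimicking the maximal-margin solution the abstract describes.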
Keywords: support vector machine; programming; neural network; learning algorithm
This article has been indexed by CNKI, VIP (Weipu), Wanfang Data, and other databases.