Found 18 similar documents (search time: 46 ms)
1.
A New Sparsification Algorithm for Least Squares Support Vector Machines   Total citations: 1 (self: 1, others: 0)
When the ordinary least squares support vector machine (LS-SVM) sparsification algorithm is applied to some common pattern recognition problems, the recognition rate drops rapidly as training samples are pruned, so sparsification often fails to achieve its purpose. To remedy this deficiency, a new LS-SVM sparsification algorithm is proposed, making the family of LS-SVM sparsification algorithms more complete. The new algorithm is applied to the recognition of one-dimensional radar range profiles, and experimental results confirm its effectiveness.
2.
Research on Least Squares Support Vector Machine Algorithms   Total citations: 17 (self: 0, others: 17)
1 Introduction. Support vector machines (SVMs) are statistical learning methods based on structural risk minimization; they rest on a complete statistical learning theory and show excellent learning performance, and they have been applied effectively in pattern recognition and function estimation (Vapnik, 1995, 1998). On the one hand, the SVM maps data into a high-dimensional space to resolve the linear inseparability of the data in the original space; on the other hand, it classifies data by constructing an optimal separating hyperplane. Neural networks learn from data by gradient-based iteration and easily fall into local minima, whereas the SVM solves a quadratic programming problem to obtain …
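The contrast drawn above (quadratic program for the SVM versus a linear system for the LS-SVM) can be sketched in a few lines. This is a minimal illustration, not code from any of the listed papers: the Gaussian kernel, the parameter names `gamma` and `sigma`, and the simplified one-block system are my own choices.

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    # Gram matrix of the Gaussian RBF kernel between row sets X and Z
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Train an LS-SVM by solving one linear system -- no QP solver needed."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    # Bordered system: [[0, 1^T], [1, K + I/gamma]] @ [b; alpha] = [0; y]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, support values alpha

def lssvm_predict(X_train, alpha, b, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b
```

Note that every training sample receives a (generally nonzero) support value `alpha`, which is exactly the loss of sparseness that the sparsification papers in this list try to repair.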
3.
To address the lack of sparseness in least squares support vector machines, an LS-SVM algorithm based on boundary samples is proposed. The algorithm uses the center distance ratio to select boundary samples with large support as training samples, which reduces the number of support vectors and speeds up the algorithm. Experiments on four UCI data sets show that a sparse solution is obtained with almost no loss of accuracy, and the recognition speed of the algorithm improves noticeably.
4.
The least squares support vector machine obtains the optimal separating hyperplane by solving a set of linear equations rather than a convex quadratic program; however, it loses the sparseness of the solution, so when the number of training samples is large the computational cost becomes very high. A fast LS-SVM algorithm is proposed that preserves the generalization ability of the support vector machine while improving speed, with the advantage growing as the training set gets larger. The new algorithm selects the samples with large support values as training samples, reducing the training set size and speeding up the algorithm; the LS-SVM is then trained on this subset to obtain an approximate optimal solution. Experimental results show that the new algorithm indeed trains faster.
5.
6.
An Efficient Pruning Algorithm for Least Squares Support Vector Machine Classifiers   Total citations: 2 (self: 0, others: 2)
To address the loss of sparseness in least squares support vector machines, an efficient pruning algorithm is proposed. To avoid solving the initial system of linear equations, a bottom-up strategy is adopted: during training, block incremental learning and decremental learning alternate according to certain pruning conditions, and a small support vector set forms automatically, from which the final classifier is constructed. To test its effectiveness, the new algorithm was applied to five UCI data sets. The results show that with an increment block size of 2, the pruning algorithm obtains a sparse solution with almost no loss of accuracy; moreover, it is faster than the SMO algorithm. The new algorithm applies not only to LS-SVM classifiers but also extends to LS-SVM regression.
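Several of the pruning papers in this list build on the same classic loop: repeatedly discard the samples with the smallest |support value| and retrain on the remainder. The sketch below shows that generic loop only; the block-incremental/decremental strategy of this particular paper is more refined. The `train` callback (returning `(b, alpha)`), `keep`, and `drop_frac` are placeholders of my own.

```python
import numpy as np

def prune_lssvm(X, y, train, keep=0.5, drop_frac=0.1):
    """Iteratively drop the samples with the smallest |support value|
    and retrain, until only a `keep` fraction of the data remains
    (sketch of the classic LS-SVM pruning idea)."""
    idx = np.arange(len(y))
    target = max(2, int(keep * len(y)))
    b, alpha = train(X[idx], y[idx])
    while len(idx) > target:
        n_drop = max(1, int(drop_frac * len(idx)))
        order = np.argsort(np.abs(alpha))   # smallest support values first
        idx = idx[np.sort(order[n_drop:])]  # keep the rest, in original order
        b, alpha = train(X[idx], y[idx])    # retrain on the pruned set
    return idx, b, alpha
```

The surviving indices form the small support vector set from which the final, sparse classifier is built.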
7.
8.
A Fast Sparse Least Squares Support Vector Regression Machine   Total citations: 4 (self: 0, others: 4)
Applying Jiao's method directly to least squares support vector regression gives unsatisfactory results. An improved Jiao method is therefore proposed, using a partial-discard strategy instead of discarding samples outright, and applied to LS-SVM regression. Tests on benchmark data sets show that the sparse LS-SVM regressor based on the improved Jiao method gains an advantage in both the number of support vectors and the training time. Compared with other pruning algorithms, the improved Jiao method greatly shortens training time without sacrificing regression accuracy. The improved method is equally applicable to classification problems.
9.
Research on Online Sparse Least Squares Support Vector Machine Regression   Total citations: 6 (self: 0, others: 6)
Existing least squares support vector regression requires considerable time for training and for computing model outputs, making it unsuitable for online real-time training. An online sparse LS-SVM regression is therefore proposed: its training algorithm maintains a sample dictionary, reducing the computation over training samples. Samples are added sequentially, which suits online acquisition, and the algorithm is provably convergent. Simulation results show good sparseness and real-time performance, so the algorithm can be applied further to modeling and real-time control.
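The abstract does not say how the dictionary decides whether to admit a sample. A common criterion in online kernel methods, assumed here purely for illustration, is the approximate-linear-dependence (ALD) test: a sample enters the dictionary only if its feature-space image is not well approximated by the current dictionary. The threshold `nu` is hypothetical.

```python
import numpy as np

def build_dictionary(X, kernel, nu=0.1):
    """Sequentially build a sparse sample dictionary with an ALD test
    (a common choice, assumed here; the paper's criterion may differ)."""
    dict_idx = [0]  # the first sample always seeds the dictionary
    for t in range(1, len(X)):
        # Kernel matrix over the current dictionary and cross-kernel vector
        K = np.array([[kernel(X[i], X[j]) for j in dict_idx] for i in dict_idx])
        k_vec = np.array([kernel(X[i], X[t]) for i in dict_idx])
        a = np.linalg.solve(K, k_vec)
        # Residual of approximating x_t by the dictionary in feature space
        delta = kernel(X[t], X[t]) - k_vec @ a
        if delta > nu:            # novel enough: admit into the dictionary
            dict_idx.append(t)
    return dict_idx
```

Training then runs only over the dictionary, so the per-step cost depends on the (small) dictionary size rather than on the full stream length.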
10.
A Sparse Least Squares Support Vector Classifier   Total citations: 1 (self: 0, others: 1)
A general support vector classifier requires solving a quadratic programming problem; the least squares support vector machine needs only a system of linear equations, but it lacks sparseness. To improve the LS-SVM classifier, this paper combines the center distance ratio with the idea of incremental learning and proposes a sparse LS-SVM based on pre-selecting and then screening support vectors. The method restores the sparseness of the LS-SVM, reduces storage and computation, and speeds up both training and decision-making; it also corrects the shift of the separating surface caused by imbalanced training data, without impairing the classification ability of the LS-SVM. Three sets of experiments confirm these claims.
11.
In this Letter an efficient recursive update algorithm for least squares support vector machines (LS-SVMs) is developed. Using the previous solution and some matrix equations, the algorithm completely avoids retraining the LS-SVM from scratch whenever a new training sample becomes available. The gain in speed from the recursive update algorithm is illustrated on four data sets from the UCI repository: the Statlog Australian credit, the Pima Indians diabetes, the Wisconsin breast cancer, and the adult income data sets.
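The "matrix equations" behind such recursive updates are presumably the bordered-matrix (Schur complement) identities: when one sample is added, the LS-SVM system matrix grows by one row and column, and its inverse can be updated in O(n²) instead of re-inverted in O(n³). A sketch of that idea, under that assumption:

```python
import numpy as np

def add_sample_inverse(A_inv, b_col, d):
    """Given A^{-1} for a symmetric matrix A, return the inverse of the
    bordered matrix [[A, b], [b^T, d]] via the Schur complement
    (O(n^2) update instead of a full O(n^3) re-inversion)."""
    u = A_inv @ b_col        # A^{-1} b
    s = d - b_col @ u        # Schur complement (a scalar)
    n = A_inv.shape[0]
    out = np.empty((n + 1, n + 1))
    out[:n, :n] = A_inv + np.outer(u, u) / s
    out[:n, n] = -u / s
    out[n, :n] = -u / s
    out[n, n] = 1.0 / s
    return out
```

Applying this to the LS-SVM system matrix each time a sample arrives yields the new support values directly from the old inverse, which is the source of the reported speed gain.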
12.
Digital Least Squares Support Vector Machines   Total citations: 1 (self: 0, others: 1)
This paper presents a very simple digital architecture that implements a least squares support vector machine. The simplicity of the whole system and its good behavior on classification problems hold good prospects for applying this kind of learning machine to build embedded systems.
13.
The wideband direction-finding characteristics of an antenna array are very complex, so estimating the direction of arrival (DOA) with intelligent learning methods runs into a complex learning problem over massive data. An LS-SVM is used to build the DOA estimation model: its support vectors are sparsified, the support vectors with high support are used as training samples, and a second round of learning captures the array's complex direction-finding behavior, achieving wideband DOA estimation. Experimental results show that this secondary learning on the sparsified support vectors significantly improves DOA estimation accuracy and has great application value for wideband DOA estimation.
14.
LI Li-Juan, SU Hong-Ye, CHU Jian, 《自动化学报》 (Acta Automatica Sinica), 2007, 33(11): 1182-1188
This paper proposes a practical generalized predictive control (GPC) algorithm based on online least squares support vector machines (LS-SVM) which can deal with nonlinear systems effectively. At each sampling period the algorithm recursively modifies the model by adding a new data pair and, for real-time considerations, deleting the least important one; the data pair to delete is determined by the absolute value of its Lagrange multiplier from the previous sampling period. The paper gives the recursive update of the model parameters for adding a new data pair and for deleting an existing one, so the inversion of a large matrix is avoided and the memory footprint is fully controlled by the algorithm. The nonlinear LS-SVM model is applied in the GPC algorithm at each sampling period. Experiments with generalized predictive control of a pH neutralization process show the effectiveness and practicality of the proposed algorithm.
15.
Benchmarking Least Squares Support Vector Machine Classifiers   Total citations: 16 (self: 0, others: 16)
van Gestel Tony, Suykens Johan A.K., Baesens Bart, Viaene Stijn, Vanthienen Jan, Dedene Guido, de Moor Bart, Vandewalle Joos, Machine Learning, 2004, 54(1): 5-32
In Support Vector Machines (SVMs), the solution of the classification problem is characterized by a (convex) quadratic programming (QP) problem. In a modified version of SVMs, called Least Squares SVM classifiers (LS-SVMs), a least squares cost function is proposed so as to obtain a linear set of equations in the dual space. While the SVM classifier has a large margin interpretation, the LS-SVM formulation is related in this paper to a ridge regression approach for classification with binary targets and to Fisher's linear discriminant analysis in the feature space. Multiclass categorization problems are represented by a set of binary classifiers using different output coding schemes. While regularization is used to control the effective number of parameters of the LS-SVM classifier, the sparseness property of SVMs is lost due to the choice of the 2-norm. Sparseness can be imposed in a second stage by gradually pruning the support value spectrum and optimizing the hyperparameters during the sparse approximation procedure. In this paper, twenty public domain benchmark datasets are used to evaluate the test set performance of LS-SVM classifiers with linear, polynomial and radial basis function (RBF) kernels. Both the SVM and LS-SVM classifier with RBF kernel in combination with standard cross-validation procedures for hyperparameter selection achieve comparable test set performances. These SVM and LS-SVM performances are consistently very good when compared to a variety of methods described in the literature including decision tree based algorithms, statistical algorithms and instance based learning methods. We show on ten UCI datasets that the LS-SVM sparse approximation procedure can be successfully applied.
16.
Generalized Predictive Control Based on Online Least Squares Support Vector Machines   Total citations: 5 (self: 0, others: 5)
17.
18.