Similar Articles
1.
2.
An Improved Online Least Squares Support Vector Machine Regression Algorithm
To address the drawbacks of the standard least squares support vector machine on large-scale data sets, namely slow training, heavy computation, and difficulty with online training, a modified forgetting-factor rectangular-window method is combined with the support vector machine, and an online least squares support vector machine regression algorithm based on the improved forgetting-factor rectangular-window scheme is proposed. The algorithm emphasizes the data in the current window while still accounting for the influence of historical data. It reduces the computational load and improves online identification accuracy. A simulation example demonstrates the effectiveness of the method.
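Illustrative sketch (not the paper's exact algorithm): one way to realize a forgetting-factor rectangular window is to keep only the most recent W samples and give older samples within the window geometrically decaying weights in a weighted LS-SVR solve. The window length W, forgetting factor lam, kernel width, and toy data below are assumptions for demonstration.

```python
import numpy as np

def rbf(A, B, sigma=1.0):
    """RBF kernel matrix between row-sample matrices A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def windowed_lssvr(X, y, gamma=100.0, lam=0.95, sigma=1.0):
    """Weighted LS-SVR on the samples of the current rectangular window.
    Older samples get geometrically smaller weights (forgetting factor lam),
    so the fit emphasizes recent data while still using the windowed history."""
    n = len(y)
    w = lam ** np.arange(n - 1, -1, -1)            # newest sample gets weight 1
    K = rbf(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = K + np.diag(1.0 / (gamma * w))     # weighted regularization
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    b, alpha = sol[0], sol[1:]
    return lambda Xq: rbf(Xq, X, sigma) @ alpha + b

# Sliding-window online use: drop the oldest sample, append the newest one.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 300)
y_all = np.sin(t) + 0.05 * rng.standard_normal(t.size)
W = 50                                             # assumed window length
for k in range(W, W + 10):                         # a few online steps
    model = windowed_lssvr(t[k - W:k, None], y_all[k - W:k])
    pred = model(t[k:k + 1, None])[0]
    print(f"t={t[k]:.2f}  predicted={pred:+.3f}  measured={y_all[k]:+.3f}")
```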

3.
Just-in-Time Recursive Kernel Learning for Online Identification of Time-Varying Processes
To track the time-varying behavior of nonlinear chemical processes in a timely manner, a just-in-time recursive kernel learning (KL) method for online identification is proposed. For each new query sample, a just-in-time learning (JITL) strategy is adopted: a cumulative similarity factor is constructed and the samples most similar to the query are selected to build a kernel learning identification model. To avoid rebuilding a model from scratch for every query point, as conventional just-in-time learning does, the method exploits the overlap between the similar-sample sets of two adjacent time instants, recursively adding new samples to and removing obsolete samples from the previous model so that the new just-in-time model can be built quickly. Online identification of a time-varying continuous stirred tank reactor process shows that, while remaining computationally efficient, the proposed method identifies the time-varying process more accurately than conventional recursive kernel learning.
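A simplified sketch of the just-in-time step only: for each query, the most similar stored samples are selected and a local LS-SVR is fitted on them. The Euclidean similarity measure, neighbourhood size, and solver settings are assumptions; the paper's recursive update between adjacent queries is not reproduced here.

```python
import numpy as np

def rbf(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvr_fit(X, y, gamma=50.0, sigma=1.0):
    """Standard LS-SVR: one linear system instead of a QP."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    b, alpha = sol[0], sol[1:]
    return lambda Xq: rbf(Xq, X, sigma) @ alpha + b

def jit_predict(X_hist, y_hist, x_query, n_local=30):
    """Just-in-time prediction: build a local model from the stored samples
    most similar (here: closest in Euclidean distance) to the query."""
    d = np.linalg.norm(X_hist - x_query, axis=1)
    idx = np.argsort(d)[:n_local]
    local = lssvr_fit(X_hist[idx], y_hist[idx])
    return local(x_query[None, :])[0]

rng = np.random.default_rng(1)
X_hist = rng.uniform(-3, 3, size=(400, 1))
y_hist = np.sinc(X_hist[:, 0]) + 0.02 * rng.standard_normal(400)
print(jit_predict(X_hist, y_hist, np.array([0.5])))   # local prediction at x = 0.5
```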

4.
Multi-grade processes play an important role in the fine chemical and polymer industries. An integrated nonlinear soft sensor modeling method is proposed for online quality prediction of multi-grade processes. Several single least squares support vector regression (LSSVR) models are first built, one for each product grade. For online prediction of a new sample, a probabilistic analysis approach using the statistical properties of the steady-state grades is presented. If the probability that the sample belongs to a particular steady-state grade is large enough, the prediction is obtained from the corresponding LSSVR model. Otherwise, the query sample is considered to lie in a transitional mode because it is not similar to any steady-state grade. In this situation, a just-in-time LSSVR (JLSSVR) model is constructed from the most similar samples around it. To improve the efficiency of searching for similar samples for JLSSVR, a strategy that exploits the characteristics of multi-grade processes is proposed. Additionally, the similarity factor and similar samples of JLSSVR can be determined adaptively using a fast cross-validation strategy with low computational load. The superiority of the proposed soft sensor is first demonstrated on a simulation example; it is then compared with other soft sensors for online prediction of melt index in an industrial plant in Taiwan.
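A hypothetical sketch of the routing logic described above: each steady-state grade is summarized by a simple diagonal Gaussian, the query is sent to that grade's model when its estimated density is high enough, and otherwise falls back to a just-in-time model. The Gaussian description, the 0.01 threshold, and the placeholder models are assumptions, not the paper's probabilistic analysis.

```python
import numpy as np

class GradeRouter:
    """Route a query either to a steady-state grade model or to a
    just-in-time fallback, based on a simple Gaussian grade probability.
    The diagonal-Gaussian grade description and the density threshold
    are illustrative assumptions."""

    def __init__(self, grade_data, grade_models, jit_model, threshold=0.01):
        self.stats = {g: (X.mean(0), X.std(0) + 1e-9) for g, X in grade_data.items()}
        self.grade_models = grade_models    # dict: grade -> prediction function
        self.jit_model = jit_model          # fallback for transitional samples
        self.threshold = threshold

    def _density(self, x, mu, sd):
        z = (x - mu) / sd
        return np.exp(-0.5 * np.sum(z ** 2)) / np.prod(np.sqrt(2 * np.pi) * sd)

    def predict(self, x):
        dens = {g: self._density(x, mu, sd) for g, (mu, sd) in self.stats.items()}
        best = max(dens, key=dens.get)
        if dens[best] >= self.threshold:            # clearly inside a grade
            return self.grade_models[best](x)
        return self.jit_model(x)                    # transitional mode

# Toy usage with placeholder per-grade models.
rng = np.random.default_rng(2)
grades = {"A": rng.normal(0, 1, (200, 3)), "B": rng.normal(5, 1, (200, 3))}
models = {"A": lambda x: 1.0, "B": lambda x: 2.0}
router = GradeRouter(grades, models, jit_model=lambda x: -1.0)
print(router.predict(np.zeros(3)), router.predict(np.full(3, 2.5)))
```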

5.
张淑娟, 邓秀勤, 刘波. 《计算机科学》, 2017, 44(Z6): 119-122
Tax revenue prediction is complicated by nonlinearity, instability, and the influence of multiple economic factors. A least squares support vector regression approach is therefore proposed to predict the tax revenue of Conghua, Guangdong Province, and a mathematical model is established. Because the parameters C and σ² directly affect the prediction performance of the support vector machine, the idea of particle swarm optimization is incorporated and the parameters are tuned by a particle swarm algorithm to ensure the accuracy and stability of the prediction model. Simulation results show that, compared with the reference models, the least squares support vector regression model with PSO-tuned parameters achieves significantly higher prediction accuracy, demonstrating its effectiveness and practicality.
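A minimal sketch of PSO-tuned LS-SVR hyper-parameters, assuming a plain global-best PSO, a hold-out validation objective, and toy data in place of the tax-revenue series; swarm size, coefficients, and search ranges are illustrative.

```python
import numpy as np

def rbf(A, B, sigma2):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma2))

def lssvr_val_error(params, Xtr, ytr, Xva, yva):
    """Validation RMSE of an LS-SVR with hyper-parameters (C, sigma^2)."""
    C, sigma2 = params
    n = len(ytr)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = rbf(Xtr, Xtr, sigma2) + np.eye(n) / C
    sol = np.linalg.solve(A, np.concatenate(([0.0], ytr)))
    b, alpha = sol[0], sol[1:]
    pred = rbf(Xva, Xtr, sigma2) @ alpha + b
    return np.sqrt(np.mean((pred - yva) ** 2))

def pso(objective, lo, hi, n_particles=20, iters=40, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain global-best particle swarm optimization inside the box [lo, hi]."""
    rng = np.random.default_rng(seed)
    dim = len(lo)
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, pbest_f.min()

# Toy data standing in for the yearly tax-revenue series.
rng = np.random.default_rng(3)
X = rng.uniform(0, 4, (60, 2))
y = X[:, 0] * np.sin(X[:, 1]) + 0.05 * rng.standard_normal(60)
obj = lambda p: lssvr_val_error(p, X[:40], y[:40], X[40:], y[40:])
best, err = pso(obj, lo=np.array([1.0, 0.01]), hi=np.array([1000.0, 10.0]))
print("best (C, sigma^2):", best, " validation RMSE:", round(err, 4))
```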

6.
Support Vector Machine Based Identification of Separable Nonlinear Dynamic Systems
张莉, 席裕庚. 《自动化学报》, 2005, 31(6): 965-969
For nonlinear dynamic system models whose state variables and control variables are separable, the regression estimation model of the standard support vector machine is redesigned by introducing two nonlinear kernel functions, making it suitable for identifying such nonlinear dynamic systems. The resulting model contains two nonlinear functions, one of the state variables and one of the control variables, which identify the two nonlinear functions of the separable-variable nonlinear dynamic system. Simulation experiments verify the effectiveness of the algorithm for nonlinear dynamic system identification.
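The paper's reformulated SVR is not reproduced here; the sketch below only illustrates the separable structure y ≈ f(x) + g(u) by fitting an additive model with two kernel expansions (one over states, one over inputs) in a single ridge-regularized least-squares solve. Kernel choices, the regularization weight, and the toy system are assumptions.

```python
import numpy as np

def rbf(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_separable(Xstate, U, y, lam=1e-3, s1=1.0, s2=1.0):
    """Additive kernel model y ≈ f(x) + g(u): one kernel expansion over the
    state samples, a second one over the input samples, plus a bias, all
    estimated in a single ridge-regularized least-squares solve."""
    K1, K2 = rbf(Xstate, Xstate, s1), rbf(U, U, s2)
    n = len(y)
    Phi = np.hstack([K1, K2, np.ones((n, 1))])          # [f-part | g-part | bias]
    theta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)
    a, b, c = theta[:n], theta[n:2 * n], theta[-1]
    return lambda xq, uq: rbf(xq, Xstate, s1) @ a + rbf(uq, U, s2) @ b + c

# Toy separable system: y(k+1) = sin(y(k)) + u(k)**2 + noise
rng = np.random.default_rng(4)
u = rng.uniform(-1, 1, 200)
ys = np.zeros(201)
for k in range(200):
    ys[k + 1] = np.sin(ys[k]) + u[k] ** 2 + 0.01 * rng.standard_normal()
Xstate, U, y = ys[:200, None], u[:, None], ys[1:]
model = fit_separable(Xstate, U, y)
print(model(np.array([[0.3]]), np.array([[0.5]])))   # roughly sin(0.3) + 0.25
```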

7.
An Online Learning Algorithm for the Least Squares Twin Support Vector Machine
An online learning algorithm is proposed for the least squares twin support vector machine, which classifies with two nonparallel hyperplanes. By applying the matrix inversion lemma, the proposed online algorithm makes full use of previous training results and avoids inverting large matrices, which reduces the computational complexity. Simulation results verify the effectiveness of the proposed learning algorithm.
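The key trick named in the abstract, the matrix inversion lemma, can be illustrated with the Sherman-Morrison identity: when a new data row is appended, the inverse of the regularized Gram matrix is refreshed from the previous inverse instead of being recomputed. The sketch shows that rank-one update in isolation, not the full LS-TWSVM derivation; the regularization value and data are assumptions.

```python
import numpy as np

def sherman_morrison_add(P, g):
    """Given P = inv(G'G + eps*I), return the inverse after a new data row g
    is appended to G, i.e. inv(G'G + g g' + eps*I), without re-inverting."""
    Pg = P @ g
    return P - np.outer(Pg, Pg) / (1.0 + g @ Pg)

rng = np.random.default_rng(5)
d, eps = 4, 1e-2
G = rng.standard_normal((50, d))
P = np.linalg.inv(G.T @ G + eps * np.eye(d))     # initial batch inverse

# Online phase: absorb new rows one at a time with rank-one updates.
for _ in range(20):
    g_new = rng.standard_normal(d)
    G = np.vstack([G, g_new])
    P = sherman_morrison_add(P, g_new)

# The recursively maintained inverse matches a fresh batch inversion.
print(np.allclose(P, np.linalg.inv(G.T @ G + eps * np.eye(d))))   # True
```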

8.
丁世飞, 黄华娟. 《软件学报》, 2017, 28(12): 3146-3155
The twin parametric insensitive support vector regression machine (TPISVR) is a recently proposed machine learning method. Compared with other regression methods, TPISVR has a unique advantage in handling heteroscedastic noise. Training a standard TPISVR amounts to solving a pair of quadratic programming problems with inequality constraints in the dual space, which is time-consuming. This paper introduces the least squares idea, converts the two quadratic programs of TPISVR into two systems of linear equations solved directly in the primal space, and proposes the least squares twin parametric insensitive support vector regression machine (LSTPISVR). To address the parameter selection problem of LSTPISVR, a chaotic cuckoo search optimization algorithm is proposed and used to select its parameters. Experiments on artificial and UCI data sets show that LSTPISVR runs considerably faster than TPISVR with no loss of accuracy.

9.
Application of the Least Squares Support Vector Machine to Fault Diagnosis
To improve the accuracy of fault diagnosis for mechanical equipment, wavelet packet analysis is combined with the least squares support vector machine. The power spectrum of the fault signal is first decomposed with wavelets, which simplifies the extraction of fault feature vectors. A fault diagnosis model based on the least squares support vector machine is then proposed: the insensitive loss function of the support vector machine is replaced with a quadratic loss function and the inequality constraints are replaced with equality constraints, so that the quadratic programming problem becomes the solution of a system of linear equations and the support vector machine is implemented by least squares. The σ parameter of the kernel function is selected dynamically, which improves diagnosis accuracy. Simulation results show that the model has strong nonlinear processing capability and noise immunity.
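A sketch of the LS-SVM core described above (quadratic loss plus equality constraints reduce the usual QP to one linear system), with the kernel width σ picked by a simple hold-out search. The wavelet-packet feature extraction is not shown; the toy features, σ grid, and γ value are assumptions.

```python
import numpy as np

def rbf(A, B, sigma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_classifier(X, y, gamma=10.0, sigma=1.0):
    """LS-SVM classifier: equality constraints and quadratic loss reduce the
    usual QP to a single linear system in (b, alpha).  Labels y are +/-1."""
    n = len(y)
    Omega = (y[:, None] * y[None, :]) * rbf(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], np.ones(n))))
    b, alpha = sol[0], sol[1:]
    return lambda Xq: np.sign(rbf(Xq, X, sigma) @ (alpha * y) + b)

def pick_sigma(X, y, grid=(0.2, 0.5, 1.0, 2.0, 5.0)):
    """'Dynamic' sigma selection by hold-out accuracy over a small grid."""
    ntr = int(0.7 * len(y))
    best, best_acc = grid[0], -1.0
    for s in grid:
        f = lssvm_classifier(X[:ntr], y[:ntr], sigma=s)
        acc = np.mean(f(X[ntr:]) == y[ntr:])
        if acc > best_acc:
            best, best_acc = s, acc
    return best, best_acc

# Toy two-class 'fault features' standing in for wavelet-packet energies.
rng = np.random.default_rng(6)
X = np.vstack([rng.normal(0, 1, (80, 4)), rng.normal(2, 1, (80, 4))])
y = np.hstack([-np.ones(80), np.ones(80)])
perm = rng.permutation(160)
X, y = X[perm], y[perm]
print(pick_sigma(X, y))
```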

10.
LSSVM-Based Modeling of the Wood Drying Process
To cope with the strong nonlinearity of the wood drying process, a drying-schedule model based on the least squares support vector machine (LSSVM) is proposed. Simulation experiments using data collected during actual drying runs in a small experimental wood drying kiln as training samples show that the predictions of the LSSVM-based model accurately reflect the change of wood moisture content during drying. The model has a simple structure, high prediction accuracy, and strong generalization ability, which verifies that LSSVM is a feasible and effective approach for modeling the wood drying process.

11.
Wavelet theory has a profound impact on signal processing, as it offers a rigorous mathematical framework for the treatment of multiresolution problems. The combination of soft computing and wavelet theory has led to a number of new techniques. On the other hand, as a new generation of learning algorithms, support vector regression (SVR) was developed by Vapnik et al., in which the ε-insensitive loss function was defined as a trade-off between the robust loss function of Huber and one that enables sparsity within the support vectors. The support vector kernel expansion also provides a potential avenue for representing nonlinear dynamical systems and underpinning advanced analysis. However, for support vector regression with the standard quadratic programming technique, the implementation is computationally expensive and sufficient model sparsity cannot be guaranteed. In this article, from the perspective of model sparsity, a linear programming support vector regression (LP-SVR) with a wavelet kernel is proposed, and the connection between LP-SVR with a wavelet kernel and wavelet networks is analyzed. In particular, the potential of LP-SVR for nonlinear dynamical system identification is investigated.

12.
A predictive control algorithm based on a least squares support vector machine (LS-SVM) model is presented for a class of complex systems with strong nonlinearity. The nonlinear off-line model of the controlled plant is built by LS-SVM with a radial basis function (RBF) kernel. While the system is running, the off-line model is linearized at each sampling instant, and the generalized predictive control (GPC) algorithm is employed to implement predictive control of the plant. The algorithm is applied to a boiler temperature control system with complicated nonlinearity and a large time delay. Experimental results verify the effectiveness and merit of the algorithm.

13.
An Incremental Learning Strategy for Support Vector Regression
The support vector machine (SVM) provides good generalization performance but suffers from a large amount of computation. This paper presents an incremental learning strategy for support vector regression (SVR). The new method first formulates an explicit expression of ||W||² by constructing an orthogonal basis in feature space together with a basic Hilbert space identity, and then finds the regression function by minimizing this expression of ||W||² rather than solving a convex programming problem. In particular, the minimization of ||W||² is combined with kernel selection, which can lead to good generalization performance. The presented method not only provides a novel way for incremental SVR learning but also opens an opportunity for model selection of SVR. An artificial data set, a benchmark data set, and a real-world data set are employed to evaluate the method. The simulations support the feasibility and effectiveness of the proposed approach.

14.
A Dynamic Unbiased LS-SVM Learning Algorithm Based on Cholesky Decomposition
蔡艳宁, 胡昌华. 《控制与决策》, 2008, 23(12): 1363-1367
To address the computational complexity of the least squares support vector machine in online modeling, a dynamic unbiased least squares support vector regression model is proposed. The model eliminates the bias term by modifying the form of the structural risk of the standard least squares support vector machine, yielding an unbiased least squares support vector machine and simplifying the solution of the regression coefficients. Exploiting the structure of the kernel matrix as the model evolves, an online learning algorithm based on Cholesky decomposition is designed. The algorithm makes full use of previous training results and reduces computational complexity. Simulation experiments demonstrate the effectiveness of the proposed model.
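The paper's bias-free reformulation is not reproduced here; the sketch below shows only the Cholesky bookkeeping such online schemes rely on: when a sample is appended, the factor of the regularized kernel matrix is extended with one triangular solve instead of a full refactorization. The kernel, γ, and data are assumptions.

```python
import numpy as np
from scipy.linalg import solve_triangular

def rbf_vec(X, x, sigma=1.0):
    return np.exp(-((X - x) ** 2).sum(1) / (2.0 * sigma ** 2))

def chol_append(L, X, x_new, gamma=100.0, sigma=1.0):
    """Extend the Cholesky factor L of A = K(X,X) + I/gamma after appending
    x_new, using one triangular solve instead of a full refactorization."""
    k = rbf_vec(X, x_new, sigma)                    # kernel against old samples
    ell = solve_triangular(L, k, lower=True)
    d = np.sqrt(1.0 + 1.0 / gamma - ell @ ell)      # k(x,x) = 1 for the RBF kernel
    n = L.shape[0]
    L_new = np.zeros((n + 1, n + 1))
    L_new[:n, :n] = L
    L_new[n, :n] = ell
    L_new[n, n] = d
    return L_new

rng = np.random.default_rng(7)
X = rng.standard_normal((30, 2))
gamma, sigma = 100.0, 1.0
A = np.exp(-((X[:, None] - X[None]) ** 2).sum(-1) / (2 * sigma ** 2)) + np.eye(30) / gamma
L = np.linalg.cholesky(A)

x_new = rng.standard_normal(2)
L = chol_append(L, X, x_new, gamma, sigma)
X = np.vstack([X, x_new])

# The incrementally extended factor matches a from-scratch factorization.
A_full = np.exp(-((X[:, None] - X[None]) ** 2).sum(-1) / (2 * sigma ** 2)) + np.eye(31) / gamma
print(np.allclose(L @ L.T, A_full))                 # True
```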

15.
The least-trimmed-squares (LTS) estimator is a well-known robust estimator that protects the estimate from outliers. Its high computational complexity, however, is a problem in practice. In this paper, we propose a random LTS algorithm whose low computational complexity can be calculated a priori as a function of the required error bound and the confidence interval. Moreover, as the number of data points goes to infinity, the algorithm becomes deterministic and converges to the true LTS in a probabilistic sense.
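A minimal randomized LTS sketch, assuming small random elemental subsets, a fixed number of trials, and a 75% trimming fraction; the paper's derivation of the required number of trials from the error bound and confidence level is not reproduced.

```python
import numpy as np

def random_lts(X, y, h_frac=0.75, n_trials=200, seed=0):
    """Randomized least-trimmed-squares: each trial fits OLS on a small
    random subset and is scored by the sum of its h smallest squared
    residuals over all data; the best-scoring coefficient vector wins."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    h = int(h_frac * n)
    best_beta, best_score = None, np.inf
    for _ in range(n_trials):
        idx = rng.choice(n, size=d + 1, replace=False)   # small random subset
        beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        score = np.sort((y - X @ beta) ** 2)[:h].sum()   # trimmed residual sum
        if score < best_score:
            best_beta, best_score = beta, score
    return best_beta

# Line with 25% gross outliers: LTS recovers the slope, plain OLS does not.
rng = np.random.default_rng(8)
x = rng.uniform(0, 10, 200)
y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(200)
y[:50] += 30.0                                           # outliers
X = np.column_stack([x, np.ones_like(x)])
print("LTS :", random_lts(X, y))
print("OLS :", np.linalg.lstsq(X, y, rcond=None)[0])
```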

16.
In the past decade, twin support vector machine (TWSVM) based classifiers have received considerable attention from the research community. In this paper, we analyze the performance of 8 TWSVM-based classifier variants, together with the 179 classifiers from 17 families evaluated in Fernandez-Delgado et al. (2014), on 90 University of California Irvine (UCI) benchmark datasets from various domains. The results are exhaustively analyzed using various performance criteria, and statistical testing is performed using the Friedman Rank (FRank). Our experiments show that two least squares TWSVM-based classifiers (ILSTSVM_m and RELS-TSVM_m) are the top two ranked methods among the 187 classifiers and significantly outperform all other classifiers according to the Friedman Rank. Overall, this paper bridges the benchmarking gap between the various TWSVM variants and classifiers from other families. Code is provided on the authors' homepages to reproduce the results and figures presented in this paper.

17.
Combining a reduction technique with an iterative strategy, we propose a recursive reduced least squares support vector regression. The proposed algorithm chooses the data that contribute most to the target function as support vectors, while considering all the constraints generated by the whole training set. It therefore requires fewer support vectors, whose number can be arbitrarily predefined, to construct a model with similar generalization performance. In comparison with other methods, our algorithm also achieves excellent parsimony. Numerical experiments on benchmark data sets confirm the validity and feasibility of the presented algorithm. In addition, the algorithm can be extended to classification.

18.
A Fast Sparse Least Squares Support Vector Regression Machine
赵永平, 孙健国. 《控制与决策》, 2008, 23(12): 1347-1352
Applying Jiao's method directly to the least squares support vector regression machine does not give satisfactory results. An improved Jiao method is therefore proposed that adopts a partial rather than complete discarding strategy, and it is applied to the least squares support vector regression machine. Tests on benchmark data sets show that the sparse least squares support vector regression machine based on the improved Jiao method has an advantage in both the number of support vectors and training time. Compared with other pruning algorithms, the improved Jiao method greatly shortens training time without loss of regression accuracy. The improved Jiao method is also applicable to classification problems.

19.
Non-Stationary Time Series Prediction Based on the Wavelet Transform and AR-LSSVM
A prediction scheme for non-stationary time series based on the dyadic orthogonal wavelet transform and an AR-LSSVM method is proposed. The Mallat algorithm is first used to decompose and reconstruct the non-stationary time series, separating its low-frequency and high-frequency components. An autoregressive model is then built for the high-frequency components, while the low-frequency components are fitted with a least squares support vector machine. Finally, the predictions of the individual models are summed to obtain the prediction of the original series. The results show that this approach fits the low-frequency information well while avoiding over-fitting of the high-frequency information.
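A hedged sketch of the hybrid scheme, assuming PyWavelets (pywt) for a one-level decomposition, an AR(4) model for the high-frequency part, lagged inputs for the LS-SVR on the low-frequency part, and a toy series; the paper's multi-level Mallat decomposition and model orders may differ.

```python
import numpy as np
import pywt

def rbf(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvr_fit(X, y, gamma=100.0, sigma=1.0):
    """LS-SVR trained by solving one linear system."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return lambda Xq: rbf(Xq, X, sigma) @ sol[1:] + sol[0]

def embed(s, p=4):
    """Lagged vectors [s(k-1), ..., s(k-p)] paired with targets s(k)."""
    X = np.column_stack([s[p - i - 1:len(s) - i - 1] for i in range(p)])
    return X, s[p:]

def ar_fit(s, p=4):
    """Least-squares AR(p) one-step-ahead predictor."""
    Z, target = embed(s, p)
    phi, *_ = np.linalg.lstsq(Z, target, rcond=None)
    return lambda hist: hist[-p:][::-1] @ phi

# Non-stationary toy series: trend + seasonal component + noise.
rng = np.random.default_rng(9)
n = 512
t = np.arange(n)
series = 0.01 * t + np.sin(2 * np.pi * t / 32) + 0.2 * rng.standard_normal(n)

# One-level Mallat split: approximation branch as the low-frequency part,
# remainder as the high-frequency part.
cA, cD = pywt.dwt(series, "db4")
low = pywt.idwt(cA, np.zeros_like(cD), "db4")[:n]
high = series - low

X_low, y_low = embed(low)
svr = lssvr_fit(X_low[:-1], y_low[:-1])          # hold the last point out
ar = ar_fit(high[:-1])

forecast = svr(X_low[-1:])[0] + ar(high[:-1])    # sum of the two sub-model forecasts
print("one-step forecast:", round(forecast, 3), " actual:", round(series[-1], 3))
```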

20.
Health trend prediction has become an effective way to ensure the safe operation of highly reliable systems, and online prediction is often necessary in real applications. To obtain good or acceptable online prediction accuracy together with short computing time, we propose a new adaptive online method based on least squares support vector regression (LS-SVR). The method adopts two approaches. First, we delete certain support vectors by judging the linear correlation among the samples, which increases the sparseness of the prediction model. This controls the loss of useful information in the sample data, improves the generalization capability of the prediction model, and reduces the prediction time. Second, we reduce the number of traditional LS-SVR parameters and establish a simplified prediction model, which reduces the calculation time during adaptive online training. Simulations and an application to an electric system provide preliminary evidence that the proposed method is an effective prediction approach with good prediction accuracy and low computing time.
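The deletion of support vectors "by judging the linear correlation among the samples" resembles an approximate-linear-dependence (ALD) test in feature space; the sketch below uses that interpretation to decide whether a new sample enters the support-vector dictionary. The ALD threshold and kernel width are assumptions, and the paper's reduced-parameter model is not reproduced.

```python
import numpy as np

def k_rbf(x, z, sigma=1.0):
    return np.exp(-np.sum((x - z) ** 2) / (2.0 * sigma ** 2))

class SparseDictionary:
    """Online dictionary kept sparse with an approximate-linear-dependence
    (ALD) test: a new sample is admitted only if its feature-space image is
    not (nearly) a linear combination of the images of stored samples."""

    def __init__(self, threshold=1e-2, sigma=1.0):
        self.nu, self.sigma = threshold, sigma
        self.D = []                 # stored samples
        self.Kinv = None            # inverse of the dictionary kernel matrix

    def consider(self, x):
        if not self.D:
            self.D.append(x)
            self.Kinv = np.array([[1.0 / k_rbf(x, x, self.sigma)]])
            return True
        k = np.array([k_rbf(d, x, self.sigma) for d in self.D])
        a = self.Kinv @ k
        delta = k_rbf(x, x, self.sigma) - k @ a       # ALD residual
        if delta <= self.nu:
            return False                              # nearly dependent: skip it
        # Block-update the inverse kernel matrix for the enlarged dictionary.
        n = len(self.D)
        Kinv_new = np.zeros((n + 1, n + 1))
        Kinv_new[:n, :n] = self.Kinv + np.outer(a, a) / delta
        Kinv_new[:n, n] = Kinv_new[n, :n] = -a / delta
        Kinv_new[n, n] = 1.0 / delta
        self.D.append(x)
        self.Kinv = Kinv_new
        return True

rng = np.random.default_rng(10)
dic = SparseDictionary()
kept = sum(dic.consider(rng.uniform(-2, 2, size=1)) for _ in range(500))
print(f"kept {kept} of 500 samples as support vectors")
```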
