Similar Documents
20 similar documents found (search time: 15 ms)
1.
A least squares support vector fuzzy regression model (LS-SVFR) is proposed to estimate uncertain and imprecise data by applying the fuzzy set principle to the weight vectors. The model requires only a set of linear equations to obtain the weight vector and the bias term, unlike existing support vector fuzzy regression models, which must solve a complicated quadratic programming problem. Moreover, the proposed LS-SVFR is a model-free method in which the underlying model function does not need to be predefined. Numerical examples and a fault detection application demonstrate the effectiveness and applicability of the proposed model.
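For context, the classical (unweighted) LS-SVM regression training step that such fuzzy-weighted variants build on reduces to a single linear system; a sketch in LaTeX, with notation assumed here (kernel matrix Ω, targets y, regularization parameter γ) rather than taken from the abstract:

```latex
% Standard LS-SVM regression: the dual variables \alpha and bias b follow from
% one linear system (classical formulation; the fuzzy LS-SVFR of the abstract
% additionally weights the samples).
\begin{bmatrix}
0 & \mathbf{1}_N^{\top} \\
\mathbf{1}_N & \Omega + \gamma^{-1} I_N
\end{bmatrix}
\begin{bmatrix} b \\ \boldsymbol{\alpha} \end{bmatrix}
=
\begin{bmatrix} 0 \\ \mathbf{y} \end{bmatrix},
\qquad
\Omega_{ij} = K(\mathbf{x}_i, \mathbf{x}_j),
\qquad
f(\mathbf{x}) = \sum_{i=1}^{N} \alpha_i K(\mathbf{x}, \mathbf{x}_i) + b .
```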

2.
Primal least squares twin support vector regression   (Total citations: 1; self-citations: 0; citations by others: 1)
The training algorithm of classical twin support vector regression (TSVR) can be attributed to the solution of a pair of quadratic programming problems (QPPs) with inequality constraints in the dual space. However, this solution is limited by time and memory constraints when dealing with large datasets. In this paper, we present a least squares version of TSVR in the primal space, termed primal least squares TSVR (PLSTSVR). By introducing the least squares method, the inequality constraints of TSVR are transformed into equality constraints. Furthermore, we directly solve the two QPPs with equality constraints in the primal space instead of the dual space; thus, we only need to solve two systems of linear equations instead of two QPPs. Experimental results on artificial and benchmark datasets show that PLSTSVR has accuracy comparable to TSVR but requires considerably less computational time. We further investigate its validity in predicting the opening price of stocks.
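To illustrate the general "linear system instead of QP" idea, here is a minimal NumPy sketch of a generic least squares SVM regressor solved via one linear system; it is not the paper's exact PLSTSVR formulation, which solves two such systems (one per bound function):

```python
# Illustrative sketch only: a generic LS-SVM regressor trained by solving one
# linear system instead of a QP (not the paper's exact PLSTSVR formulation).
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvm_fit(X, y, C=10.0, gamma=0.5):
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    # KKT system: [[0, 1^T], [1, K + I/C]] [b; alpha] = [0; y]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]                      # alpha, b

def lssvm_predict(X_train, alpha, b, X_test, gamma=0.5):
    return rbf_kernel(X_test, X_train, gamma) @ alpha + b

# toy usage
X = np.linspace(0, 2 * np.pi, 50).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * np.random.randn(50)
alpha, b = lssvm_fit(X, y)
y_hat = lssvm_predict(X, alpha, b, X)
```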

3.
Combining a reduction technique with an iterative strategy, we propose a recursive reduced least squares support vector regression. The proposed algorithm chooses the data that contribute most to the target function as support vectors, while considering all the constraints generated by the whole training set. It therefore needs fewer support vectors, the number of which can be arbitrarily predefined, to construct a model with similar generalization performance. In comparison with other methods, our algorithm also achieves excellent parsimony. Numerical experiments on benchmark datasets confirm the validity and feasibility of the presented algorithm. In addition, the algorithm can be extended to classification.

4.
In this paper, we propose a novel approach, termed regularized least squares fuzzy support vector regression, to handle financial time series forecasting. Two key problems in financial time series forecasting are noise and non-stationarity. Here, we assign a higher membership value to data samples that contain more relevant information, where relevance is related to recency in time. The approach requires only a single matrix inversion. For the linear case, the matrix order depends only on the dimension in which the data samples lie and is independent of the number of samples. The efficacy of the proposed algorithm is demonstrated on financial datasets available in the public domain.

5.
We extend LS-SVM to ordinal regression, which has wide applications in many domains, such as social science and information retrieval, where human-generated data play an important role. Most current SVM-based methods for ordinal regression ignore the distribution information reflected by the samples clustered around the center of each class, which degrades their performance because the classifiers depend only on the scattered samples near the class boundaries that induce a large margin. Our method takes the samples clustered around class centers into account and has competitive computational complexity. Moreover, it easily produces the optimal cut-points according to the prior class probabilities and hence may obtain more reasonable results when the prior class probabilities differ. Experiments on simulated and benchmark datasets, especially on real ordinal datasets, demonstrate the effectiveness of our method.

6.
To address the problem that the least squares support vector machine (LS-SVM) penalizes the errors of all samples equally and therefore has limited prediction accuracy, an AdaBoost-based least squares support vector regression algorithm is proposed. The algorithm uses multiple LS-SVMs and coordinates their outputs according to a learning rule; at the same time, based on the regression accuracy, it assigns an error penalty weight to each sample in every LS-SVM, so as to emphasize differences in penalties between samples and improve generalization performance. Experimental results show that the proposed algorithm improves the prediction accuracy of least squares support vector regression and optimizes the performance of the learner.
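As a hedged, illustrative analogue of boosting kernel least-squares regressors with per-sample weights (not the paper's exact weighting rule; KernelRidge stands in for an LS-SVM base learner, to which it is closely related but lacks a bias term):

```python
# Illustrative analogue: AdaBoost over a kernel least-squares base regressor,
# so that samples with large residuals receive more weight in later rounds.
import numpy as np
from sklearn.ensemble import AdaBoostRegressor
from sklearn.kernel_ridge import KernelRidge

rng = np.random.RandomState(0)
X = np.sort(rng.uniform(0, 6, 200)).reshape(-1, 1)
y = np.sin(X).ravel() + 0.2 * rng.randn(200)

# Each boosting round refits the kernel regressor with updated sample weights.
model = AdaBoostRegressor(KernelRidge(kernel="rbf", alpha=1.0, gamma=1.0),
                          n_estimators=20, learning_rate=0.5, random_state=0)
model.fit(X, y)
y_hat = model.predict(X)
```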

7.
A fast sparse least squares support vector regression machine   (Total citations: 4; self-citations: 0; citations by others: 4)
Zhao Yongping, Sun Jianguo. Control and Decision (控制与决策), 2008, 23(12): 1347-1352
Directly applying Jiao's method to the least squares support vector regression machine does not give satisfactory results. Therefore, an improved Jiao's method is proposed by adopting a strategy of not discarding samples completely, and it is applied to the least squares support vector regression machine. Results on benchmark datasets show that the sparse least squares support vector regression machine based on the improved Jiao's method gains advantages in both the number of support vectors and the training time. Compared with other pruning algorithms, the improved Jiao's method greatly shortens the training time without losing regression accuracy. In addition, the improved Jiao's method is also applicable to classification problems.

8.
An improved online least squares support vector machine regression algorithm   (Total citations: 4; self-citations: 0; citations by others: 4)
The standard least squares support vector machine suffers from slow training, heavy computation, and difficulty of online training when handling large-scale datasets. By combining a modified forgetting-factor rectangular-window method with the support vector machine, an online least squares support vector regression algorithm based on the improved forgetting-factor rectangular-window scheme is proposed, which both emphasizes the data in the current window and accounts for the influence of historical data. The proposed algorithm reduces the computational load and improves online identification accuracy. Simulation examples demonstrate the effectiveness of the method.
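A minimal sketch of the sliding-window idea, under the assumption of a plain refit at every step instead of the paper's recursive update; the forgetting factor lam and window width W are illustrative parameters:

```python
# Minimal sketch: online kernel least-squares regression over a rectangular
# window of the W most recent samples, with an exponential forgetting factor
# that down-weights older samples inside the window (plain refit per step).
import numpy as np

def rbf(A, B, g=1.0):
    return np.exp(-g * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))

def window_fit(Xw, yw, lam=0.9, C=10.0, g=1.0):
    n = len(yw)
    w = lam ** np.arange(n - 1, -1, -1)        # newest sample gets weight 1
    K = rbf(Xw, Xw, g)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.diag(1.0 / (C * w))     # weighted LS-SVM regularisation
    sol = np.linalg.solve(A, np.concatenate(([0.0], yw)))
    return sol[1:], sol[0]                      # alpha, b

# sliding-window usage on a toy time-varying signal
W = 40
t = np.arange(300)
y = np.sin(0.05 * t) + 0.001 * t + 0.05 * np.random.randn(300)
X = t.reshape(-1, 1).astype(float)
preds = []
for k in range(W, 299):
    alpha, b = window_fit(X[k - W:k], y[k - W:k])
    preds.append(rbf(X[k + 1:k + 2], X[k - W:k]) @ alpha + b)
```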

9.
Least squares support vector machine modeling based on vector base learning   (Total citations: 7; self-citations: 0; citations by others: 7)
To make the solution of the least squares support vector machine sparse, a sparsification algorithm called vector base learning is proposed. The concepts of base vector, base vector set, and vector space are first introduced, and the angle between a new sample vector and the vector space is analyzed to derive a criterion for judging whether that sample is a base vector. As new samples arrive, support vectors are identified online, so that the support vectors of the LS-SVM become sparse. To improve the real-time performance of dynamic LS-SVM modeling, a growing-memory recursive formula for vector base learning is further derived. Simulation analysis and an application in a water treatment plant verify the feasibility and effectiveness of the method.
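A closely related sparsification test used in kernel methods checks whether a new sample is (approximately) linearly dependent on the current base vectors in feature space; the sketch below is an analogue under that assumption, not necessarily the paper's exact angle criterion:

```python
# Sketch of an approximate-linear-dependence style test for deciding whether a
# new sample should become a base vector (an analogue of the angle criterion;
# the threshold nu is a hypothetical tuning parameter).
import numpy as np

def rbf(A, B, g=1.0):
    return np.exp(-g * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))

def is_new_base_vector(x_new, base, nu=1e-2, g=1.0):
    """Return True if phi(x_new) is poorly represented by span{phi(base_i)}."""
    if len(base) == 0:
        return True
    x_new = x_new.reshape(1, -1)
    K = rbf(base, base, g)                       # Gram matrix of base vectors
    k = rbf(base, x_new, g).ravel()              # cross-kernel vector
    coef = np.linalg.solve(K + 1e-8 * np.eye(len(base)), k)
    # squared distance of phi(x_new) to the span of the base vectors
    delta = rbf(x_new, x_new, g)[0, 0] - k @ coef
    return delta > nu

# online usage: grow the base vector set only when needed
rng = np.random.RandomState(1)
X = rng.randn(200, 2)
base = np.empty((0, 2))
for x in X:
    if is_new_base_vector(x, base):
        base = np.vstack([base, x])
```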

10.
Soft sensor modeling based on fuzzy least squares support vector machines   (Total citations: 10; self-citations: 0; citations by others: 10)
Zhang Ying, Su Hongye, Chu Jian. Control and Decision (控制与决策), 2005, 20(6): 621-624
The concept of fuzzy membership is introduced into the least squares support vector machine, and a fuzzy membership function model based on support vector data domain description is proposed. Samples in the input space are mapped into a high-dimensional feature space and then assigned different membership values according to how far they deviate from the data domain. This improves the noise robustness of the least squares support vector machine and is especially suitable when the input samples do not fully reveal the characteristics of the process. The proposed method is applied to soft sensor modeling of the light diesel oil solidification point of a fluid catalytic cracking fractionator; simulation results show that the fuzzy membership function model improves the prediction accuracy of the least squares support vector machine.
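For reference, fuzzy-membership LS-SVM formulations of this kind typically reduce to a weighted linear system of the following form, where v_i is the membership of sample i (generic weighted LS-SVM form assumed; the paper's data-domain-description membership function is not reproduced):

```latex
% Weighted LS-SVM: each sample i carries a fuzzy membership v_i that scales its
% effective regularisation (generic weighted form, not the paper's exact model).
\begin{bmatrix}
0 & \mathbf{1}_N^{\top} \\
\mathbf{1}_N & \Omega + \operatorname{diag}\!\left(\tfrac{1}{\gamma v_1},\dots,\tfrac{1}{\gamma v_N}\right)
\end{bmatrix}
\begin{bmatrix} b \\ \boldsymbol{\alpha} \end{bmatrix}
=
\begin{bmatrix} 0 \\ \mathbf{y} \end{bmatrix},
\qquad 0 < v_i \le 1 .
```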

11.
Gene classification based on least squares fuzzy support vector machines   (Total citations: 2; self-citations: 0; citations by others: 2)
With the emergence of large amounts of gene expression data, grouping the massive data into a relatively small number of clusters helps extract biological information valuable to physiology and medicine, and gene classification techniques can process and analyze such gene data well. A fuzzy least squares support vector machine method for gene classification is proposed, in which fuzzy membership values are set to change the contribution of each sample to the classification. The method considers not only the distance between a sample and the class center but also the relationships among samples, weakening the influence of noise and outlier samples on classification. Experiments on the Wisconsin breast cancer dataset and the Pima Indians diabetes dataset both achieve good results.

12.
This article presents an approach that can analyze the influence of tunable screws and perform computer-aided tuning of microwave filters. In the approach, a machine-learning model that reveals the influence of the tunable screws on the filter response is first developed by least squares support vector regression, using data from filter tuning experience. A computer-aided tuning procedure based on the model is then proposed, and the obtained adjustment amounts of the tunable screws can help an unskilled operator perform fast and accurate tuning. The approach is validated by experiments, and the results confirm its effectiveness. It is particularly suitable for the computer-aided tuning of volume-produced filters. © 2010 Wiley Periodicals, Inc. Int J RF and Microwave CAE, 2010.

13.
To address the lack of sparsity of the least squares support vector machine, a least squares support vector machine algorithm based on boundary samples is proposed. The algorithm uses the center distance ratio to select boundary samples with larger support as training samples, which reduces the number of support vectors and speeds up the algorithm. Experiments on four UCI datasets show that a sparse solution is obtained with almost no loss of accuracy, and the recognition speed of the algorithm is improved.
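A hedged sketch of selecting near-boundary samples by a center-distance ratio (the exact ratio definition and threshold in the paper may differ):

```python
# Sketch: keep samples whose distance to their own class centre is not much
# smaller than their distance to the other class centre, i.e. likely boundary
# samples (illustrative criterion; ratio_threshold is a hypothetical parameter).
import numpy as np

def boundary_samples(X, y, ratio_threshold=0.7):
    c_pos = X[y == 1].mean(axis=0)
    c_neg = X[y == -1].mean(axis=0)
    d_pos = np.linalg.norm(X - c_pos, axis=1)
    d_neg = np.linalg.norm(X - c_neg, axis=1)
    d_own = np.where(y == 1, d_pos, d_neg)
    d_other = np.where(y == 1, d_neg, d_pos)
    ratio = d_own / (d_other + 1e-12)
    keep = ratio > ratio_threshold          # near the boundary between classes
    return X[keep], y[keep]

# usage: train any LS-SVM/kernel classifier on the reduced set
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(100, 2) + 2, rng.randn(100, 2) - 2])
y = np.hstack([np.ones(100), -np.ones(100)])
X_red, y_red = boundary_samples(X, y)
```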

14.
When solving regression problems for time-varying plants, the traditional online least squares support vector machine suffers from limited model-tracking accuracy and insufficiently sparse support vectors. Combining an iterative strategy with a reduction technique, an online adaptive iterative reduced least squares support vector machine is proposed. The method considers the joint constraining effect of newly added samples and historical data on the current model, and selects the sample that contributes most to the objective function as the new support vector, thereby sparsifying the support vectors and improving online prediction accuracy and speed. Comparative simulations show that the method is feasible and effective; it achieves higher regression accuracy than traditional methods while requiring the fewest support vectors.

15.
The least squares support vector machine (LS-SVM) is a modified version of the SVM that uses equality constraints to replace the original convex quadratic programming problem. Consequently, the global minimizer is much easier to obtain in LS-SVM by solving a set of linear equations. LS-SVM has been shown to exhibit excellent classification performance in many applications. In this paper, wavelet-based image denoising using LS-SVM is proposed. First, the noisy image is decomposed into subbands of different frequencies and orientations using the wavelet transform. Second, the feature vector for a pixel in the noisy image is formed from the spatial regularity in the wavelet domain, and the LS-SVM model is obtained by training. The wavelet coefficients are then divided into two classes (noisy coefficients and noise-free ones) by the trained LS-SVM model. Finally, the noisy wavelet coefficients are denoised by a soft-thresholding method. Extensive experimental results demonstrate that our method obtains better performance in both subjective and objective evaluations than state-of-the-art denoising techniques. In particular, the proposed method preserves edges very well while removing noise.
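A simplified sketch of the wavelet-domain pipeline, assuming PyWavelets is available and replacing the LS-SVM coefficient-classification step with a single universal soft threshold:

```python
# Simplified sketch of wavelet-domain denoising (the LS-SVM step that labels
# coefficients as noisy/noise-free is replaced here by a universal threshold).
import numpy as np
import pywt

def wavelet_soft_denoise(img, wavelet="db4", level=3):
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    # estimate the noise level from the finest diagonal subband (robust MAD)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    t = sigma * np.sqrt(2.0 * np.log(img.size))         # universal threshold
    new_coeffs = [coeffs[0]]                             # keep approximation
    for detail in coeffs[1:]:
        new_coeffs.append(tuple(pywt.threshold(d, t, mode="soft") for d in detail))
    rec = pywt.waverec2(new_coeffs, wavelet)
    return rec[:img.shape[0], :img.shape[1]]             # crop to input size

# usage on a toy noisy image
clean = np.outer(np.sin(np.linspace(0, 3, 128)), np.cos(np.linspace(0, 3, 128)))
noisy = clean + 0.1 * np.random.randn(128, 128)
denoised = wavelet_soft_denoise(noisy)
```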

16.
17.
The problem of feature extraction for SVM is studied, and a combined KPLS-SVM regression modeling method is proposed. In this method, PLS feature extraction is performed in the high-dimensional feature space obtained by mapping the input space, followed by SVM regression; this not only preserves the good model performance of SVM but also combines the advantages of KPLS and SVM. Simulation and experimental results show that the KPLS-SVM modeling method is correct and effective, and the generalization performance of the SVM model built with this method is clearly better than that of an SVM without feature extraction.

18.
Given n training examples, the training of a least squares support vector machine (LS-SVM) or kernel ridge regression (KRR) corresponds to solving a linear system of dimension n. In cross-validating LS-SVM or KRR, the training examples are split into two distinct subsets a number of times (l), wherein a subset of m examples is used for validation and the other subset of (n-m) examples is used for training the classifier. In this case, l linear systems of dimension (n-m) need to be solved. We propose a novel method for cross-validation (CV) of LS-SVM or KRR in which, instead of solving l linear systems of dimension (n-m), we compute the inverse of an n-dimensional square matrix and solve l linear systems of dimension m, thereby reducing the complexity when l is large and/or m is small. Typical multi-fold, leave-one-out (LOO-CV), and leave-many-out cross-validations are considered. For the five-fold CV used in practice, with five repetitions over randomly drawn slices, the proposed algorithm is approximately four times as efficient as the naive implementation. For large datasets, we propose to evaluate the CV approximately by applying the well-known incomplete Cholesky decomposition technique; the complexity of these approximate algorithms scales linearly with the data size if the rank of the associated kernel matrix is much smaller than n. Simulations are provided to demonstrate the performance of LS-SVM and the efficiency of the proposed algorithm, with comparisons to the naive and some existing implementations of multi-fold and LOO-CV.
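To illustrate why one n-dimensional inverse can replace repeated refitting, the leave-one-out residuals of kernel ridge regression (without a bias term) follow from the hat matrix in closed form; a sketch of this classical identity, not the paper's general block algorithm for l-fold splits:

```python
# Sketch: leave-one-out residuals of KRR via the hat matrix H = K (K + lam I)^-1,
# computed from a single n-dimensional inverse without refitting.
import numpy as np

def krr_loo_residuals(K, y, lam=1.0):
    n = len(y)
    inv = np.linalg.inv(K + lam * np.eye(n))   # one n-dimensional inverse
    H = K @ inv                                # hat (smoother) matrix
    residuals = y - H @ y                      # in-sample residuals
    return residuals / (1.0 - np.diag(H))      # leave-one-out residuals

# usage with an RBF kernel on toy data
rng = np.random.RandomState(0)
X = rng.uniform(0, 5, (80, 1))
y = np.sin(X).ravel() + 0.1 * rng.randn(80)
K = np.exp(-0.5 * (X - X.T) ** 2)
loo_mse = np.mean(krr_loo_residuals(K, y) ** 2)
```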

19.

In this paper, we formulate a fuzzy least squares version of the recently proposed clustering method twin support vector clustering (TWSVC). Here, the fuzzy membership value of each data pattern to each cluster is optimized and is then used to assign each data pattern to one cluster or another. The formulation leads to finding k cluster-center planes by solving a modified primal problem of TWSVC, instead of the dual problem usually solved. We show that the solution of the proposed algorithm reduces to solving a series of systems of linear equations, as opposed to solving a series of quadratic programming problems along with systems of linear equations as in TWSVC. Experimental results on several publicly available datasets show that the proposed fuzzy least squares twin support vector clustering (F-LS-TWSVC) achieves clustering accuracy comparable to that of TWSVC with comparatively less computational time. Further, we give an application of F-LS-TWSVC to the segmentation of color images.


20.
Pan Yuxiong, Ren Zhang, Li Qingdong. Control and Decision (控制与决策), 2014, 29(12): 2297-2300
To achieve real-time, high-accuracy prediction of the changes in the operating parameters of a turbofan engine, a time-series prediction algorithm based on a dynamic Bayesian least squares support vector machine (LS-SVM) is proposed. The algorithm uses the Bayesian evidence framework to infer the initial model parameters of the LS-SVM, and then dynamically adjusts the LS-SVM parameters through an iterative learning procedure that adds and removes samples. The friction torque time series of a turbofan engine is predicted dynamically, and the results are compared with those of a dynamic LS-SVM model; the dynamic Bayesian LS-SVM achieves better prediction accuracy.
