20 similar documents found (search time: 31 ms)
1.
As a new sparse kernel modeling method, support vector regression (SVR) has been regarded as the state-of-the-art technique for regression and approximation. In [V.N. Vapnik, The Nature of Statistical Learning Theory, second ed., Springer-Verlag, 2000], Vapnik developed the ε-insensitive loss function for support vector regression as a trade-off between the robust loss function of Huber and one that enables sparsity within the support vectors. The support vector kernel expansion provides a potential avenue for representing nonlinear dynamical systems and underpinning advanced analysis. However, the standard quadratic programming support vector regression (QP-SVR) is often computationally expensive to implement, and sufficient model sparsity cannot be guaranteed. In an attempt to mitigate these drawbacks, this article focuses on the application of soft-constrained linear programming support vector regression (LP-SVR) with a hybrid kernel to nonlinear black-box system identification. An innovative non-Mercer hybrid kernel is explored by leveraging the flexibility of LP-SVR in choosing kernel functions. The simulation results demonstrate the ability to use more general kernel functions and the inherent performance advantage of LP-SVR over QP-SVR in terms of model sparsity and computational efficiency.
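The ε-insensitive trade-off described above (robustness versus sparsity of the support vector set) can be illustrated with a minimal sketch. This uses scikit-learn's QP-based SVR, not the paper's soft-constrained LP-SVR; the data and parameter values are invented for illustration.

```python
# Minimal ε-SVR sketch: a wider insensitive tube (larger epsilon) yields
# fewer support vectors (a sparser model) but a coarser fit -- the
# trade-off Vapnik's ε-insensitive loss is designed to control.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, 200)).reshape(-1, 1)
y = np.sinc(X).ravel() + rng.normal(0, 0.05, 200)

for eps in (0.01, 0.1, 0.3):
    model = SVR(kernel="rbf", C=10.0, epsilon=eps).fit(X, y)
    # model.support_ holds the indices of the support vectors
    print(f"epsilon={eps}: {len(model.support_)} support vectors")
```

Points falling strictly inside the ε-tube incur no loss and do not become support vectors, which is where the sparsity comes from.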
2.
3.
A new SAR image filtering method is proposed based on support vector regression theory and the wavelet support vector kernel. Support vector regression is first analyzed, and approximation experiments on complex signals verify its feasibility and suitability for image filtering. The SAR image is then treated as a two-dimensional continuous signal, and the wavelet support vector kernel, which approximates complex signals better, is applied to SAR image filtering; the wavelet kernel is constructed from the Morlet wavelet. Experimental results show that the proposed method effectively reduces SAR image noise while preserving edges better than traditional methods.
4.
Wavelet support vector machine (Cited by: 28; self-citations: 0; by others: 28)
Li Zhang Weida Zhou Licheng Jiao 《IEEE transactions on systems, man, and cybernetics. Part B, Cybernetics》2004,34(1):34-39
An admissible support vector (SV) kernel (the wavelet kernel), by which we can construct a wavelet support vector machine (SVM), is presented. The wavelet kernel is a kind of multidimensional wavelet function that can approximate arbitrary nonlinear functions. The existence of wavelet kernels is proven by theoretical analysis. Computer simulations show the feasibility and validity of wavelet support vector machines (WSVMs) in regression and pattern recognition.
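The wavelet kernel described above can be sketched as a product of translated mother wavelets over the input dimensions. This assumes the commonly cited mother wavelet h(u) = cos(1.75u)·exp(−u²/2) and a single shared dilation parameter a, which is a simplification of the paper's multidimensional construction.

```python
# Sketch of a translation-invariant wavelet SV kernel:
#   K(x, x') = prod_i h((x_i - x'_i) / a),  h(u) = cos(1.75 u) * exp(-u^2 / 2)
# Passed to scikit-learn's SVR as a callable kernel returning the Gram matrix.
import numpy as np
from sklearn.svm import SVR

def wavelet_kernel(X, Y, a=1.0):
    # Pairwise scaled differences, shape (n_X, n_Y, n_features)
    D = (X[:, None, :] - Y[None, :, :]) / a
    return np.prod(np.cos(1.75 * D) * np.exp(-D**2 / 2.0), axis=-1)

rng = np.random.default_rng(1)
X = np.sort(rng.uniform(-4, 4, 150)).reshape(-1, 1)
y = np.sin(X).ravel() * np.exp(-0.1 * X.ravel() ** 2)

wsvr = SVR(kernel=wavelet_kernel, C=100.0, epsilon=0.01).fit(X, y)
print("train MSE:", np.mean((wsvr.predict(X) - y) ** 2))
```

The oscillatory factor cos(1.75u) is what distinguishes this kernel from a plain Gaussian and gives it its multiresolution character.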
5.
6.
To address the difficulty that existing traditional image denoising methods have in obtaining sharp image edges, a new method is proposed that builds an image denoising filter using ε-SVR. By introducing the ε-insensitive loss function, ε-support vector regression achieves robust regression with a sparse estimate, retaining all the advantages of SVMs. The ε-SVR algorithm and its application to image denoising are analyzed; images are filtered with ε-SVR and compared against common filters such as minimum, mean, and Wiener filtering. The filtering performance of various SVM kernel functions on different noise types, and of Multinomial kernels of different orders, is also compared. Experimental results show that ε-SVR removes noise effectively, yielding a high signal-to-noise ratio and clear images while retaining good sparsity.
7.
8.
Based on the conditions a support vector machine kernel must satisfy, the reproducing kernel of the Sobolev Hilbert space is modified to give a new SVM kernel, and an improved least squares reproducing-kernel support vector regression model is proposed. The model has fewer parameters, and simulation results show that using the improved reproducing kernel in least squares SVM is feasible: the modified kernel retains the nonlinear mapping property of kernel functions while inheriting the reproducing kernel's capacity for progressively refined approximation of nonlinear functions, producing smoother regression results than ordinary kernels.
9.
Bayesian support vector regression using a unified loss function (Cited by: 4; self-citations: 0; by others: 4)
In this paper, we use a unified loss function, called the soft insensitive loss function, for Bayesian support vector regression. We follow standard Gaussian processes for regression to set up the Bayesian framework, in which the unified loss function is used in the likelihood evaluation. Under this framework, the maximum a posteriori estimate of the function values corresponds to the solution of an extended support vector regression problem. The overall approach has the merits of support vector regression, such as convex quadratic programming and sparsity in the solution representation. It also has the advantages of Bayesian methods for model adaptation and error bars on its predictions. Experimental results on simulated and real-world data sets indicate that the approach works well even on large data sets.
10.
In some nonlinear dynamic systems, the state variables function can usually be separated from the control variables function, which complicates the identification of such systems. To address this problem, an improved least squares support vector regression (LSSVR) model with multiple kernels is proposed and applied to nonlinear separable system identification. This method exploits the excellent nonlinear mapping ability of the Morlet wavelet kernel function and combines the state and control variable information into a single kernel matrix. Using the composite wavelet kernel, the LSSVR comprises two nonlinear functions, one of the state variables and one of the control variables; in this way the regression function gains better nonlinear mapping ability and can simulate almost any curve in the space of square-integrable functions. The two functions are then used to identify the separable nonlinear dynamic system. Simulation results show that the multiple-kernel LSSVR method achieves much higher identification accuracy than the single-kernel method, and that the Morlet wavelet kernel is more efficient than the other kernels.
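A minimal sketch of the composite-kernel idea above, under two stated assumptions: the state and control variables occupy separate columns of the sample matrix, and the two Morlet-wavelet Gram matrices are simply summed (the paper's exact construction may differ), so the resulting regressor decomposes into f(state) + g(control).

```python
# Composite wavelet kernel sketch: one Morlet wavelet kernel over the
# state columns plus one over the control columns. Summing kernels keeps
# the Gram matrix admissible and makes the model additive in the two
# variable groups.
import numpy as np

def morlet_kernel(X, Y, a=1.0):
    D = (X[:, None, :] - Y[None, :, :]) / a
    return np.prod(np.cos(1.75 * D) * np.exp(-D**2 / 2.0), axis=-1)

def composite_kernel(Z1, Z2, n_state):
    # First n_state columns are state variables, the rest are controls.
    return (morlet_kernel(Z1[:, :n_state], Z2[:, :n_state])
            + morlet_kernel(Z1[:, n_state:], Z2[:, n_state:]))

rng = np.random.default_rng(3)
Z = rng.normal(size=(6, 4))          # 2 state columns + 2 control columns
K = composite_kernel(Z, Z, n_state=2)
print(K.shape)                       # symmetric (6, 6) Gram matrix
```

Since each Morlet factor equals 1 at zero lag, the diagonal of this composite Gram matrix is identically 2.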
11.
戴宏亮 (Dai Hongliang) 《计算机工程与应用》2010,46(7):15-17
To handle the local, stochastic, and fuzzy characteristics of gas emission, a new wavelet support vector kernel is constructed to build a wavelet support vector regression model, and a novel intelligent genetic algorithm is used to select the model parameters. Experimental results show that the proposed wavelet support vector regression model predicts gas emission with higher accuracy and speed than the standard and intelligent support vector regression models.
12.
Enforcing sparsity constraints has been shown to be an effective and efficient way to obtain state-of-the-art results in regression and classification tasks. Unlike the support vector machine (SVM), the relevance vector machine (RVM) explicitly encodes the criterion of model sparsity as a prior over the model weights. However, the lack of an explicit prior structure over the weight variances means that the degree of sparsity is to a large extent controlled by the choice of kernel (and kernel parameters). This can lead to severe overfitting or oversmoothing, possibly even both at the same time (e.g. for the multiscale Doppler data). We detail an efficient scheme to control sparsity in Bayesian regression by incorporating a flexible noise-dependent smoothness prior into the RVM. We present an empirical evaluation of the effects of the choice of prior structure on a selection of popular data sets and elucidate the link between Bayesian wavelet shrinkage and RVM regression. Our model encompasses the original RVM as a special case, but our empirical results show that we can surpass RVM performance in terms of goodness of fit and achieved sparsity as well as computational performance in many cases. The code is freely available.
Action Editor: Dale Schuurmans.
13.
Network traffic prediction based on wavelet-kernel LS-SVM (Cited by: 3; self-citations: 0; by others: 3)
Network traffic prediction is important for large-scale network management, planning, and design. The support vector machine is a machine learning algorithm developed in recent years for highly nonlinear classification and regression problems. A network traffic prediction method based on the wavelet-kernel least squares support vector machine is presented, in which the multiresolution property of the wavelet kernel improves the nonlinear modeling capability of the SVM. By learning from measured network traffic data, future traffic is predicted. Experimental results show good prediction performance.
14.
15.
The ε-support vector regression algorithm and its applications (Cited by: 2; self-citations: 0; by others: 2)
To address the difficulty that existing traditional image denoising methods have in obtaining sharp image edges, a new method is proposed that builds an image denoising filter using ε-SVR. By introducing the ε-insensitive loss function, ε-support vector regression achieves robust regression with a sparse estimate, retaining all the advantages of SVMs. The ε-SVR algorithm and its application to image denoising are analyzed; images are filtered with ε-SVR and compared against common filters such as minimum, mean, and Wiener filtering. The filtering performance of various SVM kernel functions on different noise types, and of Multinomial kernels of different orders, is also compared. Experimental results show that ε-SVR removes noise effectively, yielding a high signal-to-noise ratio and clear images while retaining good sparsity.
16.
A water quality system is an open, complex, nonlinear dynamical system with time-varying complexity. Although research on water quality prediction methods has produced some results, challenges in prediction accuracy and computational complexity remain. This paper therefore proposes a water quality prediction algorithm based on least squares support vector regression. The support vector machine is a common classification model in machine learning that uses a kernel function to map nonlinear data from a low-dimensional to a high-dimensional space, where linear classification and regression are performed. Least squares support vector regression (LS-SVR) lets all samples participate in the regression fit, so the loss function no longer depends only on a small subset of support vector samples; all samples contribute to learning and correcting the error, improving prediction accuracy. The algorithm also transforms the standard SVR problem, a convex quadratic program with inequality constraints, into a system of linear equations, increasing computational speed and solving the water quality prediction problem with its complex nonlinear characteristics.
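The conversion from a constrained QP to a linear system mentioned in this abstract can be sketched as follows. The RBF kernel and the γ and σ values are illustrative assumptions, not the paper's settings; the linear system is the standard LS-SVR dual, in which every sample receives a coefficient α rather than only the support vectors.

```python
# LS-SVR sketch: with kernel matrix K and regularization gamma, the dual
# reduces to one (n+1) x (n+1) linear system instead of an
# inequality-constrained QP:
#   [ 0   1^T          ] [b]     [0]
#   [ 1   K + I/gamma  ] [alpha] [y]
import numpy as np

def lssvr_fit(X, y, gamma=10.0, sigma=1.0):
    n = len(X)
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-D2 / (2 * sigma**2))          # RBF kernel matrix
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma         # ridge term from the loss
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0], X, sigma          # alpha, b, train data, sigma

def lssvr_predict(model, Xq):
    alpha, b, X, sigma = model
    D2 = np.sum((Xq[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-D2 / (2 * sigma**2)) @ alpha + b

rng = np.random.default_rng(2)
X = np.sort(rng.uniform(0, 6, 80)).reshape(-1, 1)
y = np.sin(X).ravel() + rng.normal(0, 0.05, 80)
model = lssvr_fit(X, y)
print("train RMSE:", np.sqrt(np.mean((lssvr_predict(model, X) - y) ** 2)))
```

The price of this speed-up is the loss of sparsity: every α is generally nonzero, which is exactly the "all samples participate" behavior the abstract describes.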
17.
Qi Wu 《Expert systems with applications》2011,38(12):14478-14489
This paper presents a new version of the fuzzy wavelet support vector regression machine to forecast nonlinear fuzzy systems with multi-dimensional input variables. The input and output variables of the proposed model are described as triangular fuzzy numbers. By integrating triangular fuzzy theory, wavelet analysis theory, and the ν-support vector regression machine, and designing a polynomial slack variable, the triangular fuzzy robust wavelet ν-support vector regression machine (TFRWν-SVM) is proposed. Particle swarm optimization (PSO) is applied to seek the optimal parameters of TFRWν-SVM, and a forecasting method based on TFRWν-SVM and PSO is put forward. The results of its application to sales system forecasting confirm the feasibility and validity of the method. Compared with the traditional model, TFRWν-SVM requires fewer samples and has better forecasting precision.
18.
冼广铭 (Xian Guangming) 《计算机工程与应用》2008,44(18):36-38
To address the problem that currently used SVM kernel functions cannot approximate arbitrary target functions in regression, an LS-WSVM model is proposed based on the SVM kernel method and wavelet frame theory. The model uses a new wavelet-based SVM kernel within LS-SVM. Experimental results show that, compared with standard SVM and LS-SVM under identical conditions, LS-WSVM achieves excellent approximation performance in function regression with a smoother fit.
19.
Least squares Littlewood-Paley wavelet support vector machine (Cited by: 11; self-citations: 0; by others: 11)
Based on wavelet decomposition theory and the admissibility conditions for support vector machine kernels, a multidimensional admissible support vector kernel, the Littlewood-Paley wavelet kernel, is proposed. This kernel is translation-orthogonal and, by virtue of its orthogonality, can approximate any curve in the space of square-integrable functions, improving the generalization ability of the support vector machine. Using the Littlewood-Paley wavelet as the support vector kernel, a least squares Littlewood-Paley wavelet support vector machine (LS-LPWSVM) is proposed. Experimental results show that, under identical conditions, LS-LPWSVM achieves higher learning accuracy than the least squares support vector machine and is thus better suited to learning complex functions.
20.
Yuh-Jye Lee Wen-Feng Hsieh Chien-Ming Huang 《Knowledge and Data Engineering, IEEE Transactions on》2005,17(5):678-685
A new smoothing strategy for solving ε-support vector regression (ε-SVR), which tolerates a small error in fitting a given data set linearly or nonlinearly, is proposed in this paper. Conventionally, ε-SVR is formulated as a constrained minimization problem, namely, a convex quadratic programming problem. We apply the smoothing techniques that have been used for solving the support vector machine for classification to replace the ε-insensitive loss function by an accurate smooth approximation. This allows us to solve ε-SVR directly as an unconstrained minimization problem. We call this reformulated problem ε-smooth support vector regression (ε-SSVR). We also prescribe a Newton-Armijo algorithm, shown to converge globally and quadratically, to solve our ε-SSVR. In order to handle the case of nonlinear regression with a massive data set, we also introduce the reduced kernel technique to avoid the computational difficulties of dealing with a huge and fully dense kernel matrix. Numerical results and comparisons are given to demonstrate the effectiveness and speed of the algorithm.
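The smoothing idea can be sketched by replacing the nondifferentiable plus function (t)₊ = max(t, 0), out of which the ε-insensitive loss is built, with the standard smooth approximation p(t, β) = t + (1/β)·log(1 + e^(−βt)). Note the paper actually works with a squared version of the smoothed loss; the unsquared sketch below only illustrates how the approximation tightens as β grows.

```python
# As beta -> infinity, p(t, beta) converges uniformly to max(t, 0)
# (the gap at the kink is log(2)/beta), so the smoothed ε-insensitive
# loss is differentiable and amenable to unconstrained Newton methods.
import numpy as np

def plus(t):
    return np.maximum(t, 0.0)

def smooth_plus(t, beta=10.0):
    # logaddexp(0, -beta*t) is a numerically stable log(1 + exp(-beta*t))
    return t + np.logaddexp(0.0, -beta * t) / beta

def eps_insensitive(r, eps=0.1):
    # |r|_eps = max(|r| - eps, 0) = (r - eps)_+ + (-r - eps)_+
    return plus(r - eps) + plus(-r - eps)

def smooth_eps_insensitive(r, eps=0.1, beta=10.0):
    return smooth_plus(r - eps, beta) + smooth_plus(-r - eps, beta)

r = np.linspace(-1, 1, 201)
for beta in (10.0, 100.0):
    gap = np.max(np.abs(smooth_eps_insensitive(r, beta=beta) - eps_insensitive(r)))
    print(f"beta={beta}: max approximation gap {gap:.4f}")
```

The shrinking gap is what licenses solving the reformulated problem as an unconstrained minimization while staying close to the original ε-SVR objective.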