Similar Documents: 20 results found.
1.
Choosing optimal parameters for support vector regression (SVR) is an important step in SVR design, which strongly affects the performance of SVR. In this paper, based on an analysis of the influence of SVR parameters on the generalization error, a new two-step approach is proposed for selecting SVR parameters. First, the kernel function and SVM parameters are optimized roughly through a genetic algorithm; then the kernel parameter is finely adjusted by a local linear search. This approach has been successfully applied to a prediction model of the sulfur content in hot metal. The experimental results show that the proposed approach yields better generalization performance of SVR than other methods.
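As a rough illustration of this two-step idea, the sketch below runs a coarse evolutionary search over the SVR parameters and then a local one-dimensional refinement of the kernel parameter. The synthetic data, the parameter ranges, and the use of SciPy's differential evolution as a stand-in for the paper's genetic algorithm are assumptions made for the example, not the authors' implementation.

# Step 1: coarse evolutionary search over (C, gamma, epsilon) on a log scale;
# Step 2: fine local search over the kernel parameter gamma only.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

X, y = make_regression(n_samples=200, n_features=5, noise=0.1, random_state=0)

def cv_error(params):
    log_c, log_gamma, log_eps = params
    model = SVR(kernel="rbf", C=10**log_c, gamma=10**log_gamma, epsilon=10**log_eps)
    return -cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error").mean()

coarse = differential_evolution(cv_error, bounds=[(-2, 3), (-4, 1), (-4, 0)],
                                maxiter=20, seed=0)
log_c, log_gamma, log_eps = coarse.x

best_gamma, best_err = log_gamma, coarse.fun
for g in np.linspace(log_gamma - 0.5, log_gamma + 0.5, 21):
    err = cv_error([log_c, g, log_eps])
    if err < best_err:
        best_gamma, best_err = g, err

print("C=%.3g gamma=%.3g epsilon=%.3g CV-MSE=%.4f"
      % (10**log_c, 10**best_gamma, 10**log_eps, best_err))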

2.
The kernel function is an important component of the support vector regression machine, and each kernel function has its own strengths and weaknesses. Based on the principles for selecting the relevant parameters of the support vector regression model, this paper presents a support vector machine with a mixed kernel function and, building on a multi-ant-colony algorithm combined with grid search, proposes a new method for optimizing the parameters of this mixed-kernel support vector regression machine. Taking minimization of the cross-validation error as its objective, the method optimizes five parameters, including the mixing ratio and the parameters of each kernel function. Simulation results show that, compared with a genetic algorithm, the method performs well in parameter optimization and the resulting prediction model has higher accuracy.
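The abstract does not specify which kernels are combined; the sketch below assumes a convex combination of an RBF and a polynomial kernel, a common choice for mixed kernels, and shows how such a kernel can be passed to an SVR. The listed parameters are one plausible reading of the "five parameters"; all values are illustrative.

import numpy as np
from sklearn.metrics.pairwise import polynomial_kernel, rbf_kernel
from sklearn.svm import SVR

def make_mixed_kernel(mix, gamma, degree, coef0):
    """Return K(X, Y) = mix * K_rbf(X, Y) + (1 - mix) * K_poly(X, Y)."""
    def kernel(X, Y):
        return (mix * rbf_kernel(X, Y, gamma=gamma)
                + (1.0 - mix) * polynomial_kernel(X, Y, degree=degree, coef0=coef0))
    return kernel

# The five tuned quantities could be, e.g., the mixing ratio, gamma, degree, coef0
# and the SVR penalty C; the values used here are placeholders.
model = SVR(kernel=make_mixed_kernel(mix=0.7, gamma=0.1, degree=2, coef0=1.0), C=10.0)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)
model.fit(X, y)
print("train R^2:", model.score(X, y))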

3.
陶剑文 《计算机工程》2007,33(15):207-208,
To improve the learning ability and generalization performance of the support vector regression algorithm, a hybrid selection algorithm for optimizing the support vector regression parameters is proposed. Based on information such as the size of the training set and the noise level, the value ranges of the support vector regression parameters are determined, and a real-coded immune genetic algorithm is used to search for the optimal parameter values. The hybrid selection algorithm achieves high accuracy and efficiency, and does not need to take the model complexity or the number of variable dimensions into account when selecting the parameters. Simulation results show that the algorithm is an effective method for selecting support vector regression parameters and performs well when applied to function approximation problems.

4.
王强  陈英武  邢立宁 《计算机工程》2007,33(15):40-42,6
To improve the learning ability and generalization performance of the support vector regression algorithm, a hybrid selection algorithm for optimizing the support vector regression parameters is proposed. Based on information such as the size of the training set and the noise level, the value ranges of the support vector regression parameters are determined, and a real-coded immune genetic algorithm is used to search for the optimal parameter values. The hybrid selection algorithm achieves high accuracy and efficiency, and does not need to take the model complexity or the number of variable dimensions into account when selecting the parameters. Simulation results show that the algorithm is an effective method for selecting support vector regression parameters and performs well when applied to function approximation problems.

5.
王玲  穆志纯  郭辉 《自动化学报》2005,31(4):612-619
A new approach is proposed to model nonlinear dynamic systems by combining the self-organizing feature map (SOM) with support vector regression (SVR) based on an expert system. The whole system has a two-stage neural network architecture. In the first stage, SOM is used as a clustering algorithm to partition the whole input space into several disjoint regions. A hierarchical architecture is adopted in the partitioning to avoid having to predetermine the number of regions. In the second stage, multiple SVRs, also called SVR experts, are fitted so that each best matches its partitioned region through the choice of kernel function, which also simplifies the configuration and tuning of the SVRs. Finally, the new approach is applied to time-series prediction problems based on the Mackey-Glass differential equation and the Santa Fe data; the results show that the SVR experts achieve an effective improvement in generalization performance in comparison with a single SVR model.
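A minimal sketch of the "SVR experts" idea follows: partition the input space with a clustering step, train one SVR per region, and route each test point to the expert of its region. KMeans is used here as a simple stand-in for the SOM partitioning, and the data and parameter values are illustrative.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(300, 2))
y = np.sin(X[:, 0]) * np.cos(X[:, 1]) + 0.05 * rng.normal(size=300)

# Stage 1: partition the input space into disjoint regions.
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

# Stage 2: train one SVR expert per region.
experts = {c: SVR(kernel="rbf", C=10.0, gamma=0.5).fit(X[km.labels_ == c], y[km.labels_ == c])
           for c in range(km.n_clusters)}

def predict(X_new):
    labels = km.predict(X_new)
    return np.array([experts[c].predict(x.reshape(1, -1))[0] for c, x in zip(labels, X_new)])

print(predict(rng.uniform(-3, 3, size=(5, 2))))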

6.
Since the fitting accuracy and generalization ability of the support vector regression machine (SVR) depend on the choice of its parameters, an SVR parameter selection method based on an improved FS (free search) algorithm is proposed and applied to traffic flow prediction. FS is a recent evolutionary computation method; here a catastrophe strategy based on relative density is introduced to improve the mechanism by which FS selects the initial positions of individuals, enlarging the search space and strengthening the global search ability. Rolling prediction experiments on measured traffic flow data show that the method is effective and feasible for optimizing SVR parameters; compared with empirical estimation and a genetic algorithm, the resulting SVR model has better generalization performance and prediction accuracy.

7.
This paper proposes an adaptive online prediction method for concentrate grade in mineral processing based on support vector regression, which improves prediction accuracy through a new mixed kernel function and an online parameter updating mechanism. After analyzing the characteristics of the classical kernel functions, a mixed kernel function is constructed to balance the learning ability and generalization ability of the model. To improve the adaptability of the method to the dynamic mineral processing production process, an adaptive parameter adjustment mechanism is introduced according to the influence of new operating-condition samples on the statistical characteristics of the existing sample set, and an online iterative learning mechanism is used to update the model, which increases its computing speed. Experiments on production data from a mineral processing plant show that the proposed method has clear advantages over existing methods in both computation time and prediction accuracy, and is suitable for the dynamically changing mineral processing production process.

8.
This paper addresses the problem of automatically tuning multiple kernel parameters for the kernel-based linear discriminant analysis (LDA) method. The kernel approach has been proposed to solve face recognition problems under complex distributions by mapping the input space to a high-dimensional feature space. Recognition algorithms such as kernel principal component analysis, the kernel Fisher discriminant, generalized discriminant analysis, and kernel direct LDA have been developed in the last five years. Experimental results show that the kernel-based method is a good and feasible approach to tackling pose and illumination variations. One of the crucial factors in the kernel approach is the selection of kernel parameters, which strongly affects the generalization capability and stability of kernel-based learning methods. In view of this, we propose an eigenvalue-stability-bounded margin maximization (ESBMM) algorithm to automatically tune the multiple parameters of the Gaussian radial basis function kernel for the kernel subspace LDA (KSLDA) method, which builds on our previously developed subspace LDA method. The ESBMM algorithm improves the generalization capability of the kernel-based LDA method by maximizing the margin criterion while maintaining the eigenvalue stability of the kernel-based LDA method. An in-depth investigation of the generalization performance along the pose and illumination dimensions is performed using the YaleB and CMU PIE databases. The FERET database is also used for benchmark evaluation. Compared with existing PCA-based and LDA-based methods, our proposed KSLDA method, with the ESBMM kernel parameter estimation algorithm, gives superior performance.

9.
Applying algorithms designed for the SVC model to the SVR model generally requires the kernel function of the SVR model to be positive definite and to satisfy Mercer's condition. In practice, however, when the geometric framework is used to convert an SVC model into the corresponding SVR model, the positive definiteness of the resulting kernel usually cannot be guaranteed, so the SVR model is no longer a convex program and cannot be solved directly. To address this problem, this paper proposes an extended sequential minimal optimization (SMO) method for solving SVR models with indefinite kernels, designs the working-set selection rule of the algorithm, and solves the problem of how to choose the current optimal values of the working-set variables. Since the algorithm does not require the kernel to be positive definite, it widens the range of kernels that can be used in SVR models. Experiments show that the algorithm achieves good generalization performance and regression accuracy for SVR models with either positive definite or indefinite kernels, and has both theoretical and practical value.

10.
An Incremental Learning Strategy for Support Vector Regression
Support vector machine (SVM) provides good generalization performance but suffers from a large amount of computation. This paper presents an incremental learning strategy for support vector regression (SVR). The new method first formulates an explicit expression for ||W||^2 by constructing an orthogonal basis in feature space together with a basic Hilbert space identity, and then finds the regression function by minimizing this expression rather than solving a convex programming problem. In particular, we combine the minimization of ||W||^2 with kernel selection, which can lead to good generalization performance. The presented method not only provides a novel way for incremental SVR learning, but also opens an opportunity for SVR model selection. An artificial data set, a benchmark data set, and a real-world data set are employed to evaluate the method. The simulations support the feasibility and effectiveness of the proposed approach.

11.
Regularized least-squares classification (RLSC) is a regularization network based on a quadratic loss function, and its generalization ability depends on the model parameters; the traditional model selection method is a time-consuming grid search over these parameters. To address this, a novel model selection method called AlignLoo is proposed. Its key idea is to optimize the kernel parameters and the hyperparameter separately: the kernel-target alignment is maximized to select the optimal kernel parameters, and a bound on the leave-one-out error of RLSC is minimized to select the optimal hyperparameter. The method is efficient and requires no validation samples. It was tested on the IDA datasets, and the results show that it is effective.
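For reference, the kernel-target alignment that AlignLoo maximizes can be computed as below; the RBF kernel, the candidate gamma grid, and the synthetic labels are assumptions for illustration, and the leave-one-out bound used for the hyperparameter is not shown.

import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def kernel_target_alignment(K, y):
    """A(K, yy^T) = <K, yy^T>_F / (||K||_F * ||yy^T||_F), with y in {-1, +1}^n."""
    Y = np.outer(y, y)
    return np.sum(K * Y) / (np.linalg.norm(K) * np.linalg.norm(Y))

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 3))
y = np.sign(X[:, 0] + 0.3 * rng.normal(size=150))

# Select the kernel width that maximizes alignment with the labels.
gammas = [0.01, 0.1, 0.5, 1.0, 5.0]
scores = {g: kernel_target_alignment(rbf_kernel(X, gamma=g), y) for g in gammas}
print(max(scores, key=scores.get), scores)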

12.
To solve the problem of automatic model selection for SVR (support vector regression), a gradient-descent-based method for optimizing the parameters of the support vector regression model is proposed. By minimizing the model selection criterion R^2 w^2 (the radius-margin bound), gradient descent over the kernel parameter set yields locally optimal model parameters. Based on Riemannian geometry, a conformal transformation suited to SVR is then proposed, which modifies the kernel function in a data-dependent way and further improves the generalization ability of SVR. Simulation results verify the effectiveness of the method.
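A sketch of a data-dependent conformal modification of a kernel, K~(x, x') = c(x) c(x') K(x, x'), is given below. The particular choice of c(x) as a sum of Gaussians centred on the support vectors follows Amari and Wu's conformal transformation and is an assumption here; the paper's exact formulation and parameter values may differ.

import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def conformal_kernel(X1, X2, centers, gamma=0.5, tau=1.0):
    K = rbf_kernel(X1, X2, gamma=gamma)                                  # base kernel K(x, x')
    c1 = rbf_kernel(X1, centers, gamma=1.0 / (2 * tau**2)).sum(axis=1)   # c(x)
    c2 = rbf_kernel(X2, centers, gamma=1.0 / (2 * tau**2)).sum(axis=1)   # c(x')
    return c1[:, None] * K * c2[None, :]

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
centers = X[:5]            # e.g. the support vectors of a first-pass SVR fit
K_tilde = conformal_kernel(X, X, centers)
print(K_tilde.shape)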

13.
In this article, annealing robust radial basis function networks (ARRBFNs), which consist of a radial basis function network and a support vector regression (SVR), together with an annealing robust learning algorithm (ARLA), are proposed for the prediction of chaotic time series with outliers. In order to overcome the initial structural problems of the proposed neural networks, the SVR is used to determine the number of hidden nodes, the initial kernel parameters, and the initial weights of the ARRBFNs. Then the ARLA, which can cope with outliers, is applied to tune the kernel parameters and weights of the ARRBFNs under the initial structure obtained from the SVR. Simulation results on the Mackey-Glass time series show that the proposed approach with different SVRs can handle outliers and offers a fast learning speed. Further simulation results are given to demonstrate the validity of the proposed method for chaotic time series with outliers.

14.
In this paper, an extreme learning machine (ELM) for the ε-insensitive error loss regression problem, formulated in the 2-norm as an unconstrained optimization problem in primal variables, is proposed. Since the objective function of this unconstrained optimization problem is not twice differentiable, the popular generalized Hessian matrix and smoothing approaches are considered, which lead to optimization problems whose solutions are determined using a fast Newton–Armijo algorithm. The main advantage of the algorithm is that at each iteration only a system of linear equations is solved. Numerical experiments on a number of interesting synthetic and real-world datasets compare the results of the proposed method with those of ELM using additive and radial basis function hidden nodes and of support vector regression (SVR) using a Gaussian kernel. The similar or better generalization performance of the proposed method on the test data, obtained in comparable computational time to ELM and SVR, clearly illustrates its efficiency and applicability.

15.
The choice of parameters significantly affects the prediction accuracy and generalization ability of the support vector regression machine. In view of this, a method that uses a multi-agent particle swarm optimization (MAPSO) algorithm to optimize these parameters is proposed, and a MAPSO support vector regression model is built for model predictive control of nonlinear systems, from which the optimal control law is derived. The algorithm is applied in simulations of nonlinear systems and compared with model predictive control methods based on support vector regression optimized by particle swarm optimization and by a genetic algorithm, as well as with an RBF neural network predictive control method. The results show that the proposed algorithm achieves better control performance and can be effectively applied to the control of nonlinear systems.

16.
Loadability limits are critical points of particular interest in voltage stability assessment, indicating how much a system can be stressed from a given state before reaching instability. Estimating the loadability margin of a power system is therefore essential in real-time voltage stability assessment. A new methodology is developed based on Support Vector Regression (SVR), the most common application form of Support Vector Machines (SVM). The proposed SVR methodology can successfully estimate the loadability margin under normal operating conditions and different loading directions. Compared with other mapping methods, SVR minimizes the generalization error in building the network. In this paper, the SVR input vector consists of the real and reactive power loads, while the target is lambda (the loading margin). To reduce both the mean square error and the prediction time of the SVR, the kernel type and SVR parameters are determined by a grid search based on 10-fold cross-validation to obtain the best SVR network. The results of the SVRs (nu-SVR and epsilon-SVR) are compared with RBF neural networks and validated on the IEEE 30-bus and IEEE 118-bus systems under different operating scenarios. The results demonstrate the effectiveness of the proposed method for on-line prediction of the loadability margins of a power system.
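A minimal sketch of the described parameter selection, choosing the SVR kernel type and parameters by grid search with 10-fold cross-validation, might look as follows; the parameter grid and the synthetic data standing in for the load and loading-margin samples are assumptions.

from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVR

X, y = make_regression(n_samples=300, n_features=6, noise=0.1, random_state=0)

param_grid = {
    "kernel": ["rbf", "poly", "sigmoid"],
    "C": [1, 10, 100],
    "gamma": ["scale", 0.01, 0.1, 1.0],
}

# epsilon-SVR; substituting sklearn.svm.NuSVR gives the nu-SVR variant compared in the paper.
search = GridSearchCV(SVR(), param_grid, cv=10,
                      scoring="neg_mean_squared_error", n_jobs=-1)
search.fit(X, y)
print(search.best_params_, -search.best_score_)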

17.
To improve the learning ability and generalization performance of the support vector regression algorithm, a method for jointly optimizing feature selection and the support vector regression parameters is proposed. The joint optimization method uses principal component analysis to generate a new feature set, measures regression accuracy by the mean squared error, and applies a real-coded immune genetic algorithm to solve the resulting optimization problem. Simulation results show that joint optimization achieves better regression accuracy than optimizing the features and the support vector regression parameters separately, and converges faster.
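A sketch of jointly tuning a PCA feature-extraction step and the SVR parameters is shown below; a randomized search stands in for the paper's real-coded immune genetic algorithm, and the data and parameter ranges are illustrative assumptions.

from scipy.stats import loguniform
from sklearn.datasets import make_regression
from sklearn.decomposition import PCA
from sklearn.model_selection import RandomizedSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

X, y = make_regression(n_samples=300, n_features=20, noise=0.1, random_state=0)

pipe = Pipeline([("scale", StandardScaler()), ("pca", PCA()), ("svr", SVR(kernel="rbf"))])

param_dist = {
    "pca__n_components": range(2, 16),
    "svr__C": loguniform(1e-1, 1e3),
    "svr__gamma": loguniform(1e-3, 1e1),
    "svr__epsilon": loguniform(1e-3, 1e0),
}

search = RandomizedSearchCV(pipe, param_dist, n_iter=40, cv=5,
                            scoring="neg_mean_squared_error", random_state=0)
search.fit(X, y)
print(search.best_params_)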

18.
To address the difficulty of modelling a time-varying sample set with a single support vector machine model, a new learning strategy is proposed. First, a self-organizing map (SOM) neural network and the k-means clustering algorithm are used to cluster the initial sample set. Then, for each clustered data set, the final model is built as an optimally weighted combination of support vector regression models with different kernel functions. Experiments show that the modelling accuracy of this learning strategy is better than that of a single support vector regression model.
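The sketch below illustrates the second step for a single clustered subset: several SVRs with different kernels are fitted and combined with non-negative weights. Choosing the weights by non-negative least squares on a held-out split is an assumption; the abstract does not state how the optimal weighting is computed.

import numpy as np
from scipy.optimize import nnls
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(200, 3))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.05 * rng.normal(size=200)

X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

models = [SVR(kernel="rbf", C=10.0, gamma=0.5),
          SVR(kernel="poly", degree=2, C=10.0),
          SVR(kernel="linear", C=10.0)]
for m in models:
    m.fit(X_tr, y_tr)

# Fit non-negative combination weights on the held-out predictions.
P = np.column_stack([m.predict(X_val) for m in models])
w, _ = nnls(P, y_val)
w /= w.sum()

def combined_predict(X_new):
    return np.column_stack([m.predict(X_new) for m in models]) @ w

print("weights:", w, "val MSE:", np.mean((combined_predict(X_val) - y_val) ** 2))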

19.
A Highway Freight Volume Prediction Model Based on Support Vector Regression
To improve the ability to forecast highway freight volume, the standard support vector regression method, based on the structural risk minimization principle, is applied to the highway freight volume forecasting problem. With appropriate parameters and kernel function selected, the method is used to predict the highway freight volume time series of Chengdu and is compared with artificial neural networks, linear regression analysis, and other methods; it achieves the smallest relative training error and relative test error.
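A minimal sketch of this kind of time-series SVR forecast, using lagged values as features, is given below; the synthetic series, lag order, and SVR settings are illustrative assumptions rather than the paper's actual data or parameters.

import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(loc=2.0, scale=1.0, size=40)) + 100  # synthetic trending series

def make_lagged(series, n_lags=3):
    # Each row holds the n_lags previous values; the target is the next value.
    X = np.column_stack([series[i:len(series) - n_lags + i] for i in range(n_lags)])
    y = series[n_lags:]
    return X, y

X, y = make_lagged(series, n_lags=3)
X_train, y_train, X_test, y_test = X[:-5], y[:-5], X[-5:], y[-5:]

scaler = StandardScaler().fit(X_train)
model = SVR(kernel="rbf", C=100.0, gamma=0.1, epsilon=0.1)
model.fit(scaler.transform(X_train), y_train)

pred = model.predict(scaler.transform(X_test))
print("relative test error:", np.abs(pred - y_test) / y_test)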

20.
Considering the characteristics of urban road vehicle queueing under incident conditions, the factors influencing the vehicle queueing system are considered from both spatial and temporal perspectives, and a dynamic support vector regression (SVR) model is built to predict vehicle queue length. Given how sensitive model performance is to parameter selection, a particle swarm optimization (PSO) method that uses the mean of the k-fold cross-validation (k-CV) mean squared error as its fitness function is proposed to optimize the SVR model parameters. The proposed PSO-SVR model is compared with SVR models optimized by k-CV and by a genetic algorithm (GA), as well as with a BP network prediction model; the experimental results show that the model has high prediction accuracy and generalization ability and is suitable for predicting vehicle queue length.
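A sketch of the PSO-SVR tuning loop, using the mean k-fold cross-validation MSE as the particle fitness, might look as follows; the swarm settings, the log-scaled search ranges for C and gamma, and the synthetic data are assumptions.

import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

X, y = make_regression(n_samples=200, n_features=4, noise=0.2, random_state=0)

def fitness(p):                      # p = [log10(C), log10(gamma)]
    model = SVR(kernel="rbf", C=10 ** p[0], gamma=10 ** p[1])
    return -cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error").mean()

rng = np.random.default_rng(0)
lo, hi = np.array([-1.0, -3.0]), np.array([3.0, 1.0])
pos = rng.uniform(lo, hi, size=(15, 2))             # 15 particles in 2-D parameter space
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmin()]

for _ in range(20):                                  # PSO main loop
    r1, r2 = rng.uniform(size=pos.shape), rng.uniform(size=pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    f = np.array([fitness(p) for p in pos])
    improved = f < pbest_f
    pbest[improved] = pos[improved]
    pbest_f[improved] = f[improved]
    gbest = pbest[pbest_f.argmin()]

print("best C=%.3g gamma=%.3g CV-MSE=%.4f" % (10 ** gbest[0], 10 ** gbest[1], pbest_f.min()))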

