20 similar documents found (search time: 640 ms)
1.
Parameter selection of support vector regression based on hybrid optimization algorithm and its application　　Times cited: 1 (self-citations: 0; by others: 1)
Choosing optimal parameters for support vector regression (SVR) is an important step in SVR design, as it strongly affects the performance of SVR. In this paper, based on an analysis of the influence of the SVR parameters on the generalization error, a new two-step approach is proposed for selecting SVR parameters. First, the kernel function and SVR parameters are roughly optimized by a genetic algorithm; then the kernel parameter is finely adjusted by a local linear search. This approach has been successfully applied to a prediction model of the sulfur content in hot metal. The experimental results show that the proposed approach yields better generalization performance of SVR than other methods.
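The two-step idea in entry 1 (coarse global search, then fine local search on the kernel parameter) can be sketched as follows. For brevity a random search stands in for the genetic algorithm, and the data set and parameter ranges are illustrative assumptions, not taken from the paper:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(0, 0.1, 200)

def cv_error(C, gamma):
    # 5-fold cross-validation MSE as the generalization-error estimate
    model = SVR(C=C, gamma=gamma)
    return -cross_val_score(model, X, y, cv=5,
                            scoring="neg_mean_squared_error").mean()

# Step 1: coarse global search (GA stand-in) over log-spaced (C, gamma)
candidates = [(10.0 ** rng.uniform(-1, 3), 10.0 ** rng.uniform(-3, 1))
              for _ in range(30)]
C_best, g_best = min(candidates, key=lambda p: cv_error(*p))

# Step 2: fine local line search on the kernel parameter gamma only
best_err = cv_error(C_best, g_best)
for g in np.geomspace(g_best / 4, g_best * 4, 15):
    e = cv_error(C_best, g)
    if e < best_err:
        g_best, best_err = g, e

print(C_best, g_best, best_err)
```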
2.
The kernel function is a key component of support vector regression (SVR), and each kernel has its own strengths and weaknesses. Based on principles for selecting the parameters of SVR models, this paper presents an SVR with a mixed kernel function and, building on a grid-search-based multi-ant-colony algorithm, proposes a new method for optimizing the parameters of this mixed-kernel SVR. Taking minimization of the cross-validation error as the objective, the method optimizes five parameters, including the mixing ratio and the parameters of each kernel. Simulation results show that, compared with a genetic algorithm, the method performs well at parameter optimization and the resulting prediction model has higher accuracy.
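The mixed-kernel idea in entry 2 can be illustrated with a convex combination of an RBF and a polynomial kernel, with the mixing ratio chosen by minimizing cross-validation error. A simple grid over the mixing ratio stands in for the grid-search-based multi-ant-colony algorithm; the data and fixed parameter values are illustrative assumptions:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(150, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.05, 150)

def make_mixed_kernel(lam, gamma=0.5, degree=2):
    # K = lam * K_rbf + (1 - lam) * K_poly: a convex combination of
    # positive semidefinite kernels is itself a valid kernel
    def kernel(A, B):
        return (lam * rbf_kernel(A, B, gamma=gamma)
                + (1 - lam) * polynomial_kernel(A, B, degree=degree))
    return kernel

def cv_mse(lam):
    model = SVR(kernel=make_mixed_kernel(lam), C=10.0)
    return -cross_val_score(model, X, y, cv=5,
                            scoring="neg_mean_squared_error").mean()

# choose the mixing ratio that minimizes cross-validation error
best_lam = min(np.linspace(0.0, 1.0, 11), key=cv_mse)
print(best_lam, cv_mse(best_lam))
```

Note that the pure RBF (lam=1) and pure polynomial (lam=0) kernels are both included in the grid, so the mixed kernel can never do worse in cross-validation than either kernel alone.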
3.
To improve the learning ability and generalization performance of support vector regression (SVR), a hybrid selection algorithm for optimizing the SVR parameters is proposed. Based on information such as the size of the training set and the noise level, the value ranges of the SVR parameters are determined, and a real-coded immune genetic algorithm is used to search for the best parameter values. The hybrid selection algorithm achieves high accuracy and efficiency, and neither model complexity nor input dimensionality needs to be considered when selecting the SVR parameters. Simulation results show that the algorithm is an effective method for selecting SVR parameters and performs well when applied to function approximation problems.
4.
5.
A new approach is proposed for modeling nonlinear dynamic systems by combining a self-organizing feature map (SOM) with support vector regression (SVR) based on an expert system. The whole system has a two-stage neural network architecture. In the first stage, the SOM is used as a clustering algorithm to partition the input space into several disjoint regions; a hierarchical architecture is adopted in the partitioning to avoid having to predetermine the number of regions. In the second stage, multiple SVRs, also called SVR experts, are built to best fit each partitioned region by combining different SVR kernel functions, which eases the configuration and tuning of the SVRs. Finally, the new approach is applied to time-series prediction problems based on the Mackey-Glass differential equation and the Santa Fe data; the results show that the SVR experts yield an effective improvement in generalization performance in comparison with a single SVR model.
6.
Since the fitting accuracy and generalization ability of support vector regression (SVR) depend on the selection of its parameters, an SVR parameter selection method based on an improved free search (FS) algorithm is proposed and applied to traffic flow prediction. FS is a new evolutionary computation method; a catastrophe strategy based on relative density is introduced to improve the mechanism by which FS selects the initial positions of individuals, enlarging the search space and strengthening the global search ability. Rolling-prediction simulations on measured traffic flow show that the method is effective and feasible for optimizing the SVR parameters, and that, compared with empirical estimation and a genetic algorithm, the resulting SVR model has better generalization performance and prediction accuracy.
7.
This paper proposes an adaptive online prediction method for concentrate grade in mineral processing based on support vector regression, which improves prediction accuracy through a new mixed kernel function and an online parameter-update mechanism. After analyzing the properties of classical kernel functions, a mixed kernel is constructed to balance the model's learning ability and generalization ability. To improve the method's adaptability to the dynamics of the mineral processing production process, an adaptive parameter-adjustment mechanism is introduced, driven by the effect of new operating-condition samples on the statistical properties of the existing sample set, and an online iterative learning mechanism is used to update the model, which also speeds up computation. Experiments on real production data from a mineral processing plant show that the proposed method clearly outperforms existing methods in both computation time and prediction accuracy, making it well suited to the dynamically changing mineral processing production process.
8.
Jian Huang, Pong C. Yuen, Wen-Sheng Chen, Jian-Huang Lai 《IEEE transactions on systems, man, and cybernetics. Part B, Cybernetics》2007,37(4):847-862
This paper addresses the problem of automatically tuning multiple kernel parameters for the kernel-based linear discriminant analysis (LDA) method. The kernel approach has been proposed to solve face recognition problems under complex distributions by mapping the input space to a high-dimensional feature space. Recognition algorithms such as kernel principal component analysis, kernel Fisher discriminant, generalized discriminant analysis, and kernel direct LDA have been developed in the last five years, and experimental results show that the kernel-based approach is a good and feasible way to tackle pose and illumination variations. One of the crucial factors in the kernel approach is the selection of the kernel parameters, which strongly affects the generalization capability and stability of kernel-based learning methods. In view of this, we propose an eigenvalue-stability-bounded margin maximization (ESBMM) algorithm to automatically tune the multiple parameters of the Gaussian radial basis function kernel for the kernel subspace LDA (KSLDA) method, which builds on our previously developed subspace LDA method. The ESBMM algorithm improves the generalization capability of the kernel-based LDA method by maximizing the margin criterion while maintaining the eigenvalue stability of the kernel-based LDA method. An in-depth investigation of generalization performance along the pose and illumination dimensions is performed using the YaleB and CMU PIE databases, and the FERET database is used for benchmark evaluation. Compared with existing PCA-based and LDA-based methods, the proposed KSLDA method with the ESBMM kernel parameter estimation algorithm gives superior performance.
9.
Applying algorithms designed for the SVC model to the SVR model generally requires the SVR kernel to be positive definite and to satisfy Mercer's condition. In practice, however, when an SVC model is converted into a corresponding SVR model via the geometric framework, the positive definiteness of the resulting SVR kernel usually cannot be guaranteed, so the SVR model is not a convex program and cannot be solved. To address this problem, this paper proposes an extended sequential minimal optimization (SMO) method for solving SVR models with non-positive-definite kernels, designs the working-set selection criterion for the algorithm, and solves the problem of choosing the current optimal values of the working-set variables. Because the algorithm does not require a positive-definite kernel, it broadens the range of kernels available to SVR models. Experiments show that the algorithm achieves good generalization performance and regression accuracy for SVR models with both positive-definite and non-positive-definite kernels, and has theoretical and practical value.
10.
An Incremental Learning Strategy for Support Vector Regression　　Times cited: 1 (self-citations: 0; by others: 1)
The support vector machine (SVM) provides good generalization performance but requires a large amount of computation. This paper presents an incremental learning strategy for support vector regression (SVR). The new method first formulates an explicit expression for ||W||^2 by constructing an orthogonal basis in feature space together with a basic Hilbert space identity, and then finds the regression function by minimizing this formula for ||W||^2 rather than by solving a convex programming problem. In particular, we combine the minimization of ||W||^2 with kernel selection, which can lead to good generalization performance. The presented method not only provides a novel way to perform incremental SVR learning, but also opens an opportunity for model selection in SVR. An artificial data set, a benchmark data set, and a real-world data set are employed to evaluate the method. The simulations support the feasibility and effectiveness of the proposed approach.
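The incremental strategy in entry 10 hinges on an explicit expression for ||W||^2 in feature space. As a small illustration of the underlying identity (not of the paper's algorithm), ||W||^2 for a trained SVR can be computed from the dual coefficients as a^T K a, since W = Σ_i a_i φ(x_i); the data below is an illustrative assumption:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X).ravel()

model = SVR(kernel="rbf", gamma=0.5, C=10.0).fit(X, y)
a = model.dual_coef_.ravel()        # signed dual coefficients a_i
SV = model.support_vectors_        # support vectors x_i
K = rbf_kernel(SV, SV, gamma=0.5)  # Gram matrix over the support vectors
w_norm_sq = a @ K @ a              # ||W||^2 = sum_ij a_i a_j K(x_i, x_j)
print(w_norm_sq)
```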
11.
12.
13.
Yu-Yi Fu, Chia-Ju Wu, Chia-Nan Ko, Jin-Tsong Jeng, Li-Chun Lai 《Artificial Life and Robotics》2009,14(1):29-33
In this article, annealing robust radial basis function networks (ARRBFNs), which combine a radial basis function network with support vector regression (SVR), together with an annealing robust learning algorithm (ARLA), are proposed for the prediction of chaotic time series with outliers. To overcome the initial-structure problems of the proposed neural networks, the SVR is used to determine the number of hidden nodes, the initial kernel parameters, and the initial weights of the ARRBFNs. The ARLA, which can cope with outliers, is then applied to tune the kernel parameters and the weights of the ARRBFNs under the initial structure obtained from the SVR. Simulation results on the Mackey-Glass time series show that the proposed approach with different SVRs can cope with outliers and learns quickly, demonstrating the validity of the proposed method for chaotic time series with outliers.
14.
In this paper, an extreme learning machine (ELM) is proposed for the ε-insensitive error loss function-based regression problem, formulated in the 2-norm as an unconstrained optimization problem in primal variables. Since the objective function of this unconstrained optimization problem is not twice differentiable, the popular generalized Hessian matrix and smoothing approaches are considered, leading to optimization problems whose solutions are obtained with a fast Newton–Armijo algorithm. The main advantage of the algorithm is that at each iteration only a system of linear equations must be solved. In numerical experiments on a number of interesting synthetic and real-world datasets, the results of the proposed method are compared with those of ELM using additive and radial basis function hidden nodes and of support vector regression (SVR) using a Gaussian kernel. The similar or better generalization performance of the proposed method on the test data, achieved in comparable computational time, clearly illustrates its efficiency and applicability.
15.
Since parameter selection significantly affects the prediction accuracy and generalization ability of support vector regression (SVR), a multi-agent particle swarm optimization (MAPSO) method is proposed for tuning its parameters, and a MAPSO-optimized SVR model is built for model predictive control of nonlinear systems, with the optimal control law derived. Simulations on a nonlinear system, compared with model predictive control based on SVR optimized by standard particle swarm optimization or by a genetic algorithm, and with RBF neural network predictive control, show that the proposed algorithm achieves better control performance and can be effectively applied to nonlinear system control.
16.
Loadability limits are critical points of particular interest in voltage stability assessment, indicating how much a system can be stressed from a given state before reaching instability; estimating the loadability margin of a power system is therefore essential in real-time voltage stability assessment. A new methodology is developed based on support vector regression (SVR), the most common application form of support vector machines (SVMs). The proposed SVR methodology can successfully estimate the loadability margin under normal operating conditions and under different loading directions. Compared with other mapping methods, SVR builds the network by minimizing the generalization error. In this paper, the SVR input vector is the real and reactive power load, while the target is lambda (the loading margin). To reduce both the mean squared error and the prediction time, the kernel type and the SVR parameters are determined by a grid search based on 10-fold cross-validation. The results of the SVRs (nu-SVR and epsilon-SVR) are compared with RBF neural networks and validated on the IEEE 30-bus and IEEE 118-bus systems under different operating scenarios. The results demonstrate the effectiveness of the proposed method for on-line prediction of the loadability margins of a power system.
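The model-selection procedure described in entry 16 (kernel type and SVR parameters chosen by grid search with 10-fold cross-validation, for both epsilon-SVR and nu-SVR) can be sketched directly with scikit-learn. The synthetic load/margin data below is an illustrative stand-in for the power-system measurements used in the paper:

```python
import numpy as np
from sklearn.svm import SVR, NuSVR
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(3)
X = rng.uniform(0, 1, size=(120, 2))          # real and reactive power load
margin = 1.0 - 0.6 * X[:, 0] - 0.3 * X[:, 1]  # toy loadability margin (lambda)

# grid over kernel type and SVR parameters, scored by 10-fold CV MSE
grid = {"kernel": ["rbf", "linear"],
        "C": [1.0, 10.0, 100.0],
        "gamma": ["scale", 0.5]}

results = {}
for name, est in [("epsilon-SVR", SVR()), ("nu-SVR", NuSVR())]:
    search = GridSearchCV(est, grid, cv=10,
                          scoring="neg_mean_squared_error").fit(X, margin)
    results[name] = (search.best_params_, -search.best_score_)
print(results)
```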
17.
To improve the learning ability and generalization performance of support vector regression (SVR), a joint optimization method for feature selection and the SVR parameters is proposed. The method uses principal component analysis to generate a new feature set, evaluates regression accuracy with the mean squared error as the objective, and applies a real-coded immune genetic algorithm to solve the optimization problem. Simulation results show that joint optimization yields better regression accuracy than optimizing the features and the SVR parameters separately, and converges faster.
18.
19.
20.
Considering the characteristics of urban road vehicle queueing under incident conditions, the factors affecting the queueing system are treated jointly from a spatio-temporal perspective, and a dynamic support vector regression (SVR) model is built to predict vehicle queue length. Given the sensitivity of model performance to parameter selection, a particle swarm optimization (PSO) method using the mean of the k-fold cross-validation (k-CV) mean squared errors as the fitness function is proposed to tune the SVR parameters. Compared with SVR models optimized by k-CV and by a genetic algorithm (GA), and with a BP network prediction model, experimental results show that the proposed PSO-SVR model has higher prediction accuracy and better generalization ability, and is well suited to predicting vehicle queue length.
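The PSO-SVR scheme in entry 20 can be sketched as a small particle swarm searching (C, gamma) in log-space, with the mean k-fold cross-validation MSE as the fitness. The swarm size, iteration count, inertia/acceleration constants, and the synthetic data are all illustrative assumptions; the paper's queue-length data is not reproduced:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
X = rng.uniform(-3, 3, size=(150, 1))
y = np.sin(X).ravel() + rng.normal(0, 0.1, 150)

def fitness(p):
    # p = (log10 C, log10 gamma); fitness = mean 5-fold CV MSE (lower is better)
    model = SVR(C=10.0 ** p[0], gamma=10.0 ** p[1])
    return -cross_val_score(model, X, y, cv=5,
                            scoring="neg_mean_squared_error").mean()

n, dim, iters = 10, 2, 15
lo, hi = np.array([-1.0, -3.0]), np.array([3.0, 1.0])  # search bounds
pos = rng.uniform(lo, hi, size=(n, dim))
vel = np.zeros((n, dim))
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    # standard PSO update: inertia + cognitive + social terms
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    f = np.array([fitness(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print(10.0 ** gbest[0], 10.0 ** gbest[1], pbest_f.min())
```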