Similar Documents
20 similar documents found.
1.
Locating regions of interest is an important basis for extracting target features and for subsequent processing such as target recognition and tracking. Because both the spectral characteristics and the target shapes in large remote-sensing images are complex, the commonly used segmentation methods based on spectral features and region-growing techniques based on edges are not suitable. Approaching the fast localization of regions of interest in remote-sensing images from the perspective of pattern classification, a texture classification method based on a decision binary tree of support vector machines is proposed: binary classifiers are distributed over the tree nodes to form a multi-class support vector machine, which reduces both the number of classifiers and the number of repeatedly trained samples. Experimental results on SPOT imagery show that the method achieves fast localization of regions of interest with high classification accuracy.
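A minimal sketch of the decision-binary-tree arrangement of SVMs described above may help make the idea concrete; the class grouping, kernel, and parameter values are illustrative assumptions rather than the paper's actual settings (Python with scikit-learn):

```python
# Decision binary tree of SVMs: each internal node holds one binary SVM that
# separates the classes of its left subtree from those of its right subtree,
# so K classes need only K-1 classifiers instead of K(K-1)/2 one-vs-one SVMs.
import numpy as np
from sklearn.svm import SVC

class SVMTreeNode:
    def __init__(self, classes):
        self.classes = list(classes)
        self.svm = None
        self.left = None   # subtree for the "negative" half
        self.right = None  # subtree for the "positive" half

def build_tree(X, y, classes):
    node = SVMTreeNode(classes)
    if len(classes) == 1:                       # leaf: a single class remains
        return node
    half = len(classes) // 2
    left_cls, right_cls = classes[:half], classes[half:]
    mask = np.isin(y, classes)
    Xn, yn = X[mask], y[mask]
    labels = np.isin(yn, right_cls).astype(int)  # 0 = left group, 1 = right group
    node.svm = SVC(kernel="rbf", C=10.0, gamma="scale").fit(Xn, labels)  # assumed parameters
    node.left = build_tree(X, y, left_cls)
    node.right = build_tree(X, y, right_cls)
    return node

def predict_one(node, x):
    while len(node.classes) > 1:
        side = node.svm.predict(x.reshape(1, -1))[0]
        node = node.right if side == 1 else node.left
    return node.classes[0]
```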

2.
For the problem of fire-image texture recognition, an ICA fire-texture recognition algorithm based on the Gabor wavelet transform is proposed and optimized for the characteristics of fire-image textures. First, the image to be recognized is filtered with Gabor filters at different scales and orientations to obtain its feature images; the feature images are converted into feature vectors and used as the input of ICA to obtain a basis-vector subspace. The Gabor feature vectors of a test image are then projected onto the ICA subspace, and the resulting coefficient vector is used as the recognition feature; finally, a support vector machine performs the recognition. Comparative experiments against the plain Gabor-filter method and the plain ICA method show that the algorithm improves the recognition rate on fire texture images by more than 5% over the traditional methods, offering a new approach to fire-image recognition.
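As a rough illustration of the Gabor-then-ICA-then-SVM pipeline, the sketch below filters each image with a small Gabor bank, projects the resulting feature vectors onto an ICA subspace, and trains an SVM on the projections. The filter frequencies, orientation count, and component number are assumptions, and each bank response is summarized by simple statistics rather than the full feature images used in the paper:

```python
# Hedged sketch of a Gabor filter bank + ICA projection + SVM classifier.
import numpy as np
from skimage.filters import gabor
from sklearn.decomposition import FastICA
from sklearn.svm import SVC

def gabor_features(img, frequencies=(0.1, 0.2, 0.4), n_theta=4):
    """Filter an image with a small Gabor bank and return one feature vector."""
    feats = []
    for f in frequencies:
        for k in range(n_theta):
            real, imag = gabor(img, frequency=f, theta=k * np.pi / n_theta)
            mag = np.hypot(real, imag)            # magnitude response per pixel
            feats.extend([mag.mean(), mag.std()])  # simplified summary statistics
    return np.asarray(feats)

# train_imgs, train_labels, test_imgs are assumed to be provided elsewhere:
# X_train = np.stack([gabor_features(im) for im in train_imgs])
# ica = FastICA(n_components=8, random_state=0)          # assumed component count
# Z_train = ica.fit_transform(X_train)                   # project onto ICA subspace
# clf = SVC(kernel="rbf").fit(Z_train, train_labels)
# X_test = np.stack([gabor_features(im) for im in test_imgs])
# pred = clf.predict(ica.transform(X_test))
```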

3.
Remote-Sensing Image Classification Based on the SVM Algorithm and Texture Feature Extraction
Remote-sensing image classification is an important research direction in remote-sensing image processing. Traditional methods classify pixels by their gray values and ignore the rich texture information contained in remote-sensing imagery. By introducing windows of variable width, wavelet analysis can analyze the local information of a signal in the frequency and time domains simultaneously, and wavelet-based algorithms can effectively extract the texture information of an image. The support vector machine, a machine-learning algorithm proposed in the 1990s, is widely used for pattern recognition and classification. Here, wavelet texture extraction is combined with a support vector machine for remote-sensing image classification. The results show that SVM classification with texture features outperforms classification of the gray-level image alone.
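A minimal sketch of the wavelet-texture-plus-SVM idea: sub-band energies from a 2-D wavelet decomposition serve as the texture vector for each image patch. The wavelet family, decomposition level, and SVM settings are illustrative assumptions:

```python
# Wavelet sub-band energies as texture features, fed to an SVM classifier.
import numpy as np
import pywt
from sklearn.svm import SVC

def wavelet_energy_features(patch, wavelet="db4", level=3):
    """Return sub-band energies of a 2-D image patch as a texture vector."""
    coeffs = pywt.wavedec2(patch, wavelet=wavelet, level=level)
    feats = [np.mean(np.abs(coeffs[0]))]                  # approximation band
    for (cH, cV, cD) in coeffs[1:]:                       # detail bands per level
        feats.extend(np.mean(np.abs(c)) for c in (cH, cV, cD))
    return np.asarray(feats)

# patches and labels are assumed to come from the labelled remote-sensing image:
# X = np.stack([wavelet_energy_features(p) for p in patches])
# clf = SVC(kernel="rbf", C=10.0).fit(X, labels)
```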

4.
An iris recognition method based on extremum-weighted-average fractal dimension feature extraction and a support vector machine classifier is proposed. Morphological operations and a circular edge-detection operator are used to locate the iris, and the iris texture is mapped into polar coordinates. A new image fractal dimension, the extremum-weighted-average fractal dimension, is defined to extract iris features, and a support vector machine classifier matches the iris feature matrices. Experiments show that the resulting iris recognition system achieves a high recognition rate and fast speed.

5.
In visible-light aerial remote-sensing surveillance, strong noise such as sun glint and cloud shadow makes targets in the water difficult to detect directly. This paper proposes an automatic wake-texture extraction algorithm based on the directional Fourier energy spectrum and a support vector machine, which identifies targets accurately by extracting the wakes produced by their motion. The algorithm divides the image into sub-images of equal size and computes the Fourier energy spectrum of each sub-image; the texture orientation of each sub-image is obtained with an improved principal component analysis, the energy spectrum is divided into 20 regions according to this orientation, and the summed energy of each region is taken as a texture feature. The feature is translation- and rotation-invariant, and a support vector machine is used as the classifier. Experimental results show that the method accurately extracts the wake textures produced by moving targets.
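The directional Fourier energy-spectrum feature can be sketched as below: the power spectrum of each sub-image is partitioned into 20 angular sectors and the per-sector energies form the texture vector. The PCA-based orientation alignment described in the abstract is omitted here, and the details are assumptions rather than the paper's exact procedure:

```python
# Directional Fourier energy-spectrum features: sector sums of the 2-D power spectrum.
import numpy as np

def directional_spectrum_features(sub_img, n_sectors=20):
    spec = np.abs(np.fft.fftshift(np.fft.fft2(sub_img))) ** 2    # power spectrum
    h, w = spec.shape
    yy, xx = np.mgrid[0:h, 0:w]
    ang = np.arctan2(yy - h / 2.0, xx - w / 2.0) % np.pi         # orientation in [0, pi)
    sector = np.minimum((ang / np.pi * n_sectors).astype(int), n_sectors - 1)
    feats = np.array([spec[sector == k].sum() for k in range(n_sectors)])
    return feats / (feats.sum() + 1e-12)                          # normalized sector energies
```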

6.
A Ship Target Recognition Method for Remote-Sensing Images Based on Support Vector Machines
Li Yi, Xu Shoushi. 《计算机仿真》 (Computer Simulation), 2006, 23(6): 180-183
For the problem of ship target recognition in high-resolution remote-sensing images, a ship target classification method based on support vector machines is proposed. The support vector machine (SVM) is a new type of machine-learning method based on the structural risk minimization principle and has excellent learning ability. Compared with traditional methods, the SVM is not only simple in structure but also clearly better in technical performance, especially in generalization ability. This paper briefly introduces statistical learning theory and the SVM algorithm, applies SVMs to ship target recognition in remote-sensing images, and carries out comparative experiments against traditional ship recognition methods. The results show that the proposed classifier clearly outperforms other traditional classifiers, achieving a higher recognition rate.

7.
Texture Recognition Based on the Wold Model and Support Vector Machines
A new texture recognition method based on the Wold model and support vector machines is proposed, which effectively handles the difficulties that orientation and scale changes bring to texture recognition. The method first applies a Fourier transform and adaptive spectral decomposition to the texture image, and takes the sector-region and ring-region energy distribution parameters of the deterministic-field power spectrum as extended texture features. The energy-distribution features are then used to rotate the dominant texture orientation to 0°, and the co-occurrence matrix and wavelet-transform statistics of the rotated image are extracted as basic texture features. Recognition experiments on two image sets, each containing 25 classes of monochrome natural textures, show that the method achieves good recognition results.
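Gray-level co-occurrence matrix (GLCM) statistics are one common realization of the "basic texture features" mentioned above; the sketch below computes a few standard GLCM properties with assumed distances and angles, and presumes the spectral decomposition and orientation rotation have already been applied:

```python
# GLCM texture statistics (contrast, correlation, energy, homogeneity) for SVM input.
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # spelled 'greycomatrix' in older scikit-image
from sklearn.svm import SVC

def glcm_features(patch_u8, distances=(1, 2), angles=(0, np.pi/4, np.pi/2, 3*np.pi/4)):
    glcm = graycomatrix(patch_u8, distances=distances, angles=angles,
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "correlation", "energy", "homogeneity")
    return np.concatenate([graycoprops(glcm, p).ravel() for p in props])

# rotated_patches (uint8, orientation already aligned to 0 degrees) and labels assumed given:
# X = np.stack([glcm_features(p) for p in rotated_patches])
# clf = SVC(kernel="rbf").fit(X, labels)
```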

8.
River Detection in High-Resolution Remote-Sensing Images Using Support Vector Machines and Level Sets
Rivers are important geographic structural features, and research on river detection and recognition is of great significance in both military and civilian applications. A river detection algorithm for high-resolution remote-sensing images based on support vector machines (SVM) and level sets is proposed. First, according to the characteristics of river targets in high-resolution remote-sensing images, feature vectors are constructed from the texture features of sample images and the information-diffusion features of reference points, and an SVM classifier trained on the samples performs a coarse segmentation of the river target. Then, starting from the coarse segmentation, the distance-regularized level set evolution (DRLSE) model extracts the precise river contour and obtains the complete river region. Experiments on 1 m resolution IKONOS imagery show that the algorithm is accurate and flexible, can detect river regions correctly against complex backgrounds, and is widely applicable in practice.

9.
To improve the efficiency of automatic coal-gangue recognition and sorting, a method for recognizing coal gangue based on support vector machines (SVM) and texture is proposed. Images of two kinds of coal and one kind of gangue are taken as samples; after image preprocessing and analysis of gray-level and texture features, the gray-level mean, the maximum of the gray-level co-occurrence matrix, the second moment, contrast, correlation, and entropy are found to be effective features. On this basis, a support vector machine completes the automatic recognition, with the six parameters above used as its training features. Experimental results show that the SVM recognizes coal and gangue with a high success rate.

10.
Target Detection in Remote-Sensing Images Based on a One-Class SVM
When positive and negative samples are unbalanced, traditional support vector machine methods show a certain false-detection rate in target detection for remote-sensing images. This paper introduces the one-class SVM into this kind of target detection. Experiments show that the one-class SVM effectively reduces the false-detection rate and increases detection speed while sacrificing only a small amount of generalization ability.
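A one-class SVM of the kind referred to above can be set up in a few lines; feature extraction and candidate-window generation are assumed to happen elsewhere, and the nu and gamma values are illustrative:

```python
# One-class SVM target detection: the model is trained on target samples only.
import numpy as np
from sklearn.svm import OneClassSVM

# target_feats: feature vectors of known target windows; candidate_feats: windows to test.
ocsvm = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05)  # nu bounds the training outlier fraction
# ocsvm.fit(target_feats)
# is_target = ocsvm.predict(candidate_feats) == 1           # +1 inside the learned support, -1 outside
```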

11.
Texture classification using the support vector machines
Shutao, James T., Hailong, Yaonan. Pattern Recognition, 2003, 36(12): 2883-2893
In recent years, support vector machines (SVMs) have demonstrated excellent performance in a variety of pattern recognition problems. In this paper, we apply SVMs for texture classification, using translation-invariant features generated from the discrete wavelet frame transform. To alleviate the problem of selecting the right kernel parameter in the SVM, we use a fusion scheme based on multiple SVMs, each with a different setting of the kernel parameter. Compared to the traditional Bayes classifier and the learning vector quantization algorithm, SVMs, and, in particular, the fused output from multiple SVMs, produce more accurate classification results on the Brodatz texture album.
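The fusion of several SVMs, each with a different kernel parameter, can be sketched as soft voting over their probability outputs; the gamma grid and the voting rule are assumptions, since the paper's exact fusion scheme may differ:

```python
# Fuse several RBF SVMs with different kernel widths by soft voting.
from sklearn.svm import SVC
from sklearn.ensemble import VotingClassifier

gammas = [0.01, 0.1, 1.0, 10.0]                       # assumed grid of kernel parameters
estimators = [(f"svm_g{g}", SVC(kernel="rbf", gamma=g, probability=True)) for g in gammas]
fused = VotingClassifier(estimators=estimators, voting="soft")
# fused.fit(X_train, y_train)
# y_pred = fused.predict(X_test)
```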

12.
Model Selection for SVMs with the RBF Kernel and Its Application
SVMs (support vector machines) with the RBF kernel are widely used in pattern recognition. Model selection for this kind of SVM depends on two parameters: the penalty factor C and the kernel parameter σ². This paper uses two parameter-selection methods, grid search and bilinear search, combines the advantages of both, and applies them to offline handwritten English character recognition. Experiments on the NIST data set compare search efficiency and generalization (recognition) rate. The results also show that an SVM with optimal parameters achieves a considerably higher recognition rate than an ANN (artificial neural network) classifier.
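The coarse grid-search stage over the penalty C and the kernel width can be sketched with a standard cross-validated grid search; the grids below are illustrative, and a bilinear search would then refine the region around the best coarse-grid point:

```python
# Cross-validated grid search over the two RBF-SVM hyperparameters C and gamma.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

param_grid = {"C": np.logspace(-2, 4, 7), "gamma": np.logspace(-4, 2, 7)}  # assumed ranges
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5, n_jobs=-1)
# search.fit(X_train, y_train)
# print(search.best_params_, search.best_score_)
```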

13.
Gaussian mixture model (GMM) based approaches have been commonly used for speaker recognition tasks. Methods for estimation of parameters of GMMs include the expectation-maximization method which is a non-discriminative learning based method. Discriminative classifier based approaches to speaker recognition include support vector machine (SVM) based classifiers using dynamic kernels such as generalized linear discriminant sequence kernel, probabilistic sequence kernel, GMM supervector kernel, GMM-UBM mean interval kernel (GUMI) and intermediate matching kernel. Recently, the pyramid match kernel (PMK) using grids in the feature space as histogram bins and vocabulary-guided PMK (VGPMK) using clusters in the feature space as histogram bins have been proposed for recognition of objects in an image represented as a set of local feature vectors. In PMK, a set of feature vectors is mapped onto a multi-resolution histogram pyramid. The kernel is computed between a pair of examples by comparing the pyramids using a weighted histogram intersection function at each level of pyramid. We propose to use the PMK-based SVM classifier for speaker identification and verification from the speech signal of an utterance represented as a set of local feature vectors. The main issue in building the PMK-based SVM classifier is construction of a pyramid of histograms. We first propose to form hard clusters, using k-means clustering method, with increasing number of clusters at different levels of pyramid to design the codebook-based PMK (CBPMK). Then we propose the GMM-based PMK (GMMPMK) that uses soft clustering. We compare the performance of the GMM-based approaches, and the PMK and other dynamic kernel SVM-based approaches to speaker identification and verification. The 2002 and 2003 NIST speaker recognition corpora are used in evaluation of different approaches to speaker identification and verification. Results of our studies show that the dynamic kernel SVM-based approaches give a significantly better performance than the state-of-the-art GMM-based approaches. For speaker recognition task, the GMMPMK-based SVM gives a performance that is better than that of SVMs using many other dynamic kernels and comparable to that of SVMs using state-of-the-art dynamic kernel, GUMI kernel. The storage requirements of the GMMPMK-based SVMs are less than that of SVMs using any other dynamic kernel.

14.
In this paper, we develop a diagnosis model based on particle swarm optimization (PSO), support vector machines (SVMs) and association rules (ARs) to diagnose erythemato-squamous diseases. The proposed model consists of two stages: first, AR is used to select the optimal feature subset from the original feature set; then a PSO based approach for parameter determination of SVM is developed to find the best parameters of kernel function (based on the fact that kernel parameter setting in the SVM training procedure significantly influences the classification accuracy, and PSO is a promising tool for global searching). Experimental results show that the proposed AR_PSO–SVM model achieves 98.91% classification accuracy using 24 features of the erythemato-squamous diseases dataset taken from UCI (University of California at Irvine) machine learning database. Therefore, we can conclude that our proposed method is very promising compared to the previously reported results.

15.
Business failure prediction (BFP) is an effective tool to help financial institutions and relevant people to make the right decision in investments, especially in the current competitive environment. This topic belongs to a classification-type task, one of whose aims is to generate more accurate hit ratios. Support vector machine (SVM) is a statistical learning technique, whose advantage is its high generalization performance. The objective of this study is threefold. Firstly, SVM is used to predict business failure by utilizing a straightforward wrapper approach to help the model produce more accurate prediction. The wrapper approach is fulfilled by employing a forward feature selection method, composed of feature ranking and feature selection. Meanwhile, this work attempts to investigate the feasibility of using linear SVMs to select features for all SVMs in the wrapper, since non-linear SVMs tend to over-fit the data. Finally, a robust re-sampling approach is used to evaluate model performances for the task of BFP in China. In the empirical research, performances of linear SVM, polynomial SVM, Gaussian SVM, and sigmoid SVM with the best filter of stepwise MDA, and wrappers respectively using linear SVM and non-linear SVMs as evaluating functions are compared. The results indicate that the non-linear SVM with radial basis function kernel and features selected by linear SVM performs significantly better than all the other SVMs. Meanwhile, all SVMs with features selected by linear SVM produce at least as good performances as SVMs with other optimal features.
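The wrapper idea of selecting features with a linear SVM and then training a non-linear SVM on the chosen subset can be sketched as follows; scikit-learn's SequentialFeatureSelector stands in for the paper's own forward-selection procedure, and the feature count and SVM parameters are assumptions:

```python
# Forward feature selection driven by a linear SVM, followed by an RBF SVM on the subset.
from sklearn.svm import LinearSVC, SVC
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

selector = SequentialFeatureSelector(LinearSVC(C=1.0, dual=False),
                                     n_features_to_select=10, direction="forward", cv=5)
model = make_pipeline(StandardScaler(), selector, SVC(kernel="rbf", C=10.0, gamma="scale"))
# model.fit(X_train, y_train)   # financial-ratio features and failure labels assumed given
```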

16.
First, the all-important no free lunch theorems are introduced. Next, kernel methods, support vector machines (SVMs), preprocessing, model selection, feature selection, SVM software and the Fisher kernel are introduced and discussed. A hidden Markov model is trained on foreign exchange data to derive a Fisher kernel for an SVM; the DC algorithm and the Bayes point machine (BPM) are also used to learn the kernel on foreign exchange data. Further, the DC algorithm was used to learn the parameters of the hidden Markov model in the Fisher kernel, creating a hybrid algorithm. The mean net returns were positive for BPM; and BPM, the Fisher kernel, the DC algorithm and the hybrid algorithm were all improvements over a standard SVM in terms of both gross returns and net returns, but none achieved net returns as high as the genetic programming approach employed by Neely, Weller, and Dittmar (1997) and published in Neely, Weller, and Ulrich (2009). Two implementations of SVMs for Windows with semi-automated parameter selection are built.

17.
Asymptotic behaviors of support vector machines with Gaussian kernel
Keerthi SS, Lin CJ. Neural Computation, 2003, 15(7): 1667-1689
Support vector machines (SVMs) with the Gaussian (RBF) kernel have been popular for practical use. Model selection in this class of SVMs involves two hyperparameters: the penalty parameter C and the kernel width sigma. This letter analyzes the behavior of the SVM classifier when these hyperparameters take very small or very large values. Our results help in understanding the hyperparameter space that leads to an efficient heuristic method of searching for hyperparameter values with small generalization errors. The analysis also indicates that if complete model selection using the Gaussian kernel has been conducted, there is no need to consider linear SVM.

18.
The support vector machine (SVM) is one of the most popular classification tools, but training on large-scale data sets requires large amounts of memory and training time, and is usually feasible only in large parallel cluster environments. A new parallel SVM algorithm, RF-CCASVM, is proposed to solve large-scale SVMs on limited computing resources. Through a random Fourier mapping, an explicit low-dimensional feature map uniformly approximates the infinite-dimensional implicit feature map of the Gaussian kernel, so that a linear SVM uniformly approximates the Gaussian-kernel SVM. A parallelization method with consensus-center adjustment is proposed: the data set is partitioned into several subsets, and multiple processes train SVMs independently and in parallel on their own subsets; when the optimal hyperplane on each subset is about to be found, the current solution is replaced by the consensus-center solution obtained from all subsets, and training continues on each subset until the consensus-center solution is optimal on every subset. Comparative experiments on standard data sets verify the correctness and effectiveness of RF-CCASVM.
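The random-Fourier-feature part of the idea can be illustrated with a single-process sketch (the consensus-center parallelization is omitted): an explicit low-dimensional map approximates the Gaussian kernel so that a linear SVM can replace the kernel SVM. The gamma value and the number of components are assumptions:

```python
# Random Fourier features approximating the RBF kernel, followed by a linear SVM.
from sklearn.kernel_approximation import RBFSampler
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

model = make_pipeline(
    RBFSampler(gamma=0.5, n_components=2000, random_state=0),  # explicit random Fourier map
    LinearSVC(C=1.0, dual=False),
)
# model.fit(X_train, y_train)
# acc = model.score(X_test, y_test)
```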

19.
Support vector machines (SVMs) constitute a very powerful technique for pattern classification problems. However, their efficiency in practice depends highly on the selection of the kernel function type and the relevant parameter values. Selecting relevant features is another factor that can impact the performance of an SVM. The identification of the best set of parameter values for a classification model such as the SVM is considered an optimization problem. Thus, in this paper, we aim to simultaneously optimize the SVM parameters and the feature subset using different kernel functions. We cast this problem as a multi-objective optimization problem, where the classification accuracy, the number of support vectors, the margin and the number of selected features define our objective functions. To solve this optimization problem, a method based on the multi-objective genetic algorithm NSGA-II is suggested. A multi-criteria selection operator for our NSGA-II is also introduced. The proposed method is tested on some benchmark data sets. The experimental results show the efficiency of the proposed method, where features were reduced and the classification accuracy was improved.

20.
The evidence framework applied to support vector machines
We show that training of the support vector machine (SVM) can be interpreted as performing the level 1 inference of MacKay's evidence framework (1992). We further show that levels 2 and 3 of the evidence framework can also be applied to SVMs. This integration allows automatic adjustment of the regularization parameter and the kernel parameter to their near-optimal values. Moreover, it opens up a wealth of Bayesian tools for use with SVMs. Performance of this method is evaluated on both synthetic and real-world data sets.
