Similar Documents
20 similar documents found.
1.
An effective kernel method usually depends on choosing a suitable kernel function, and a current focus of kernel-method research is learning the kernel automatically from data. This paper proposes a kernel learning method based on an optimal classification criterion, which is similar to linear discriminant analysis and the kernel Fisher discriminant, and applies the algorithm to the design of a fuzzy support vector machine multi-class classifier. Experiments on the ORL face dataset and the Iris dataset verify the feasibility of the algorithm.

2.
In this paper, the method of kernel Fisher discriminant (KFD) is analyzed and its nature is revealed, i.e., KFD is equivalent to kernel principal component analysis (KPCA) plus Fisher linear discriminant analysis (LDA). Based on this result, a more transparent KFD algorithm is proposed. That is, KPCA is first performed and then LDA is used for a second feature extraction in the KPCA-transformed space. Finally, the effectiveness of the proposed algorithm is verified using the CENPARMI handwritten numeral database.
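If the equivalence holds, the two-stage procedure can be prototyped in a few lines. The sketch below only illustrates the KPCA-then-LDA idea using scikit-learn and the public digits dataset (the CENPARMI database is not publicly bundled); it is not the authors' implementation, and the kernel type and its parameters are arbitrary choices.

    from sklearn.datasets import load_digits
    from sklearn.decomposition import KernelPCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = load_digits(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    # Phase 1: KPCA maps the samples into a finite-dimensional subspace of the
    # kernel feature space; Phase 2: ordinary LDA performs a second feature
    # extraction (and classification) in that KPCA-transformed space.
    kfd_like = make_pipeline(
        StandardScaler(),
        KernelPCA(n_components=60, kernel="rbf", gamma=1e-3),
        LinearDiscriminantAnalysis(),
    )
    kfd_like.fit(X_tr, y_tr)
    print("test accuracy:", kfd_like.score(X_te, y_te))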

3.
This paper develops a generalized nonlinear discriminant analysis (GNDA) method and deals with its small sample size (SSS) problems. GNDA is a nonlinear extension of linear discriminant analysis (LDA), while kernel Fisher discriminant analysis (KFDA) can be regarded as a special case of GNDA. In LDA, an undersampled or small sample size problem occurs when the sample size is smaller than the sample dimensionality, which results in a singular within-class scatter matrix. Because of the high-dimensional nonlinear mapping in GNDA, small sample size problems arise rather frequently. To tackle this issue, this research presents five different schemes for GNDA to solve the SSS problems. Experimental results on real-world data sets show that these schemes for GNDA are very effective in tackling small sample size problems.
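For reference, the singularity mentioned here follows from a standard rank bound (not stated in the abstract itself): with n samples in c classes,
\[ S_w=\sum_{i=1}^{c}\sum_{x\in X_i}(x-\mu_i)(x-\mu_i)^{\top},\qquad \operatorname{rank}(S_w)\le n-c, \]
so S_w is necessarily singular whenever the (possibly mapped) feature dimensionality exceeds n − c, which is almost unavoidable after a high-dimensional nonlinear mapping.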

4.
To improve the classification accuracy of hyperspectral remote sensing images and make full use of both spectral and local information, this paper proposes a wavelet-kernel local Fisher discriminant analysis method for hyperspectral image feature extraction. A wavelet kernel function maps the data from the low-dimensional original space into a high-dimensional feature space; to account for local information, a weighting matrix is used when computing the scatter matrices, and the optimal feature matrix is obtained by solving the local Fisher discriminant criterion, so that samples of different classes become more separable in the high-dimensional feature space. Experiments on two public hyperspectral datasets show that the method improves both overall classification accuracy and the Kappa coefficient.

5.
Research on kernel learning machines
This paper surveys a recent hot topic in the field of machine learning: kernel learning machines. It first analyzes the main idea of kernel methods, then focuses on several recently developed kernel learning machines, including supervised algorithms such as the support vector machine and kernel Fisher discriminant analysis and unsupervised algorithms such as kernel principal component analysis, and finally discusses their applications and prospects.

6.
A new kernel linear discriminant analysis algorithm and its application to face recognition
Kernel Fisher discriminant (KFD) analysis, based on the kernel trick, has become one of the most effective methods for nonlinear feature extraction. However, previous KFD-based feature extraction procedures were formulated for two-class problems, and how to extract effective discriminant features from overlapping (outlying) samples had not been solved effectively. Drawing on fuzzy set theory, this paper uses fuzzy membership functions to incorporate the distribution information of the samples into the feature extraction process, and proposes a new kernel Fisher discriminant method: the fuzzy kernel discriminant analysis algorithm. Experimental results on the ORL face database verify the effectiveness of the algorithm.
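The abstract does not specify the membership function; a common choice in fuzzy Fisher-type methods is a k-nearest-neighbour membership in the style of Keller's fuzzy k-NN. The Python sketch below illustrates that assumed choice only and is not the paper's algorithm.

    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    def fuzzy_memberships(X, y, k=5):
        """Keller-style fuzzy k-NN memberships (an assumed, illustrative choice).

        Returns an (n_samples, n_classes) matrix whose rows sum to 1; samples
        surrounded by other classes (overlapping/outlying samples) receive a
        weaker membership in their own class."""
        classes = np.unique(y)
        nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
        _, idx = nn.kneighbors(X)        # the first neighbour is the point itself
        idx = idx[:, 1:]
        U = np.zeros((len(y), len(classes)))
        for i, neighbours in enumerate(idx):
            counts = np.array([(y[neighbours] == c).sum() for c in classes])
            U[i] = 0.49 * counts / k                        # neighbour-vote term
            U[i, np.searchsorted(classes, y[i])] += 0.51    # own-label term
        return U

Memberships of this kind can then be used to weight each sample's contribution to the scatter matrices during feature extraction.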

7.
Kernel Fisher discriminant analysis (KFDA) extracts a nonlinear feature from a sample by calculating as many kernel functions as there are training samples, so its computational cost grows with the size of the training set. In this paper we propose a more efficient approach to nonlinear feature extraction, FKFDA (fast KFDA). FKFDA consists of two parts. First, we select a portion of the training samples based on two criteria produced by approximating kernel principal component analysis (AKPCA) in the kernel feature space. Then, referring to the selected training samples as nodes, we formulate FKFDA to improve the efficiency of nonlinear feature extraction. In FKFDA, the discriminant vectors are expressed as linear combinations of the nodes in the kernel feature space, and extracting a feature from a sample only requires calculating as many kernel functions as there are nodes. Therefore, the proposed FKFDA has a much faster feature extraction procedure than naive kernel-based methods. Experimental results on face recognition and benchmark classification datasets suggest that the proposed FKFDA generates discriminative features that classify well.
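The saving can be written out explicitly. With m selected nodes z_1, ..., z_m (m much smaller than the number n of training samples), each discriminant vector and the feature it extracts take the form
\[ w=\sum_{i=1}^{m}\alpha_i\,\phi(z_i),\qquad f(x)=\langle w,\phi(x)\rangle=\sum_{i=1}^{m}\alpha_i\,k(z_i,x), \]
so extracting a feature costs m kernel evaluations instead of the n required by naive KFDA.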

8.
This paper proposes a robust discriminant analysis algorithm based on a kernel-induced distance measure (KI-RDA). KI-RDA not only naturally generalizes linear discriminant analysis (LDA), but also generalizes the recently proposed and powerful robust discriminant analysis based on nonparametric maximum entropy (MaxEnt-RDA). By adopting a robust radial basis kernel, KI-RDA can handle noisy data effectively and is also suited to nonlinear data with non-Gaussian distributions; its inherent robustness comes from characterizing the between-class and within-class scatter with a kernel-induced non-Euclidean distance in place of LDA's Euclidean distance. With these scatters, an LDA-like discriminant criterion is defined for feature extraction, leading to a corresponding nonlinear optimization problem. With an approximation strategy, the optimization problem is further converted into a directly solvable generalized eigenvalue problem, yielding a closed-form solution for the dimensionality-reducing transformation (matrix). Experiments on multi-class datasets verify the effectiveness of KI-RDA. Owing to the diversity of available kernels, KI-RDA in fact constitutes a general discriminant analysis framework.
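The kernel-induced distance referred to here is the standard feature-space distance (this identity is not quoted from the paper; it follows directly from the definition of a kernel):
\[ d_k(x,y)^2=\|\phi(x)-\phi(y)\|^2=k(x,x)-2k(x,y)+k(y,y). \]
For a radial basis kernel $k(x,y)=\exp(-\|x-y\|^2/2\sigma^2)$ this reduces to $d_k(x,y)^2=2\bigl(1-k(x,y)\bigr)\le 2$, so the influence of far-away (outlying) points saturates, which is one way to see the robustness claimed for KI-RDA.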

9.
A study of the discriminant matrices of commonly used Fisher discriminant functions
程正东, 章毓晋, 樊祥, 朱斌. 《自动化学报》, 2010, 36(10): 1361-1370.
In linear discriminant analysis (LDA), the ratio-trace function, the determinant-ratio function, and the trace-ratio function are three commonly used Fisher discriminant functions, and each yields an orthogonal discriminant (OD) matrix and an uncorrelated discriminant (UD) matrix. The main purpose of this paper is to analyze systematically how these six discriminant matrices are obtained and what properties they have, in order to clarify their connections and differences. When the within-class covariance matrix is nonsingular, the derivations and properties of the discriminant matrices of the ratio-trace and determinant-ratio functions, and of the OD matrix of the trace-ratio function, have already been studied; this paper supplements the derivation and properties of the UD matrix of the trace-ratio function, showing that it is the same matrix as the UD matrices of the ratio-trace and determinant-ratio functions, and that its discriminant function value does not exceed that of the corresponding OD matrix. When the within-class covariance matrix is singular, obtaining the six discriminant matrices becomes difficult; to overcome this, the paper first redefines the three discriminant functions using limits and then derives the six discriminant matrices by taking limits. The resulting derivations show that when all required discriminant vectors lie in the null space of the within-class covariance matrix, the six discriminant matrices are the same matrix.
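In the usual notation, with $S_b$ and $S_w$ the between-class and within-class scatter matrices and $W$ the projection matrix, the three criteria are commonly written as follows (standard forms, not copied from the paper):
\[ J_{\text{ratio-trace}}(W)=\operatorname{tr}\!\bigl((W^{\top}S_wW)^{-1}W^{\top}S_bW\bigr),\qquad J_{\det}(W)=\frac{|W^{\top}S_bW|}{|W^{\top}S_wW|},\qquad J_{\text{trace-ratio}}(W)=\frac{\operatorname{tr}(W^{\top}S_bW)}{\operatorname{tr}(W^{\top}S_wW)}. \]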

10.
Majid M., Andreas. 《Neurocomputing》, 2008, 71(7-9): 1238-1247.
In many applications, one is interested in detecting certain patterns in random process signals. We consider a class of random process signals that contain sub-similarities at random positions representing the texture of an object; such repetitive parts may occur in speech, musical pieces and sonar signals. We suggest a warped time-resolved spectrum kernel for extracting subsequence similarity in time series in general, and in biosonar signals as an example. Having a set of such kernels for similarity extraction over different subsequence sizes, we propose a new method to find an optimal linear combination of those kernels. We formulate the optimal kernel selection as maximizing the kernel Fisher discriminant (KFD) criterion and use the Mesh Adaptive Direct Search (MADS) method to solve the optimization problem. Our method is applied to biosonar landmark classification with promising results.

11.
Extracting optimal discriminant features is an important step in face recognition. For small-sample, high-dimensional face images, existing methods for extracting nonlinear discriminant features each have their own problems. This paper therefore proposes a new method for computing kernel Fisher nonlinear optimal discriminant features: it first uses the between-class and within-class scatter matrices in the feature space as the Fisher criterion to obtain the optimal nonlinear discriminant features, and then, to address the ill-posedness of this approach, solves for the optimal nonlinear discriminant vectors in the null space of the within-class scatter matrix. Experiments on the ORL face database show that the nonlinear optimal discriminant features extracted by this method clearly outperform the linear features of Fisher linear discriminant analysis (FLDA) and the nonlinear features of generalized discriminant analysis (GDA).
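The null-space step can be stated compactly (this is the standard null-space formulation, not necessarily the paper's exact derivation): among unit-norm feature-space directions that annihilate the within-class scatter, choose those maximizing the between-class scatter,
\[ w^{*}=\arg\max_{\,w\in\mathcal{N}(S_w^{\phi}),\ \|w\|=1\,} w^{\top}S_b^{\phi}w, \]
which sidesteps the ill-posed inversion of a singular within-class scatter matrix.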

12.
Kernel principal component analysis (KPCA) and kernel linear discriminant analysis (KLDA) are two commonly used and effective methods for dimensionality reduction and feature extraction. In this paper, we propose a KLDA method based on maximal class separability for extracting the optimal features of analog fault data sets, and compare it with principal component analysis (PCA), linear discriminant analysis (LDA) and KPCA. Meanwhile, a novel particle swarm optimization (PSO) based algorithm is developed to tune the parameters and structures of neural networks jointly. Our study shows that KLDA is overall superior to PCA, LDA and KPCA in feature extraction performance, and that the proposed PSO-based algorithm is convenient to implement and trains better than the back-propagation algorithm. The simulation results demonstrate the effectiveness of these methods.
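The abstract does not detail the PSO variant used; the Python sketch below shows a generic, minimal PSO loop of the kind used for such joint hyper-parameter tuning. The objective function, bounds, and parameter meanings are illustrative assumptions, not the paper's setup.

    import numpy as np

    def pso(objective, dim, n_particles=20, n_iters=100,
            bounds=(-1.0, 1.0), w=0.7, c1=1.5, c2=1.5, seed=0):
        """Minimal particle swarm optimizer (a generic sketch, not the paper's
        exact variant). Minimizes `objective` over a box-bounded search space."""
        rng = np.random.default_rng(seed)
        lo, hi = bounds
        x = rng.uniform(lo, hi, size=(n_particles, dim))   # particle positions
        v = np.zeros_like(x)                               # particle velocities
        pbest = x.copy()                                   # personal bests
        pbest_val = np.array([objective(p) for p in x])
        g = pbest[pbest_val.argmin()].copy()               # global best
        for _ in range(n_iters):
            r1, r2 = rng.random((2, n_particles, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.clip(x + v, lo, hi)
            vals = np.array([objective(p) for p in x])
            improved = vals < pbest_val
            pbest[improved] = x[improved]
            pbest_val[improved] = vals[improved]
            g = pbest[pbest_val.argmin()].copy()
        return g, pbest_val.min()

    # Usage (illustrative): encode, e.g., a hidden-layer size scale and a learning
    # rate in each particle and let `objective` return the validation error of the
    # network trained with those settings; pso(objective, dim=2) returns the best
    # settings found and their validation error.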

13.
This paper proposes a semi-supervised KFDA (kernel Fisher discriminant analysis) algorithm based on the low-density-separation geometric distance, called SemiGKFDA. The algorithm uses the low-density-separation geometric distance as its similarity measure and exploits a large number of unlabeled samples to improve the generalization ability of KFDA. First, a kernel function maps the original samples into a high-dimensional feature space; then, the labeled and unlabeled samples are used to build an intrinsic-structure consistency assumption under the low-density-separation geometric distance, which is integrated as a regularization term into the objective function of Fisher discriminant analysis; finally, the optimal projection matrix is obtained by solving the resulting minimization problem. Experiments on artificial and UCI data sets show that, compared with KFDA and its improved variants, the algorithm significantly improves classification performance. In addition, comparisons with other algorithms on face recognition show that the algorithm achieves higher recognition accuracy.
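The abstract does not give the exact objective; one common way such a structure-consistency regularizer enters a Fisher-type criterion (as in semi-supervised discriminant analysis, shown here only as an illustration of the general pattern) is
\[ \max_{w}\ \frac{w^{\top}S_b^{\phi}w}{\,w^{\top}S_w^{\phi}w+\gamma\,w^{\top}\Phi L\Phi^{\top}w\,}, \]
where $L$ is a graph Laplacian built from the labeled and unlabeled samples (here with edge weights derived from the low-density-separation geometric distance), $\Phi$ collects the mapped samples, and $\gamma$ controls the strength of the regularization.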

14.
A reformative kernel Fisher discriminant method is proposed that is derived directly from naive kernel Fisher discriminant analysis and is superior in classification efficiency. In the novel method only a part of the training patterns, called "significant nodes", need to be used when classifying a test pattern. A recursive algorithm for selecting the "significant nodes", which is the key of the novel method, is presented in detail. Experiments on benchmarks show that the novel method is effective and much more efficient in classification.

15.
This paper examines the theory of kernel Fisher discriminant analysis (KFD) in a Hilbert space and develops a two-phase KFD framework, i.e., kernel principal component analysis (KPCA) plus Fisher linear discriminant analysis (LDA). This framework provides novel insights into the nature of KFD. Based on this framework, the authors propose a complete kernel Fisher discriminant analysis (CKFD) algorithm. CKFD can be used to carry out discriminant analysis in "double discriminant subspaces." The fact that it can make full use of two kinds of discriminant information, regular and irregular, makes CKFD a more powerful discriminator. The proposed algorithm was tested and evaluated using the FERET face database and the CENPARMI handwritten numeral database. The experimental results show that CKFD outperforms other KFD algorithms.

16.
Linear discriminant analysis (LDA) is a simple but widely used algorithm in the area of pattern recognition. However, it has some shortcomings in that it is sensitive to outliers and limited to linearly separable cases. To solve these problems, in this paper a non-linear robust variant of LDA, called robust kernel fuzzy discriminant analysis (RKFDA), is proposed. RKFDA uses fuzzy memberships to reduce the effect of outliers and adopts kernel methods to accommodate non-linearly separable cases. There have been other attempts to solve the problems of LDA, including attempts using kernels. However, RKFDA, encompassing previous methods, is the most general one. Furthermore, theoretical analysis and experimental results show that RKFDA is superior to other existing methods in solving these problems.

17.
Fisher's linear discriminant analysis (LDA) is popular for dimension reduction and extraction of discriminant features in many pattern recognition applications, especially biometric learning. In deriving the Fisher's LDA formulation, there is an assumption that the class empirical mean is equal to its expectation. However, this assumption may not be valid in practice. In this paper, from the "perturbation" perspective, we develop a new algorithm, called perturbation LDA (P-LDA), in which perturbation random vectors are introduced to learn the effect of the difference between the class empirical mean and its expectation in Fisher criterion. This perturbation learning in Fisher criterion would yield new forms of within-class and between-class covariance matrices integrated with some perturbation factors. Moreover, a method is proposed for estimation of the covariance matrices of perturbation random vectors for practical implementation. The proposed P-LDA is evaluated on both synthetic data sets and real face image data sets. Experimental results show that P-LDA outperforms the popular Fisher's LDA-based algorithms in the undersampled case.

18.
Nonlinear canonical correlation discriminant analysis based on the kernelization principle
Canonical correlation discriminant analysis applies traditional canonical correlation analysis to discrimination problems and is an important class of feature extraction algorithms, but in essence it can only extract linear features of the data. The kernelization principle from statistical learning theory can extend such linear feature extraction algorithms to nonlinear ones. This paper studies how to apply this principle to canonical correlation discriminant analysis, proposes a kernel-based nonlinear canonical correlation discriminant analysis, and gives an adaptive learning algorithm for solving the resulting problem. Numerical experiments show that the nonlinear canonical correlation discriminant analysis derived from the kernelization principle is more effective than traditional canonical correlation discriminant analysis; in addition, the paper proves theoretically that the proposed method is equivalent to Fisher kernel discriminant analysis.

19.
A novel fuzzy nonlinear classifier, called kernel fuzzy discriminant analysis (KFDA), is proposed to deal with linearly non-separable problems. With kernel methods, KFDA can perform efficient classification in the kernel feature space. Through a nonlinear mapping, the input data are mapped implicitly into a high-dimensional kernel feature space where a nonlinear pattern appears linear. Unlike fuzzy discriminant analysis (FDA), which is based on Euclidean distance, KFDA uses a kernel-induced distance. Theoretical analysis and experimental results show that the proposed classifier compares favorably with FDA.

20.
This paper addresses the problem of automatically tuning multiple kernel parameters for the kernel-based linear discriminant analysis (LDA) method. The kernel approach has been proposed to solve face recognition problems under complex distribution by mapping the input space to a high-dimensional feature space. Some recognition algorithms such as the kernel principal components analysis, kernel Fisher discriminant, generalized discriminant analysis, and kernel direct LDA have been developed in the last five years. The experimental results show that the kernel-based method is a good and feasible approach to tackle the pose and illumination variations. One of the crucial factors in the kernel approach is the selection of kernel parameters, which highly affects the generalization capability and stability of the kernel-based learning methods. In view of this, we propose an eigenvalue-stability-bounded margin maximization (ESBMM) algorithm to automatically tune the multiple parameters of the Gaussian radial basis function kernel for the kernel subspace LDA (KSLDA) method, which is developed based on our previously developed subspace LDA method. The ESBMM algorithm improves the generalization capability of the kernel-based LDA method by maximizing the margin maximization criterion while maintaining the eigenvalue stability of the kernel-based LDA method. An in-depth investigation on the generalization performance on pose and illumination dimensions is performed using the YaleB and CMU PIE databases. The FERET database is also used for benchmark evaluation. Compared with the existing PCA-based and LDA-based methods, our proposed KSLDA method, with the ESBMM kernel parameter estimation algorithm, gives superior performance.
