Similar Documents
19 similar documents found (search time: 171 ms)
1.
Face Recognition Using Standardized LDA   Total citations: 13 (self-citations: 0, others: 13)
Linear discriminant analysis (LDA) is a widely used linear classification method for feature extraction. This paper proposes an LDA-based face recognition method, standardized LDA, which overcomes the drawbacks of traditional LDA. It redefines the between-class scatter matrix, adding to the original definition a variable weight function determined by the distances between classes, so that the selected projection directions separate the samples of the different classes more effectively. It also handles the matrix singularity problem in a reasonable and effective way: the null space of the within-class scatter matrix is retained, because this space contains the most discriminative information. Within this null space, the eigenvectors corresponding to the larger eigenvalues of the between-class scatter matrix are chosen as the final dimensionality-reduction transformation matrix. Experimental results show that, for face recognition, this method achieves a higher recognition rate than traditional LDA. Standardized LDA can also be applied to other image recognition problems.
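The null-space step described in this abstract is a well-known construction. Below is a minimal numpy sketch of that step only (the class-distance weight function on the between-class scatter is omitted); the function name and tolerance are my own:

```python
import numpy as np

def null_space_lda(X, y, n_components):
    """Minimal sketch: keep the null space of the within-class scatter S_w,
    then pick directions of largest between-class scatter inside it."""
    d = X.shape[1]
    mean_all = X.mean(axis=0)
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    # Null space of S_w: eigenvectors with (numerically) zero eigenvalues.
    # It is non-empty only in the small sample size regime (n < d).
    w, V = np.linalg.eigh(Sw)
    Q = V[:, w < 1e-8 * max(w.max(), 1.0)]
    # Inside that null space, maximize the between-class scatter.
    w2, V2 = np.linalg.eigh(Q.T @ Sb @ Q)
    order = np.argsort(w2)[::-1][:n_components]
    return Q @ V2[:, order]  # columns = final projection directions
```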

2.
A Face Recognition Algorithm Combining the Null Space Method and F-LDA   Total citations: 2 (self-citations: 0, others: 2)
王增锋  王汇源  冷严 《计算机应用》2005,25(11):2586-2588
Linear discriminant analysis (LDA) is a commonly used linear feature extraction method. When applied to face recognition, traditional LDA suffers from two main problems: 1) the small sample size problem, i.e., matrix singularity caused by an insufficient number of training samples; and 2) an optimization criterion that is not directly related to the recognition rate. This paper proposes a new LDA-based face recognition algorithm that solves both problems simultaneously. First, by redefining the within-class and between-class scatter matrices, a new null space method is derived. This new null space method is then combined with the F-LDA (fractional LDA) algorithm to obtain a feature extraction method that is more effective for face recognition. Experimental results show that the new algorithm achieves a high recognition rate.

3.
MLDA: An Improved Linear Discriminant Analysis Algorithm   Total citations: 1 (self-citations: 0, others: 1)
刘忠宝  王士同 《计算机科学》2010,37(11):239-242
Linear discriminant analysis (LDA) is a pattern recognition method that has been applied widely in pattern recognition, data analysis, and many other fields. LDA seeks projection directions that are effective for classification, but it breaks down when the sample dimensionality far exceeds the number of samples (the small sample size problem). To address this problem, an improved LDA algorithm, MLDA, is proposed. The algorithm scalarizes the within-class scatter matrix, thereby avoiding its inversion altogether. Experiments show that MLDA alleviates the small sample size problem of classical LDA to a considerable extent.
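The abstract does not spell out the scalarization, so the following numpy sketch is only one plausible reading: replace S_w by the scalar tr(S_w)/d times the identity, after which the Fisher eigenproblem needs no matrix inversion at all. The function name and this particular surrogate are assumptions, not the paper's stated formulation:

```python
import numpy as np

def mlda_projection(Sw, Sb, n_components):
    """Hypothetical sketch: with S_w replaced by (tr(S_w)/d) * I, the
    generalized problem S_w^{-1} S_b w = lam * w degenerates to an
    ordinary symmetric eigenproblem on S_b, so S_w is never inverted."""
    d = Sw.shape[0]
    sigma = np.trace(Sw) / d         # scalar surrogate for S_w
    w, V = np.linalg.eigh(Sb / sigma)
    order = np.argsort(w)[::-1][:n_components]
    return V[:, order]
```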

4.
Linear discriminant analysis (LDA) is a widely used linear classification method for feature extraction. Applying LDA directly to face recognition, however, runs into the small sample size problem and the rank limitation problem. To address these issues, this paper proposes an LDA algorithm based on combining matrices of multiple orders, called MLDA. The algorithm redefines the within-class scatter matrix Sw of traditional LDA, giving the classical Fisher criterion better robustness and adaptability. Comparative experiments on several face databases demonstrate the effectiveness of MLDA.

5.
Face Recognition Based on an Improved LDA Algorithm   Total citations: 1 (self-citations: 0, others: 1)
This paper proposes a face recognition algorithm based on an improved LDA. The algorithm overcomes the drawbacks of traditional LDA by redefining the between-class scatter matrix and the Fisher criterion, thereby retaining the most discriminative information and improving the recognition rate. Experimental results show that the algorithm is feasible and achieves a higher recognition rate than the traditional PCA+LDA algorithm.

6.
Existing solutions to the small sample size problem of linear discriminant analysis (LDA) do not make full use of LDA's four information subspaces when constructing the optimal projection subspace. This paper therefore proposes a two-stage LDA face recognition method based on two-dimensional principal component analysis (2D-PCA). It uses a subtraction operation to invert the eigenvalue matrices of the within-class and between-class scatter matrices, thereby handling the small sample size problem, and applies the Fisher criterion and a modified Fisher criterion in sequence to connect two projection subspaces, obtaining optimal projection directions that cover all four of LDA's information subspaces. 2D-PCA is used to preprocess the input samples and reduce the computational cost. Experiments on the ORL and YALE face databases show that, although training time increases slightly, the method achieves recognition rates of 92.5% and 95.8% respectively, outperforming other common LDA algorithms.
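The 2D-PCA preprocessing mentioned here is standard: it builds an image covariance matrix directly from the 2D face matrices instead of vectorizing them. A minimal numpy sketch of that step alone (function name mine; the two-stage LDA itself is not reproduced):

```python
import numpy as np

def two_d_pca(images, n_components):
    """2D-PCA sketch: images has shape (N, h, w); each image is projected
    onto the leading eigenvectors of the (w x w) image covariance matrix,
    shrinking the column dimension before LDA is applied."""
    mean_img = images.mean(axis=0)
    G = np.zeros((images.shape[2], images.shape[2]))
    for A in images:
        D = A - mean_img
        G += D.T @ D
    G /= len(images)
    w, V = np.linalg.eigh(G)
    P = V[:, np.argsort(w)[::-1][:n_components]]
    return np.stack([A @ P for A in images])  # shape (N, h, n_components)
```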

7.
Wavelet decomposition is applied to each image to extract the low-frequency subband, and an optimized linear discriminant analysis (LDA) algorithm is then used to find the optimal projection subspace, into which face features are mapped for classification and recognition. The method removes traditional LDA's requirement that the within-class scatter matrix be nonsingular and alleviates the problem of overlapping boundary classes, giving it a wider range of applications. Experiments show that it outperforms both traditional LDA and principal component analysis (PCA).
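The wavelet preprocessing amounts to keeping only the low-frequency approximation subband of a 2D discrete wavelet transform. A short sketch using PyWavelets; the wavelet family and decomposition depth are my guesses, not the paper's:

```python
import pywt  # PyWavelets

def low_frequency_subband(image, wavelet="haar", levels=2):
    """Keep only the approximation coefficients (cA) at each level,
    discarding the horizontal/vertical/diagonal detail subbands, so the
    image is shrunk before LDA is applied."""
    coeffs = image
    for _ in range(levels):
        coeffs, _details = pywt.dwt2(coeffs, wavelet)
    return coeffs
```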

8.
刘敬  张军英  赵峰 《控制与决策》2007,22(11):1250-1254
To better describe class boundary structure in the between-class scatter matrix of nonparametric linear discriminant analysis (LDA), this paper proposes SVM-kNN, a nonparametric between-class scatter matrix construction method that combines an SVM with the k-nearest-neighbor (kNN) rule. The method eliminates the distortion of class boundary structure information caused by samples that do not lie on the class boundary. The SVM-kNN nonparametric LDA method is applied to feature extraction from high-resolution range profiles measured in field experiments, and the recognition results are compared with those of weighted-kNN nonparametric LDA and the spectral-domain original-space method. The results show that SVM-kNN nonparametric LDA significantly improves recognition performance.

9.
Linear discriminant analysis (LDA) is a widely used feature extraction method whose goal is to maximize the ratio of between-class scatter to within-class scatter after feature extraction, so that the classes are optimally separable in the feature space. Because it projects the samples of all classes into a single feature space using one shared criterion, LDA ignores differences in the distributions of individual classes. This paper proposes class-specific LDA (CSLDA), which finds an optimal projection matrix for each class so that, after projection, that class is separated as well as possible from all other classes. Combining this method with the empirical kernel yields class-specific linear discriminant analysis in the empirical kernel space. Experimental results on artificial datasets and UCI datasets show that, in both the input space and the empirical kernel space, features extracted by CSLDA yield higher recognition rates than those extracted by LDA.
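The per-class projection can be illustrated with a one-vs-rest Fisher problem, for which the optimal direction has a closed form. This is an illustrative reading under my own assumptions, not the paper's exact criterion, and the empirical-kernel step is omitted:

```python
import numpy as np

def class_specific_directions(X, y):
    """For each class c, solve a two-class (c vs. rest) Fisher problem;
    the optimal direction is S_w^+ (m_c - m_rest). Each class thus gets
    its own projection instead of one shared subspace."""
    directions = {}
    for c in np.unique(y):
        Xc, Xr = X[y == c], X[y != c]
        mc, mr = Xc.mean(axis=0), Xr.mean(axis=0)
        Sw = (Xc - mc).T @ (Xc - mc) + (Xr - mr).T @ (Xr - mr)
        w = np.linalg.pinv(Sw) @ (mc - mr)  # pseudo-inverse guards SSS
        directions[c] = w / np.linalg.norm(w)
    return directions
```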

10.
Linear discriminant analysis (LDA) is a classical method in pattern recognition, but it struggles to overcome the small sample size problem. To address this, a hyperbolic cosine matrix discriminant analysis method (HCDA) is proposed. The method first defines the hyperbolic cosine matrix function and its eigensystem, and then exploits the properties of that eigensystem by introducing it into the Fisher criterion for feature extraction. HCDA has two advantages: a) it avoids the small sample size problem and can extract more discriminative information; b) it implicitly performs a nonlinear mapping that enlarges the distances between samples, and enlarges distances between samples of different classes more than distances between samples of the same class, which benefits classification. Experimental results on a handwritten digit database, a handwritten letter image database, and the Georgia Tech face image database show that HCDA achieves better recognition performance than representative methods for solving the small sample size problem of LDA.
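The key property the abstract relies on is that cosh maps every eigenvalue t of a symmetric scatter matrix to cosh(t) >= 1, so cosh(S_w) is always nonsingular. A hedged scipy sketch of that idea; the paper's exact criterion may differ:

```python
import numpy as np
from scipy.linalg import coshm, eigh

def hcda_projection(Sw, Sb, n_components):
    """Replace the scatter matrices by their matrix hyperbolic cosines;
    cosh(S_w) has eigenvalues >= 1, so the generalized Fisher
    eigenproblem is always well posed, even for small sample sizes."""
    Cw = coshm(Sw); Cw = (Cw + Cw.T) / 2  # re-symmetrize numerically
    Cb = coshm(Sb); Cb = (Cb + Cb.T) / 2
    w, V = eigh(Cb, Cw)                   # generalized eigenproblem
    order = np.argsort(w)[::-1][:n_components]
    return V[:, order]
```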

11.
Cancer classification is one of the major applications of microarray technology. When standard machine learning techniques are applied to cancer classification, they face the small sample size (SSS) problem of gene expression data. The SSS problem arises from the large dimensionality of the feature space (due to the large number of genes) compared to the small number of available samples. To overcome the SSS problem, the dimensionality of the feature space is reduced either through feature selection or through feature extraction. Linear discriminant analysis (LDA) is a well-known technique for feature-extraction-based dimensionality reduction. However, it cannot be applied directly to cancer classification because the within-class scatter matrix is singular due to the SSS problem. In this paper, we use the Gradient LDA technique, which avoids the singularity problem associated with the within-class scatter matrix, and show its usefulness for cancer classification. The technique is applied to three gene expression datasets: acute leukemia, small round blue-cell tumour (SRBCT), and lung adenocarcinoma. It achieves a lower misclassification error than several previous techniques.

12.
Linear discriminant analysis (LDA) is one of the most effective feature extraction methods in statistical pattern recognition. It extracts discriminant features by maximizing the so-called Fisher criterion, defined as the ratio of the between-class scatter matrix to the within-class scatter matrix. However, classification of high-dimensional statistical data is usually not amenable to standard pattern recognition techniques because of an underlying small sample size (SSS) problem. A popular approach to the SSS problem is the removal of non-informative features via subspace-based decomposition techniques. Motivated by this viewpoint, many elaborate subspace decomposition methods have been developed, especially in the context of face recognition, including Fisherface, direct LDA (D-LDA), complete PCA plus LDA (C-LDA), random discriminant analysis (RDA), and multilinear discriminant analysis (MDA). Nevertheless, how to search for a set of complete optimal subspaces for discriminant analysis remains an active research topic in the area of LDA. In this paper, we propose a novel discriminant criterion, called the optimal symmetrical null space (OSNS) criterion, which can be used to compute the maximal Fisher discriminant criterion combined with the minimal one. Using the reformed criterion, complete symmetrical subspaces based on the within-class and between-class scatter matrices are constructed, respectively. Unlike the traditional subspace learning criterion, which derives only one principal subspace, our approach obtains two null subspaces and their orthogonal complements through the optimization of the OSNS criterion. The algorithm based on OSNS therefore has the potential to outperform traditional LDA algorithms, especially in small sample size cases. Experimental results on the ORL, FERET, XM2VTS and NUST603 face image databases demonstrate the effectiveness of the proposed method.

13.
Maximum Scatter Difference Discriminant Analysis and Face Recognition   Total citations: 16 (self-citations: 3, others: 13)
Traditional Fisher linear discriminant analysis (LDA) inevitably encounters the small sample size problem in high-dimensional image recognition applications such as face recognition. This paper proposes a discriminant analysis method based on the scatter difference criterion. Unlike LDA, it uses the difference between the between-class scatter and the within-class scatter of the sample patterns, rather than their ratio, as the discriminant criterion, which fundamentally avoids the difficulty caused by a singular within-class scatter matrix. Experimental results on the ORL and AR face databases verify the effectiveness of the algorithm.
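The scatter-difference criterion replaces the Fisher ratio with tr(W^T (S_b - S_w) W), whose maximizer under orthonormality is simply the top eigenvectors of S_b - S_w. A minimal numpy sketch; the balance constant c is my addition (the abstract uses the plain difference, c = 1):

```python
import numpy as np

def msd_projection(Sw, Sb, n_components, c=1.0):
    """Maximize tr(W^T (S_b - c*S_w) W): take the eigenvectors of the
    symmetric matrix S_b - c*S_w with the largest eigenvalues. S_w is
    never inverted, so its singularity is irrelevant."""
    w, V = np.linalg.eigh(Sb - c * Sw)
    order = np.argsort(w)[::-1][:n_components]
    return V[:, order]
```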

14.
Face Recognition Based on a Symmetrical Linear Discriminant Analysis Algorithm   Total citations: 1 (self-citations: 0, others: 1)
王伟  张明 《计算机应用》2009,29(12):3352-3353
Because of the small sample size problem, the within-class scatter matrix is singular, so solving the generalized eigenequation of linear discriminant analysis (LDA) is an ill-posed singular problem. To solve it, this paper builds on existing algorithms by introducing mirror images to enlarge the sample set, and uses a null space method to find the optimal solution of the Fisher criterion function. Experiments on the standard ORL and Yale face databases show that the recognition performance is better than that of traditional LDA, independent component analysis (ICA), and two-dimensional symmetrical PCA (2DSPCA).
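The mirror-image augmentation is straightforward to sketch; the array layout (N, height, width) is my assumption:

```python
import numpy as np

def augment_with_mirrors(images):
    """Double the training set by appending each face's horizontal
    mirror before the scatter matrices are computed."""
    mirrored = images[:, :, ::-1]  # flip the width axis
    return np.concatenate([images, mirrored], axis=0)
```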

15.
Linear discriminant analysis (LDA) is one of the most popular supervised dimensionality reduction (DR) techniques; it obtains discriminant projections by maximizing the ratio of average-case between-class scatter to average-case within-class scatter. Two recent discriminant analysis (DA) algorithms, minimal distance maximization (MDM) and worst-case LDA (WLDA), obtain projections by optimizing worst-case scatters. In this paper, we develop a new LDA framework called LDA with worst-case between-class separation and average-case within-class compactness (WSAC), which maximizes the ratio of worst-case between-class scatter to average-case within-class scatter. This is achieved by relaxing the trace ratio optimization to a distance metric learning problem. Comparative experiments demonstrate its effectiveness. In addition, DA counterparts that use the local geometry of the data and the kernel trick can likewise be embedded into our framework and solved in the same way.

16.
The purpose of conventional linear discriminant analysis (LDA) is to find an orientation that projects high-dimensional feature vectors of different classes onto a more manageable low-dimensional space in the most discriminative way for classification. LDA finds such an orientation through an eigenvalue decomposition (EVD), a computation that is usually adversely affected by the small sample size problem. In this paper we present a new direct LDA method, called Gradient LDA, which computes the orientation by gradient descent and is designed especially for the small sample size problem. It also avoids discarding the null spaces of the within-class and between-class scatter matrices, which may contain discriminative information useful for classification.
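Since the paper's exact update rule is not given here, the following is only a generic gradient-ascent sketch of the Fisher criterion J(w) = (w^T S_b w)/(w^T S_w w), using the standard quotient-rule gradient; the learning rate, iteration count, and damping term are arbitrary choices of mine:

```python
import numpy as np

def gradient_fisher_direction(Sw, Sb, lr=0.01, n_iter=2000, seed=0):
    """Ascend the Fisher ratio J(w) by gradient steps instead of an
    eigenvalue decomposition, so S_w is never inverted."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(Sw.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        num = w @ Sb @ w
        den = w @ Sw @ w + 1e-12  # damping: den can vanish in null(S_w)
        grad = 2 * ((Sb @ w) * den - (Sw @ w) * num) / den**2
        w += lr * grad
        w /= np.linalg.norm(w)    # keep the direction unit-length
    return w
```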

17.
Feature extraction is an important preprocessing step in classification, particularly for high-dimensional data with a limited number of training samples. Conventional supervised feature extraction methods, such as linear discriminant analysis (LDA), generalized discriminant analysis, and non-parametric weighted feature extraction, need to calculate scatter matrices: within-class and between-class scatter matrices are used to formulate the class separability criterion. With a limited number of training samples, these matrices cannot be estimated accurately, so the classification accuracy of these methods falls in small sample size situations. To cope with this problem, a supervised feature extraction method called feature extraction using attraction points (FEUAP), which uses no statistical moments and therefore works well with limited training samples, was recently proposed. To take advantage of both FEUAP and LDA, this article combines them in a dyadic scheme. In the proposed scheme, similar classes are grouped hierarchically by the k-means algorithm so that a tree with several nodes is constructed, and the class of each pixel is determined from this tree: depending on the node, FEUAP is used when the training samples are limited and LDA when they are plentiful. The experimental results demonstrate the better performance of the proposed hybrid method compared with other supervised feature extraction methods in a small sample size situation.

18.
Linear discriminant analysis (LDA) is a data discrimination technique that seeks a transformation maximizing the ratio of the between-class scatter to the within-class scatter. While it has been applied successfully in several applications, it has two limitations, both related to underfitting. First, it fails to discriminate data with complex distributions, since the data in each class are assumed to be distributed in a Gaussian manner. Second, it can lose class-wise information, since it produces only one transformation over the entire range of classes. We propose three extensions of LDA to overcome these problems. The first extension addresses the first problem by modelling the within-class scatter with a PCA mixture model, which can represent more complex distributions. The second extension addresses the second problem by applying a different transformation to each class in order to provide class-wise features. The third extension combines these two modifications, representing each class with a PCA mixture model and applying a different transformation to each mixture component. All of the proposed extensions are shown to outperform LDA in terms of classification error on synthetic data classification, hand-written digit recognition, and alphabet recognition.

19.
This paper develops a generalized nonlinear discriminant analysis (GNDA) method and deals with its small sample size (SSS) problems. GNDA is a nonlinear extension of linear discriminant analysis (LDA), and kernel Fisher discriminant analysis (KFDA) can be regarded as a special case of GNDA. In LDA, an undersampled or small sample size problem occurs when the sample size is less than the sample dimensionality, which results in the singularity of the within-class scatter matrix. Because GNDA involves a high-dimensional nonlinear mapping, small sample size problems arise rather frequently. To tackle this issue, this paper presents five different schemes that allow GNDA to solve the SSS problems. Experimental results on real-world data sets show that these schemes are very effective in tackling small sample size problems.
