Similar Documents (20 results)
1.
This paper develops an unsupervised discriminant projection (UDP) technique for dimensionality reduction of high-dimensional data in small sample size cases. UDP can be seen as a linear approximation of a multimanifold-based learning framework that takes both local and nonlocal quantities into account. UDP characterizes the local scatter as well as the nonlocal scatter, seeking a projection that simultaneously maximizes the nonlocal scatter and minimizes the local scatter. This characteristic makes UDP more intuitive and more powerful than locality preserving projection (LPP), a recent method that considers only the local scatter for clustering or classification tasks. The proposed method is applied to face and palm biometrics and is examined using the Yale, FERET, and AR face image databases and the PolyU palmprint database. The experimental results show that UDP consistently outperforms LPP and PCA, and outperforms LDA when the training sample size per class is small. This demonstrates that UDP is a good choice for real-world biometrics applications.
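For illustration, here is a minimal NumPy sketch of a UDP-style projection under assumptions not stated in the abstract: a k-nearest-neighbor graph defines "local" pairs, the criterion is the generalized eigenproblem between nonlocal and local scatter, and the function name, parameters, and regularization are hypothetical.

```python
import numpy as np

def udp_projection(X, k=5, d=2):
    """Sketch of a UDP-style projection (assumed formulation):
    maximize nonlocal scatter w^T S_N w relative to local scatter w^T S_L w.
    X: (n_samples, n_features); k: neighborhood size; d: output dimension."""
    n = X.shape[0]
    # k-nearest-neighbor adjacency, symmetrized
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    idx = np.argsort(D2, axis=1)[:, 1:k + 1]          # skip self at position 0
    H = np.zeros((n, n))
    H[np.repeat(np.arange(n), k), idx.ravel()] = 1.0
    H = np.maximum(H, H.T)
    diffs = X[:, None, :] - X[None, :, :]
    # local scatter from neighboring pairs, nonlocal from the rest
    S_L = np.einsum('ij,ijk,ijl->kl', H, diffs, diffs) / (2.0 * n * n)
    S_N = np.einsum('ij,ijk,ijl->kl', 1.0 - H, diffs, diffs) / (2.0 * n * n)
    # generalized eigenproblem S_N w = lambda S_L w (regularized for stability)
    evals, evecs = np.linalg.eig(
        np.linalg.solve(S_L + 1e-6 * np.eye(X.shape[1]), S_N))
    order = np.argsort(-evals.real)
    return evecs[:, order[:d]].real                    # (features, d)
```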

2.
Marginal Fisher analysis (MFA) aims not only to maintain the original relations of neighboring data points of the same class but also to separate neighboring data points of different classes. MFA can effectively overcome the limitations of linear discriminant analysis (LDA) arising from its data distribution assumption and limited available projection directions. However, MFA suffers from the undersampled problem. Generalized marginal Fisher analysis (GMFA), based on a new optimization criterion, is presented, which is applicable to undersampled problems. The solutions to the proposed criterion for GMFA are derived and can be characterized in closed form. Among the solutions, two specific algorithms, namely normal MFA (NMFA) and orthogonal MFA (OMFA), are studied, and methods to implement NMFA and OMFA are proposed. A comparative study on the undersampled problem of face recognition is conducted to evaluate NMFA and OMFA in terms of classification accuracy, which demonstrates the effectiveness of the proposed algorithms.

3.
Efficient and robust feature extraction by maximum margin criterion
In pattern recognition, feature extraction techniques are widely employed to reduce the dimensionality of data and to enhance the discriminatory information. Principal component analysis (PCA) and linear discriminant analysis (LDA) are the two most popular linear dimensionality reduction methods. However, PCA is not very effective for extracting the most discriminant features, and LDA is not stable due to the small sample size problem. In this paper, we propose some new (linear and nonlinear) feature extractors based on the maximum margin criterion (MMC). Geometrically, feature extractors based on MMC maximize the (average) margin between classes after dimensionality reduction. It is shown that MMC can represent class separability better than PCA. As a connection to LDA, we may also derive LDA from MMC by incorporating some constraints. By using other constraints, we establish a new linear feature extractor that does not suffer from the small sample size problem, which is known to cause serious stability problems for LDA. The kernelized (nonlinear) counterpart of this linear feature extractor is also established in the paper. Our extensive experiments demonstrate that the new feature extractors are effective, stable, and efficient.
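A minimal sketch of an MMC-style linear extractor, assuming the common formulation maximize tr(W^T (S_b − S_w) W) with orthonormal W; the function name and defaults are hypothetical.

```python
import numpy as np

def mmc_features(X, y, d=2):
    """Sketch of an MMC-style feature extractor (assumed form): take the
    top-d eigenvectors of S_b - S_w.  No matrix inversion is needed, so
    the small sample size problem does not arise."""
    classes = np.unique(y)
    mean = X.mean(axis=0)
    f = X.shape[1]
    S_b = np.zeros((f, f)); S_w = np.zeros((f, f))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        S_b += len(Xc) * np.outer(mc - mean, mc - mean)
        S_w += (Xc - mc).T @ (Xc - mc)
    S_b /= len(X); S_w /= len(X)
    evals, evecs = np.linalg.eigh(S_b - S_w)      # symmetric, so eigh
    return evecs[:, np.argsort(-evals)[:d]]       # top-d eigenvectors
```

The design point is that only an ordinary eigendecomposition of the difference matrix is solved, which is why this family of extractors stays stable where LDA's ratio criterion breaks down.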

4.
Linear discriminant analysis (LDA) often encounters the small sample size (SSS) problem for high-dimensional data. Null space linear discriminant analysis (NLDA) and linear discriminant analysis based on the generalized singular value decomposition (LDA/GSVD) are two popular methods that can solve the SSS problem of LDA. In this paper, we present the relation between NLDA and LDA/GSVD under a certain condition and at the same time propose a modified NLDA (MNLDA) algorithm that has the same discriminating power as LDA/GSVD but is more efficient. In addition, we compare the discriminating capability of NLDA and MNLDA and present our interpretation of the difference. Experimental results on the ORL, FERET, and Yale face databases and the PolyU FKP database support our viewpoints.
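As a hedged sketch of the NLDA baseline the abstract builds on (not the proposed MNLDA variant), the usual two steps are: restrict the data to the null space of the within-class scatter, then maximize the between-class scatter there. The tolerance and names below are assumptions.

```python
import numpy as np

def nlda(X, y, tol=1e-10):
    """Sketch of null-space LDA (assumed standard steps):
    (1) compute the null space of S_w, (2) maximize S_b inside it."""
    classes = np.unique(y)
    mean = X.mean(axis=0); f = X.shape[1]
    S_w = np.zeros((f, f)); S_b = np.zeros((f, f))
    for c in classes:
        Xc = X[y == c]; mc = Xc.mean(axis=0)
        S_w += (Xc - mc).T @ (Xc - mc)
        S_b += len(Xc) * np.outer(mc - mean, mc - mean)
    evals, evecs = np.linalg.eigh(S_w)
    N = evecs[:, evals < tol]                      # basis of null(S_w)
    Sb_null = N.T @ S_b @ N                        # S_b restricted to the null space
    e2, v2 = np.linalg.eigh(Sb_null)
    V = v2[:, np.argsort(-e2)[:len(classes) - 1]]  # at most C-1 directions
    return N @ V                                   # projection in input space
```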

5.
卢桂馥  林忠  金忠 《计算机科学》2010,37(5):251-253
A two-dimensional marginal Fisher discriminant analysis method based on the maximum scatter difference is proposed. The method uses the difference between the similarity matrix Sp, which describes the separability of between-class data, and the similarity matrix Sc, which describes the compactness of within-class data, as the discriminant criterion, thereby avoiding the small sample size problem encountered by marginal Fisher discriminant analysis. The proposed method works directly on image matrices and, compared with earlier image-vector-based methods, further improves recognition accuracy. In addition, the intrinsic relation between the maximum-difference-based marginal Fisher discriminant method and marginal Fisher discriminant analysis is revealed. Experiments on the ORL and Yale face databases show that the proposed method achieves a high recognition rate.
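An illustrative, assumed reading of the criterion (the paper's exact weighting is not reproduced here): with image matrices A_i and the two similarity matrices, the difference-based 2D criterion for a direction w can be written as

```latex
J(\mathbf{w}) \;=\; \mathbf{w}^{\top}\!\Big(\sum_{i,j} S^{p}_{ij}\,(A_i - A_j)^{\top}(A_i - A_j)\Big)\mathbf{w}
\;-\; \mathbf{w}^{\top}\!\Big(\sum_{i,j} S^{c}_{ij}\,(A_i - A_j)^{\top}(A_i - A_j)\Big)\mathbf{w}
```

Maximizing J reduces to the leading eigenvectors of a matrix difference, so no inversion is involved and the small sample size problem never appears.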

6.
A large family of algorithms, supervised or unsupervised and stemming from statistics or geometry theory, has been designed to provide different solutions to the problem of dimensionality reduction. Despite the different motivations of these algorithms, we present in this paper a general formulation known as graph embedding to unify them within a common framework. In graph embedding, each algorithm can be considered as the direct graph embedding, or its linear/kernel/tensor extension, of a specific intrinsic graph that describes certain desired statistical or geometric properties of a data set, with constraints from scale normalization or from a penalty graph that characterizes a statistical or geometric property to be avoided. Furthermore, the graph embedding framework can be used as a general platform for developing new dimensionality reduction algorithms. Utilizing this framework as a tool, we propose a new supervised dimensionality reduction algorithm called marginal Fisher analysis (MFA), in which the intrinsic graph characterizes the intraclass compactness and connects each data point with its neighboring points of the same class, while the penalty graph connects the marginal points and characterizes the interclass separability. We show that MFA effectively overcomes the limitations of the traditional linear discriminant analysis algorithm due to data distribution assumptions and available projection directions. Real face recognition experiments show the superiority of the proposed MFA in comparison with LDA, as well as for the corresponding kernel and tensor extensions.
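A minimal sketch of the linearized form of this framework, assuming the common Laplacian formulation: minimize the intrinsic-graph objective subject to a penalty-graph constraint, solved as a generalized eigenproblem. The function name and regularizer are hypothetical.

```python
import numpy as np

def linear_graph_embedding(X, W, B, d=2, reg=1e-6):
    """Sketch of linearized graph embedding (assumed unified form):
    minimize w^T X^T L X w  s.t.  w^T X^T L_B X w = 1, where L and L_B
    are the Laplacians of the intrinsic graph W and the penalty graph B."""
    L  = np.diag(W.sum(axis=1)) - W               # intrinsic graph Laplacian
    LB = np.diag(B.sum(axis=1)) - B               # penalty graph Laplacian
    A  = X.T @ L  @ X
    C  = X.T @ LB @ X + reg * np.eye(X.shape[1])  # regularize for stability
    evals, evecs = np.linalg.eig(np.linalg.solve(C, A))
    order = np.argsort(evals.real)                # smallest ratio first
    return evecs[:, order[:d]].real
```

Under this reading, MFA is the special case where W links same-class neighbors and B links marginal points of different classes.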

7.
Maximum scatter difference discriminant analysis and face recognition
Traditional Fisher linear discriminant analysis (LDA) inevitably encounters the small sample size problem in high-dimensional image recognition applications such as face recognition. A discriminant analysis method based on the scatter difference criterion is proposed. Unlike LDA, this method uses the difference between the between-class scatter and the within-class scatter of sample patterns, rather than their ratio, as the discriminant criterion, thereby fundamentally avoiding the difficulty caused by the singularity of the within-class scatter matrix. Experimental results on the ORL and AR face databases verify the effectiveness of the algorithm.
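A plausible sketch of the scatter difference criterion just described (some formulations weight S_w by a constant, which is an assumption not taken from the abstract):

```latex
J(\mathbf{w}) \;=\; \mathbf{w}^{\top} S_b\,\mathbf{w} \;-\; \mathbf{w}^{\top} S_w\,\mathbf{w},
\qquad
\max_{\|\mathbf{w}\|=1} J(\mathbf{w}) \;\Longleftrightarrow\; (S_b - S_w)\,\mathbf{w} \;=\; \lambda_{\max}\,\mathbf{w}
```

Because only an ordinary eigenproblem of S_b − S_w is solved, S_w is never inverted, which is exactly how the difference criterion sidesteps the singularity that breaks the Fisher ratio.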

8.
Maximal local interclass embedding with application to face recognition
Dimensionality reduction of high-dimensional data is involved in many information processing problems. A new dimensionality reduction approach called maximal local interclass embedding (MLIE) is developed in this paper. MLIE can be viewed as a linear approach within a multimanifold-based learning framework, in which neighborhood information is integrated with local interclass relationships. In MLIE, a local interclass graph and an intrinsic graph are constructed to find a set of projections that maximize the local interclass scatter and the local intraclass compactness simultaneously. This characteristic makes MLIE more powerful than marginal Fisher analysis (MFA), while maintaining all the advantages of MFA; moreover, the computational complexity of MLIE is less than that of MFA. The proposed algorithm is applied to face recognition, with experiments performed on the Yale, AR, and ORL face image databases. The experimental results show that, owing to its locally discriminating property, MLIE consistently outperforms MFA, smooth MFA, neighborhood preserving embedding, and locality preserving projection in face recognition.

9.
This paper develops a generalized nonlinear discriminant analysis (GNDA) method and deals with its small sample size (SSS) problems. GNDA is a nonlinear extension of linear discriminant analysis (LDA), and kernel Fisher discriminant analysis (KFDA) can be regarded as a special case of GNDA. In LDA, an undersampled or small sample size problem occurs when the sample size is less than the sample dimensionality, which results in the singularity of the within-class scatter matrix. Due to the high-dimensional nonlinear mapping in GNDA, small sample size problems arise rather frequently. To tackle this issue, this research presents five different schemes for GNDA to solve the SSS problems. Experimental results on real-world data sets show that these schemes for GNDA are very effective in tackling small sample size problems.

10.
Many pattern recognition applications involve the treatment of high-dimensional data and the small sample size problem. Principal component analysis (PCA) is a commonly used dimension reduction technique, and linear discriminant analysis (LDA) is often employed for classification. PCA plus LDA is a well-known framework for discriminant analysis in high-dimensional spaces and singular cases. In this paper, we examine the theory of this framework and find that, even when there is no small sample size problem, PCA dimension reduction cannot guarantee the subsequent successful application of LDA. We thus develop an improved discriminant analysis method by introducing an inverse Fisher criterion and adding a constraint in the PCA procedure so that the singularity phenomenon does not occur. Experimental results on face recognition suggest that this new approach works well and can be applied even when the number of training samples is one per class.
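For context, here is a sketch of the baseline PCA+LDA pipeline that the abstract analyzes; the paper's inverse-Fisher variant and its PCA constraint are not reproduced, and the parameter choices are assumptions.

```python
import numpy as np

def pca_plus_lda(X, y, p=20, d=2):
    """Sketch of the standard PCA+LDA framework: PCA keeps the top-p
    components, then LDA is solved in that p-dimensional subspace."""
    # --- PCA step ---
    mean = X.mean(axis=0)
    Xc = X - mean
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:p].T                                   # (features, p)
    Z = Xc @ P                                     # reduced data
    # --- LDA step in the PCA subspace ---
    classes = np.unique(y)
    m = Z.mean(axis=0)
    S_w = np.zeros((p, p)); S_b = np.zeros((p, p))
    for c in classes:
        Zc = Z[y == c]; mc = Zc.mean(axis=0)
        S_w += (Zc - mc).T @ (Zc - mc)
        S_b += len(Zc) * np.outer(mc - m, mc - m)
    evals, evecs = np.linalg.eig(
        np.linalg.solve(S_w + 1e-6 * np.eye(p), S_b))
    W = evecs[:, np.argsort(-evals.real)[:d]].real
    return P @ W                                   # combined projection
```

The abstract's point is that even when S_w is nonsingular in the PCA subspace, this pipeline can discard discriminative directions, which motivates constraining the PCA step.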

11.
Discriminative common vectors for face recognition
In face recognition tasks, the dimension of the sample space is typically larger than the number of samples in the training set. As a consequence, the within-class scatter matrix is singular and the linear discriminant analysis (LDA) method cannot be applied directly. This problem is known as the "small sample size" problem. In this paper, we propose a new face recognition method called the discriminative common vector method, based on a variation of Fisher's linear discriminant analysis for the small sample size case. Two different algorithms are given to extract the discriminative common vectors representing each person in the training set of the face database. One algorithm uses the within-class scatter matrix of the samples in the training set, while the other uses the subspace methods and the Gram-Schmidt orthogonalization procedure to obtain the discriminative common vectors. The discriminative common vectors are then used for classification of new faces. The proposed method yields an optimal solution for maximizing the modified Fisher's linear discriminant criterion given in the paper. Our test results show that the discriminative common vector method is superior to other methods in terms of recognition accuracy, efficiency, and numerical stability.
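A hedged sketch of the first (within-class-scatter) variant as commonly formulated: in the null space of S_w every sample of a class collapses to one "common vector," and the final projection maximizes the scatter of those vectors. Names and the tolerance are assumptions.

```python
import numpy as np

def discriminative_common_vectors(X, y, tol=1e-10):
    """Sketch of the DCV idea (assumed variant via null(S_w))."""
    classes = np.unique(y); f = X.shape[1]
    S_w = np.zeros((f, f))
    for c in classes:
        Xc = X[y == c]; mc = Xc.mean(axis=0)
        S_w += (Xc - mc).T @ (Xc - mc)
    evals, evecs = np.linalg.eigh(S_w)
    N = evecs[:, evals < tol]                      # basis of null(S_w)
    # common vector of each class: any of its samples projected onto null(S_w)
    commons = np.array([N @ (N.T @ X[y == c][0]) for c in classes])
    # final projection: directions of maximal scatter among the common vectors
    mc = commons.mean(axis=0)
    S_com = (commons - mc).T @ (commons - mc)
    e2, v2 = np.linalg.eigh(S_com)
    return v2[:, np.argsort(-e2)[:len(classes) - 1]]
```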

12.
Appearance-based methods, especially linear discriminant analysis (LDA), have been very successful in facial feature extraction, but the recognition performance of LDA is often degraded by the so-called "small sample size" (SSS) problem. One popular solution to the SSS problem is principal component analysis (PCA) + LDA (Fisherfaces), but LDA in other low-dimensional subspaces may be more effective. In this correspondence, we propose a novel fast feature extraction technique, bidirectional PCA (BDPCA) plus LDA (BDPCA + LDA), which performs LDA in the BDPCA subspace. Two face databases, the ORL and the Facial Recognition Technology (FERET) databases, are used to evaluate BDPCA + LDA. Experimental results show that BDPCA + LDA has lower computational and memory requirements and higher recognition accuracy than PCA + LDA.
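A minimal sketch of the BDPCA step under the usual assumed formulation: the row-wise and column-wise covariance matrices each yield a projection, and every image A is reduced to W_r^T A W_c before the LDA stage. Names and the kept dimensions are hypothetical.

```python
import numpy as np

def bdpca_transform(images, k_row=5, k_col=5):
    """Sketch of bidirectional PCA (assumed formulation)."""
    A = np.asarray(images, dtype=float)            # (n, h, w)
    Ac = A - A.mean(axis=0)
    # column-direction and row-direction scatter matrices
    S_col = np.einsum('nij,nik->jk', Ac, Ac) / len(A)   # (w, w)
    S_row = np.einsum('nij,nkj->ik', Ac, Ac) / len(A)   # (h, h)
    _, Vc = np.linalg.eigh(S_col); Wc = Vc[:, ::-1][:, :k_col]
    _, Vr = np.linalg.eigh(S_row); Wr = Vr[:, ::-1][:, :k_row]
    # reduced feature matrices W_r^T A W_c, flattened for the LDA stage
    Y = np.einsum('hi,nhw,wj->nij', Wr, Ac, Wc)
    return Y.reshape(len(A), -1)                   # (n, k_row * k_col)
```

Because the LDA then runs on k_row × k_col features instead of the full pixel vector, the memory and compute savings the abstract reports are plausible by construction.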

13.
张健  肖迪 《计算机工程与设计》2012,33(1):332-335,366
In facial feature extraction, linear discriminant analysis (LDA) is strongly affected by the high-frequency components introduced by factors such as illumination and pose, while low-frequency information that may carry important discriminative power is ignored. At the same time, face recognition is a small sample size problem, which causes severe degeneration of the within-class scatter matrix. To address these two problems, a face recognition method based on multiscale adaptive linear discriminant analysis (MA-LDA) is proposed and validated on the ORL and Yale face databases. MATLAB experimental results show that this method performs better than traditional methods.

14.
Linear discriminant analysis (LDA) often suffers from the small sample size problem when dealing with high-dimensional face data. Random subspace methods can effectively solve this problem by randomly sampling face features. However, it remains an open problem how to construct an optimal random subspace for discriminant analysis and how to perform the most efficient discriminant analysis on the constructed random subspace. In this paper, we propose a novel framework, random discriminant analysis (RDA), to handle this problem. Under the most suitable configuration of the principal subspace, the optimal reduced dimension of the face samples is determined so as to construct a random subspace in which all the discriminative information in the face space is distributed across the two principal subspaces of the within-class and between-class matrices. We then apply Fisherface and direct LDA, respectively, to the two principal subspaces for simultaneous discriminant analysis. The two sets of discriminant features from the dual principal subspaces are first combined at the feature level, and all the random subspaces are then further integrated at the decision level. With discriminating information fused at these two levels, our method can take full advantage of the useful discriminant information in the face space. Extensive experiments on different face databases demonstrate its effectiveness.

15.
Locality preserving projection (LPP) and linear discriminant analysis (LDA) are two effective one-dimensional feature extraction methods that are widely applied in face recognition. However, one-dimensional feature extraction destroys the structural information of a sample when the image is converted into a column vector, and extracting features requires an eigendecomposition of the covariance matrix, which easily becomes singular for high-dimensional, small-sample problems. This paper proposes fusing two-dimensional locality preserving projection (2DLPP) and two-dimensional linear discriminant analysis (2DLDA) at the feature level and applying the result to face recognition. Experiments on the AR face database show that this method achieves higher recognition performance than traditional LPP and LDA and can therefore serve as a new face recognition method.
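A minimal sketch of one common feature-level fusion scheme, assumed here since the abstract does not specify it: z-score normalize each feature set, then concatenate. F1 and F2 are hypothetical names for the flattened 2DLPP and 2DLDA features of the same samples.

```python
import numpy as np

def feature_level_fusion(F1, F2):
    """Sketch of feature-level fusion (assumed scheme: normalize, concatenate)."""
    z1 = (F1 - F1.mean(axis=0)) / (F1.std(axis=0) + 1e-12)
    z2 = (F2 - F2.mean(axis=0)) / (F2.std(axis=0) + 1e-12)
    return np.concatenate([z1, z2], axis=1)        # fused feature vectors
```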

16.
The purpose of conventional linear discriminant analysis (LDA) is to find an orientation that projects high-dimensional feature vectors of different classes into a more manageable low-dimensional space in the most discriminative way for classification. The LDA technique uses an eigenvalue decomposition (EVD) to find such an orientation, a computation that is usually adversely affected by the small sample size problem. In this paper we present a new direct LDA method (called gradient LDA) for computing the orientation, especially for the small sample size problem, using a gradient descent based approach. It also avoids discarding the null spaces of the within-class and between-class scatter matrices, which may contain discriminative information useful for classification.
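As a hedged stand-in for the paper's gradient LDA (the exact update rule is not given in the abstract), one can ascend the Fisher ratio directly, with no eigendecomposition and no inversion of a possibly singular S_w; the learning rate and step count are assumptions.

```python
import numpy as np

def gradient_lda_direction(S_b, S_w, lr=0.01, steps=2000, seed=0):
    """Sketch of a gradient-based search for one discriminant direction:
    ascend J(w) = (w^T S_b w) / (w^T S_w w) on the unit sphere."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(S_b.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(steps):
        b = w @ S_b @ w
        a = w @ S_w @ w + 1e-12                    # guard against division by 0
        grad = 2.0 * (S_b @ w * a - S_w @ w * b) / a**2
        w += lr * grad
        w /= np.linalg.norm(w)                     # keep w on the unit sphere
    return w
```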

17.
Principal component analysis (PCA) and linear discriminant analysis (LDA) are widely used in face recognition, but the high computational complexity of PCA prevents face recognition from meeting real-time requirements, and LDA suffers from the "small sample size" and "edge class" problems, which reduce recognition accuracy. To address these issues, a method fusing two-dimensional principal component analysis (2DPCA) with an improved linear discriminant analysis is proposed. 2DPCA extracts richer features than one-dimensional PCA while reducing the computational complexity. The improved LDA redefines the between-class scatter matrix and the Fisher criterion, overcoming the problems of the traditional algorithm while retaining the most discriminative information and improving the recognition rate. Experimental results show that this algorithm achieves a higher recognition rate than PCA and LDA and is well suited to face recognition tasks.
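For reference, a minimal sketch of the standard 2DPCA step mentioned above (the abstract's improved-LDA stage is not reproduced); the kept dimension k is an assumption.

```python
import numpy as np

def twod_pca(images, k=5):
    """Sketch of 2DPCA (assumed standard formulation): eigenvectors of the
    image covariance G = mean((A - Abar)^T (A - Abar)); each image A is
    reduced to the feature matrix A @ W without vectorization."""
    A = np.asarray(images, dtype=float)            # (n, h, w)
    Ac = A - A.mean(axis=0)
    G = np.einsum('nij,nik->jk', Ac, Ac) / len(A)  # (w, w) image covariance
    evals, evecs = np.linalg.eigh(G)
    W = evecs[:, ::-1][:, :k]                      # top-k eigenvectors
    return A @ W                                   # (n, h, k) feature matrices
```

Because G is only w × w rather than (h·w) × (h·w), the eigenproblem is far cheaper than vector PCA, which matches the complexity argument in the abstract.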

18.
Linear discriminant analysis (LDA) is a linear classifier that has proven to be powerful and competitive compared with the main state-of-the-art classifiers. However, the LDA algorithm assumes that the sample vectors of each class are generated from underlying multivariate normal distributions with a common covariance matrix but different means (i.e., homoscedastic data). This assumption has considerably restricted the use of LDA. Over the years, authors have defined several extensions of the basic formulation of LDA. One such method is heteroscedastic LDA (HLDA), proposed to address the heteroscedasticity problem; another is nonparametric DA (NDA), where the normality assumption is relaxed. In this paper, we propose a novel Bayesian logistic discriminant (BLD) model that can address both the normality and the heteroscedasticity problems. The normality assumption is relaxed by approximating the underlying distribution of each class with a mixture of Gaussians. Hence, the proposed BLD provides more flexibility and better classification performance than LDA, HLDA, and NDA. Subclass and multinomial versions of the BLD are proposed. The posterior distribution of the BLD model is elegantly approximated by a tractable Gaussian form using a variational transformation and Jensen's inequality, allowing a straightforward computation of the weights. An extensive comparison of the BLD with LDA, the support vector machine (SVM), HLDA, NDA, and subclass discriminant analysis (SDA), performed on artificial and real data sets, has shown the advantages and superiority of the proposed method. In particular, the experiments on face recognition have clearly shown a significant improvement of the proposed BLD over LDA.

19.
Improved linear discriminant analysis and face recognition
To effectively solve the small sample size problem of traditional LDA (linear discriminant analysis), an improved LDA algorithm is proposed. First, the samples undergo lossless dimensionality reduction; then, on the basis of the Fisher criterion, the difference of the scatter matrices is used in place of their ratio, which avoids inverting the within-class matrix and also reduces the computational complexity, achieving effective feature extraction; finally, face recognition is performed. Experimental results show that the algorithm is effective and outperforms the traditional LDA method.

20.
Face recognition using kernel direct discriminant analysis algorithms
Techniques that can introduce a low-dimensional feature representation with enhanced discriminatory power are of paramount importance in face recognition (FR) systems. It is well known that the distribution of face images, under perceivable variations in viewpoint, illumination, or facial expression, is highly nonlinear and complex. It is therefore not surprising that linear techniques, such as those based on principal component analysis (PCA) or linear discriminant analysis (LDA), cannot provide reliable and robust solutions to FR problems with complex face variations. In this paper, we propose a kernel machine-based discriminant analysis method that deals with the nonlinearity of the face patterns' distribution. The proposed method also effectively solves the so-called "small sample size" (SSS) problem that exists in most FR tasks. The new algorithm has been tested, in terms of classification error rate performance, on the multiview UMIST face database. Results indicate that the proposed methodology is able to achieve excellent performance with only a very small set of features, with error rates approximately 34% and 48% of those of two other commonly used kernel FR approaches, kernel PCA (KPCA) and generalized discriminant analysis (GDA), respectively.
