Found 20 similar documents (search time: 15 ms)
1.
Yong Xu, Jing-yu Yang. Pattern Recognition, 2004, 37(6): 1299-1302
A reformative kernel Fisher discriminant method is proposed, derived directly from the naive kernel Fisher discriminant analysis, with superior classification efficiency. In the novel method only a subset of the training patterns, called "significant nodes", needs to be used when classifying a test pattern. A recursive algorithm for selecting the "significant nodes", which is the key of the novel method, is presented in detail. Experiments on benchmarks show that the novel method is effective and highly efficient in classification.
2.
Fast kernel Fisher discriminant analysis via approximating the kernel principal component analysis
Jinghua Wang, Qin Li, Jane You, Qijun Zhao. Neurocomputing, 2011, 74(17): 3313-3322
Kernel Fisher discriminant analysis (KFDA) extracts a nonlinear feature from a sample by calculating as many kernel functions as there are training samples. Thus, its computational efficiency is inversely proportional to the size of the training sample set. In this paper we propose a more efficient approach to nonlinear feature extraction, FKFDA (fast KFDA). FKFDA consists of two parts. First, we select a portion of the training samples based on two criteria produced by approximating the kernel principal component analysis (AKPCA) in the kernel feature space. Then, referring to the selected training samples as nodes, we formulate FKFDA to improve the efficiency of nonlinear feature extraction. In FKFDA, the discriminant vectors are expressed as linear combinations of nodes in the kernel feature space, and extracting a feature from a sample only requires calculating as many kernel functions as there are nodes. Therefore, the proposed FKFDA has a much faster feature extraction procedure than naive kernel-based methods. Experimental results on face recognition and benchmark classification datasets suggest that the proposed FKFDA generates features well suited for classification.
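As a rough sketch of the node-based extraction idea (not the authors' exact formulation), the snippet below assumes an RBF kernel and that the node subset `nodes` and coefficient matrix `alpha` have already been obtained, e.g. via the AKPCA criteria; all names here are hypothetical.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel: k(x, y) = exp(-gamma * ||x - y||^2).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def node_features(X_test, nodes, alpha, gamma=1.0):
    # A discriminant vector in the span of the nodes,
    #   w = sum_j alpha_j * phi(node_j),
    # yields the feature f(x) = sum_j alpha_j * k(x, node_j):
    # one kernel evaluation per node, not per training sample.
    K = rbf_kernel(X_test, nodes, gamma)  # (n_test, n_nodes)
    return K @ alpha                      # (n_test, n_features)
```

The per-sample cost is thus proportional to the number of nodes rather than the training set size, which is the source of the claimed speed-up.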
3.
Hong Huang, Jianwei Li, Jiamin Liu. Future Generation Computer Systems, 2012, 28(1): 244-253
An improved manifold learning method, called enhanced semi-supervised local Fisher discriminant analysis (ESELF), is proposed for face recognition. Motivated by the fact that statistical uncorrelatedness and parameter-freeness are two desirable and promising characteristics for dimensionality reduction, a new difference-based optimization objective function incorporating unlabeled samples has been designed. The proposed method preserves the manifold structure of labeled and unlabeled samples, in addition to separating labeled samples in different classes from each other. The semi-supervised method has an analytic form of the globally optimal solution, which can be computed by eigendecomposition. Experiments on synthetic data and the AT&T, Yale and CMU PIE face databases are performed to test and evaluate the proposed algorithm. The experimental results and comparisons demonstrate the effectiveness of the proposed method.
4.
Guang Dai, Yun-Tao Qian. Pattern Recognition, 2007, 40(1): 229-243
Feature extraction is among the most important problems in face recognition systems. In this paper, we propose an enhanced kernel discriminant analysis (KDA) algorithm called kernel fractional-step discriminant analysis (KFDA) for nonlinear feature extraction and dimensionality reduction. Not only can this new algorithm, like other kernel methods, handle the nonlinearity required by many face recognition tasks, but it can also outperform traditional KDA algorithms in resisting the adverse effects of outlier classes. Moreover, to further strengthen the overall performance of KDA algorithms for face recognition, we propose two new kernel functions: the cosine fractional-power polynomial kernel and the non-normal Gaussian RBF kernel. We perform extensive comparative studies on the YaleB and FERET face databases. Experimental results show that our KFDA algorithm outperforms traditional kernel principal component analysis (KPCA) and KDA algorithms, and that further improvement is obtained when the two new kernel functions are used.
5.
This work proposes a method to decompose the kernel within-class eigenspace into two subspaces: a reliable subspace spanned mainly by the facial variation, and an unreliable subspace due to the limited number of training samples. A weighting function is proposed to circumvent undue scaling of eigenvectors corresponding to the unreliable small and zero eigenvalues. Eigenfeatures are then extracted by discriminant evaluation in the whole kernel space. These efforts facilitate a discriminative and stable low-dimensional feature representation of the face image. Experimental results on the FERET, ORL and GT databases show that our approach consistently outperforms other kernel-based face recognition methods.
6.
N. Louw. Computational Statistics & Data Analysis, 2006, 51(3): 2043-2055
Variable selection serves a dual purpose in statistical classification problems: it enables one to identify the input variables which separate the groups well, and a classification rule based on these variables frequently has a lower error rate than the rule based on all the input variables. Kernel Fisher discriminant analysis (KFDA) is a recently proposed powerful classification procedure, frequently applied in cases characterised by large numbers of input variables. The important problem of eliminating redundant input variables before implementing KFDA is addressed in this paper. A backward elimination approach is recommended, and two criteria which can be used for recursive elimination of input variables are proposed and investigated. Their performance is evaluated on several data sets and in a simulation study.
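As a loose illustration of the recursive elimination loop (the paper's two criteria are not reproduced here), the sketch below drops, at each step, the variable whose removal costs the least cross-validated accuracy; scikit-learn's SVC stands in for a KFDA classifier.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC  # stand-in kernel classifier, not KFDA itself

def backward_eliminate(X, y, n_keep):
    # Recursively remove the input variable that is least useful to a
    # kernel classifier until n_keep variables remain.
    active = list(range(X.shape[1]))
    while len(active) > n_keep:
        scores = []
        for j in active:
            keep = [v for v in active if v != j]
            scores.append(cross_val_score(SVC(kernel="rbf"), X[:, keep], y, cv=5).mean())
        active.pop(int(np.argmax(scores)))
    return active
```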
7.
Yong Xu. Computer Engineering and Applications, 2007, 43(3): 33-36, 67
An optimized scheme for kernel Fisher discriminant analysis is proposed, with separate algorithms for two-class classification and for classification with more than two classes; the scheme has a clear advantage in classification efficiency. In its implementation, "significant" training samples are first selected from the full training set, and the classification of a test sample depends only on the kernel functions between the test sample and these "significant" training samples. A recursive algorithm for selecting the "significant" training samples is also designed to reduce the computational complexity. Applying the algorithm to face image databases and "benchmark" datasets yields very good experimental results.
8.
Jian Yang, Zhong Jin, Jing-yu Yang. Pattern Recognition, 2004, 37(10): 2097-2100
In this paper, the method of kernel Fisher discriminant (KFD) is analyzed and its nature is revealed, i.e., KFD is equivalent to kernel principal component analysis (KPCA) plus Fisher linear discriminant analysis (LDA). Based on this result, a more transparent KFD algorithm is proposed. That is, KPCA is first performed and then LDA is used for a second feature extraction in the KPCA-transformed space. Finally, the effectiveness of the proposed algorithm is verified using the CENPARMI handwritten numeral database.
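The two-step construction maps directly onto standard library components; a minimal sketch with scikit-learn, where the kernel, gamma and component count are illustrative assumptions:

```python
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# KFD viewed as "KPCA, then LDA in the KPCA-transformed space".
kfd = make_pipeline(
    KernelPCA(n_components=100, kernel="rbf", gamma=0.1),
    LinearDiscriminantAnalysis(),
)
# Usage: kfd.fit(X_train, y_train); y_pred = kfd.predict(X_test)
```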
9.
We reformulate the Quadratic Programming Feature Selection (QPFS) method in a kernel space to obtain a vector which maximizes the quadratic objective function of QPFS. We demonstrate that the vector obtained by Kernel Quadratic Programming Feature Selection is equivalent to the Kernel Fisher vector and, therefore, a new interpretation of the Kernel Fisher discriminant analysis is given which provides some computational advantages for highly unbalanced datasets.
10.
Li Zhang, Wei-Da Zhou, Pei-Chann Chang. Neurocomputing, 2011, 74(4): 568-574
This paper develops a generalized nonlinear discriminant analysis (GNDA) method and addresses its small sample size (SSS) problems. GNDA is a nonlinear extension of linear discriminant analysis (LDA), and kernel Fisher discriminant analysis (KFDA) can be regarded as a special case of GNDA. In LDA, an undersampled, or small sample size, problem occurs when the sample size is smaller than the sample dimensionality, which results in a singular within-class scatter matrix. Due to the high-dimensional nonlinear mapping in GNDA, small sample size problems arise rather frequently. To tackle this issue, this research presents five different schemes for GNDA to solve the SSS problems. Experimental results on real-world data sets show that these schemes are very effective in tackling small sample size problems.
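The abstract does not spell out the five schemes; for context, one standard remedy for a singular within-class scatter matrix is ridge-style regularization, sketched below (a common fix, not necessarily one of the paper's schemes).

```python
import numpy as np

def regularized_lda_direction(Sw, Sb, eps=1e-3):
    # Replace a singular within-class scatter Sw with Sw + eps*I before
    # solving the generalized eigenproblem Sb w = lambda * Sw w.
    Sw_reg = Sw + eps * np.eye(Sw.shape[0])
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(Sw_reg, Sb))
    return eigvecs[:, np.argmax(eigvals.real)].real  # leading direction
```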
11.
Hong Huang, Jiamin Liu, Hailiang Feng, Tongdi He. Neurocomputing, 2011, 74(17): 3103-3113
In this paper, we propose an improved manifold learning method, called uncorrelated local Fisher discriminant analysis (ULFDA), for ear recognition. It is motivated by the fact that the features extracted by local Fisher discriminant analysis are statistically correlated, which may result in poor recognition performance. The aim of ULFDA is to seek a feature submanifold such that the within-manifold scatter is minimized and the between-manifold scatter is maximized simultaneously in the embedding space, using a new difference-based optimization objective function. Moreover, we impose an appropriate constraint to make the extracted features statistically uncorrelated. As a result, the proposed algorithm not only derives the optimal and lossless discriminative information, but also guarantees that all extracted features are statistically uncorrelated. Experiments on synthetic data and the Spain, USTB-2 and CEID ear databases demonstrate the effectiveness of the proposed method.
12.
This paper develops a new image feature extraction and recognition method coined two-dimensional linear discriminant analysis (2DLDA). 2DLDA provides a sequentially optimal image compression mechanism, compacting the discriminant information into the upper-left corner of the image. 2DLDA also suggests a feature selection strategy for selecting the most discriminative features from that corner. 2DLDA is tested and evaluated on the AT&T face database. The experimental results show that 2DLDA is more effective and computationally more efficient than current LDA algorithms for face feature extraction and recognition.
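As a toy illustration only (the abstract does not give the transform's exact form, so a bilateral projection Y = LᵀAR is assumed here), keeping the upper-left block after the 2D transform might look like:

```python
import numpy as np

def corner_features(A, L, R, p, q):
    # Transform the image so discriminant energy concentrates in the
    # upper-left corner, then keep only the leading p-by-q block.
    Y = L.T @ A @ R
    return Y[:p, :q].ravel()
```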
13.
Small sample size and high computational complexity are two major problems encountered when traditional kernel discriminant analysis methods are applied to high-dimensional pattern classification tasks such as face recognition. In this paper, we introduce a new kernel discriminant learning method, which is able to effectively address the two problems by using regularization and subspace decomposition techniques. Experiments performed on real face databases indicate that the proposed method outperforms, in terms of classification accuracy, existing kernel methods, such as kernel principal component analysis and kernel linear discriminant analysis, at a significantly reduced computational cost.
14.
A novel fuzzy nonlinear classifier, called kernel fuzzy discriminant analysis (KFDA), is proposed to deal with linearly non-separable problems. With kernel methods, KFDA can perform efficient classification in the kernel feature space. Through a nonlinear mapping, the input data are mapped implicitly into a high-dimensional kernel feature space, where nonlinear patterns appear linear. Unlike fuzzy discriminant analysis (FDA), which is based on the Euclidean distance, KFDA uses a kernel-induced distance. Theoretical analysis and experimental results show that the proposed classifier compares favorably with FDA.
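The kernel-induced distance has a closed form via the kernel trick, which is what lets such methods avoid computing the mapping explicitly:

```python
import numpy as np

def kernel_distance_sq(x, y, k):
    # ||phi(x) - phi(y)||^2 = k(x, x) - 2*k(x, y) + k(y, y),
    # evaluated without ever forming phi.
    return k(x, x) - 2.0 * k(x, y) + k(y, y)

# Example with an RBF kernel (for which k(x, x) = 1):
rbf = lambda u, v, g=1.0: float(np.exp(-g * np.sum((u - v) ** 2)))
```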
15.
Linear discriminant analysis is currently based on either the Fisher criterion or the pairwise-class weighted Fisher criterion, but the former cannot restrain outlier classes and the latter is computationally expensive. In view of this, an improved Fisher criterion for linear discriminant analysis is proposed. The Fisher criterion and the pairwise-class weighted Fisher criterion are reviewed, and the root cause of their problems is identified. Definitions of class distance and class outlier degree are introduced: the outlier degree of each class is determined from the class distances, each class is assigned a weight according to its outlier degree, and the overall class mean and between-class scatter matrix are then recomputed, yielding an improved Fisher criterion that restrains outlier classes and emphasizes regular classes. This improved Fisher criterion is simple to compute and effectively restrains outlier classes.
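To make the recomputation step concrete, a minimal sketch of a weighted overall mean and between-class scatter follows; the specific weight function derived from the class outlier degree is the paper's and is treated as given here.

```python
import numpy as np

def weighted_between_scatter(means, counts, weights):
    # means: (c, d) class means; counts: (c,) class sizes;
    # weights: (c,) per-class weights from the class outlier degree.
    w = weights * counts
    mu = (w[:, None] * means).sum(axis=0) / w.sum()  # weighted overall mean
    diff = means - mu
    return (w[:, None, None] * np.einsum('ci,cj->cij', diff, diff)).sum(axis=0)
```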
16.
17.
18.
Wen-Sheng Chu, Jenn-Jier James Lien. Pattern Recognition, 2011, 44(8): 1567-1580
This study presents a novel kernel discriminant transformation (KDT) algorithm for face recognition based on image sets. Each image set is represented by a kernel subspace, and we formulate a KDT matrix that maximizes the similarities of within-kernel subspaces while simultaneously minimizing those of between-kernel subspaces. Although the KDT matrix cannot be computed explicitly in the high-dimensional feature space, we propose an iterative kernel discriminant transformation algorithm that solves for the matrix implicitly. An alternative similarity measure, the canonical difference, is also introduced for matching pairs of kernel subspaces and is employed to simplify the formulation. The proposed face recognition system is demonstrated to outperform existing still-image-based and image set-based face recognition methods on the Yale Face database B, Labeled Faces in the Wild and a self-compiled database.
19.
Yong Xu, David Zhang, Zhong Jin, Jing-Yu Yang. Pattern Recognition, 2006, 39(6): 1026-1033
Nonlinear discriminant analysis may be transformed into the form of kernel-based discriminant analysis, so the corresponding discriminant direction can be solved from linear equations. From the viewpoint of the feature space, nonlinear discriminant analysis is still a linear method, and it is provable that in the feature space the method is equivalent to Fisher discriminant analysis. We consider that a linear combination of part of the training samples, called "significant nodes", can to some extent replace the total training samples in expressing the corresponding discriminant vector in the feature space. In this paper, an efficient algorithm is proposed to determine the "significant nodes" one by one. The principle of determining the "significant nodes" is simple and reasonable, and the resulting algorithm can be carried out at acceptable computational cost. Classification is then implemented using only the kernel functions between a test sample and the "significant nodes". The proposed method is called the fast kernel-based nonlinear method (FKNM). Notably, the number of "significant nodes" may be much smaller than the number of training samples. As a result, for two-class classification problems, the FKNM is much more efficient than the naive kernel-based nonlinear method (NKNM). The FKNM can also be applied to multi-class problems via two approaches: one-against-the-rest and one-against-one. Although one-against-one is often considered superior to one-against-the-rest in classification efficiency, for the FKNM one-against-the-rest appears more efficient than one-against-one. Experiments on benchmark and real datasets illustrate that, for two-class and multi-class classification, the FKNM is effective, feasible and highly efficient.
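A rough sketch of picking "significant nodes" one by one is given below; the matching-pursuit-style criterion used here is an illustrative assumption, not the paper's exact rule.

```python
import numpy as np

def select_significant_nodes(K, beta, n_nodes):
    # K: (n, n) training kernel matrix; beta: coefficients of the full
    # discriminant vector w = sum_i beta_i * phi(x_i).
    target = K @ beta              # projections of w onto all samples
    chosen, coef = [], None
    residual = target.copy()
    for _ in range(n_nodes):
        scores = np.abs(K.T @ residual)  # column best explaining residual
        scores[chosen] = -np.inf         # never re-pick a chosen node
        chosen.append(int(np.argmax(scores)))
        coef, *_ = np.linalg.lstsq(K[:, chosen], target, rcond=None)
        residual = target - K[:, chosen] @ coef
    return chosen, coef
```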
20.
Wenming Zheng. Pattern Recognition, 2005, 38(11): 2185-2187
In this paper, we give a theoretical analysis on kernel uncorrelated discriminant analysis (KUDA) and point out the drawbacks underlying the current KUDA algorithm which was recently introduced by Liang and Shi [Pattern Recognition 38(2) (2005) 307-310]. Then we propose an effective algorithm to overcome these drawbacks. The effectiveness of the proposed method was confirmed by experiments.