Similar Documents (20 results)
1.
A novel fuzzy nonlinear classifier, called kernel fuzzy discriminant analysis (KFDA), is proposed to deal with linearly non-separable problems. Using kernel methods, KFDA can perform efficient classification in kernel feature space. Through a nonlinear mapping, the input data are mapped implicitly into a high-dimensional kernel feature space where a nonlinear pattern appears linear. Unlike fuzzy discriminant analysis (FDA), which is based on Euclidean distance, KFDA uses a kernel-induced distance. Theoretical analysis and experimental results show that the proposed classifier compares favorably with FDA.
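For readers unfamiliar with the machinery behind such classifiers, a minimal two-class kernel Fisher discriminant can be sketched as follows. This is the standard (non-fuzzy) formulation, not the fuzzy variant the paper proposes; the RBF kernel choice and function names are illustrative only.

```python
import numpy as np

def rbf(A, B, gamma=1.0):
    # Gaussian RBF kernel matrix between row-sample arrays A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kfd_fit(X, y, gamma=1.0, reg=1e-3):
    """Two-class kernel Fisher discriminant: solve N alpha = m0 - m1."""
    K = rbf(X, X, gamma)
    m, N = [], np.zeros((len(X), len(X)))
    for c in (0, 1):
        Kc = K[:, y == c]                      # kernel columns for class c
        lc = Kc.shape[1]
        m.append(Kc.mean(axis=1))              # class mean in kernel space
        center = np.eye(lc) - np.ones((lc, lc)) / lc
        N += Kc @ center @ Kc.T                # within-class scatter
    N += reg * np.eye(len(X))                  # regularize (N is rank-deficient)
    return np.linalg.solve(N, m[0] - m[1])

def kfd_project(alpha, Xtrain, Xnew, gamma=1.0):
    # project new samples onto the discriminant direction
    return rbf(Xnew, Xtrain, gamma) @ alpha
```

Thresholding the one-dimensional projection (e.g. at the midpoint of the projected class means) yields a nonlinear classifier, as in the concentric-circles case that defeats linear FDA.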

2.
Variable selection serves a dual purpose in statistical classification problems: it enables one to identify the input variables which separate the groups well, and a classification rule based on these variables frequently has a lower error rate than the rule based on all the input variables. Kernel Fisher discriminant analysis (KFDA) is a recently proposed powerful classification procedure, frequently applied in cases characterised by large numbers of input variables. The important problem of eliminating redundant input variables before implementing KFDA is addressed in this paper. A backward elimination approach is recommended, and two criteria which can be used for recursive elimination of input variables are proposed and investigated. Their performance is evaluated on several data sets and in a simulation study.
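The backward-elimination loop itself is generic and can be sketched independently of the paper's two KFDA-specific criteria, which are not reproduced here; a simple nearest-centroid resubstitution error stands in as a placeholder criterion.

```python
import numpy as np

def nearest_centroid_error(X, y):
    # placeholder criterion: resubstitution error of a nearest-centroid rule
    classes = np.unique(y)
    cents = np.array([X[y == c].mean(axis=0) for c in classes])
    d = ((X[:, None, :] - cents[None]) ** 2).sum(-1)
    return (classes[d.argmin(1)] != y).mean()

def backward_eliminate(X, y, n_keep, criterion=nearest_centroid_error):
    """Recursively drop the variable whose removal degrades the criterion least."""
    keep = list(range(X.shape[1]))
    while len(keep) > n_keep:
        scores = [(criterion(X[:, [v for v in keep if v != j]], y), j)
                  for j in keep]
        _, drop = min(scores)          # the variable we miss least
        keep.remove(drop)
    return keep
```

In the paper's setting the criterion would instead measure class separation or error under KFDA after removing each candidate variable.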

3.
Uncorrelated discriminant vectors using a kernel method are proposed in this paper. In some sense, kernel uncorrelated discriminant vectors extend Jin's method; several related theorems are then stated. Most importantly, the proposed method can deal with nonlinear problems. Finally, experimental results on handwritten numeral characters show that the proposed method is effective and feasible.

4.
Feature extraction is among the most important problems in face recognition systems. In this paper, we propose an enhanced kernel discriminant analysis (KDA) algorithm called kernel fractional-step discriminant analysis (KFDA) for nonlinear feature extraction and dimensionality reduction. Not only can this new algorithm, like other kernel methods, deal with nonlinearity required for many face recognition tasks, it can also outperform traditional KDA algorithms in resisting the adverse effects due to outlier classes. Moreover, to further strengthen the overall performance of KDA algorithms for face recognition, we propose two new kernel functions: cosine fractional-power polynomial kernel and non-normal Gaussian RBF kernel. We perform extensive comparative studies based on the YaleB and FERET face databases. Experimental results show that our KFDA algorithm outperforms traditional kernel principal component analysis (KPCA) and KDA algorithms. Moreover, further improvement can be obtained when the two new kernel functions are used.
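The abstract does not give the exact kernel definitions, so the following is only one plausible reading: a fractional-power polynomial kernel (exponent between 0 and 1, sign-preserving) combined with cosine normalization. Both the formulas and names here are assumptions for illustration, not the paper's definitions.

```python
import numpy as np

def frac_poly_kernel(A, B, d=0.8):
    # fractional-power polynomial kernel: sign(<x,y>) * |<x,y>|^d, 0 < d < 1
    s = A @ B.T
    return np.sign(s) * np.abs(s) ** d

def cosine_normalize(K, Kaa, Kbb):
    # cosine kernel: k(x,y) / sqrt(k(x,x) k(y,y)); Kaa, Kbb are the diagonals
    return K / np.sqrt(np.outer(Kaa, Kbb))
```

Under this reading, cosine normalization bounds every kernel value in [-1, 1] and makes the kernel invariant to sample scaling, which is one motivation for cosine-type kernels in face recognition.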

5.
Dimensionality reduction via canonical variate analysis (CVA) is important for pattern recognition and has been extended variously to permit more flexibility, e.g. by “kernelizing” the formulation. This can lead to over-fitting, usually ameliorated by regularization. Here, a method for sparse, multinomial kernel discriminant analysis (sMKDA) is proposed, using a sparse basis to control complexity. It is based on the connection between CVA and least-squares, and uses forward selection via orthogonal least-squares to approximate a basis, generalizing a similar approach for binomial problems. Classification can be performed directly via minimum Mahalanobis distance in the canonical variates. sMKDA achieves state-of-the-art performance in terms of accuracy and sparseness on 11 benchmark datasets.

6.
A reformative kernel Fisher discriminant method is proposed, derived directly from naive kernel Fisher discriminant analysis, with superior classification efficiency. In the novel method, only a subset of the training patterns, called “significant nodes”, needs to be used when classifying a test pattern. A recursive algorithm for selecting the “significant nodes”, which is the key to the novel method, is presented in detail. Experiments on benchmarks show that the novel method is effective and much more efficient in classification.

7.
In this paper, we give a theoretical analysis on kernel uncorrelated discriminant analysis (KUDA) and point out the drawbacks underlying the current KUDA algorithm which was recently introduced by Liang and Shi [Pattern Recognition 38(2) (2005) 307-310]. Then we propose an effective algorithm to overcome these drawbacks. The effectiveness of the proposed method was confirmed by experiments.

8.
Kernel Fisher discriminant analysis (KFDA) extracts a nonlinear feature from a sample by calculating as many kernel functions as there are training samples. Thus, its computational efficiency is inversely proportional to the size of the training sample set. In this paper we propose a more efficient approach to nonlinear feature extraction, FKFDA (fast KFDA). FKFDA consists of two parts. First, we select a portion of the training samples based on two criteria produced by approximating kernel principal component analysis (AKPCA) in the kernel feature space. Then, referring to the selected training samples as nodes, we formulate FKFDA to improve the efficiency of nonlinear feature extraction. In FKFDA, the discriminant vectors are expressed as linear combinations of nodes in the kernel feature space, and extracting a feature from a sample requires calculating only as many kernel functions as there are nodes. Therefore, the proposed FKFDA has a much faster feature extraction procedure than naive kernel-based methods. Experimental results on face recognition and benchmark dataset classification suggest that the proposed FKFDA generates well-classified features.
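The computational point is easy to see in code: once the discriminant vectors are expressed over a node set, extracting a feature costs one kernel evaluation per node rather than per training sample. The AKPCA node-selection step is not reproduced here; the nodes and expansion coefficients are assumed given, and the RBF kernel is illustrative.

```python
import numpy as np

def rbf(A, B, gamma=1.0):
    # Gaussian RBF kernel matrix between row-sample arrays A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def extract_features(Xnew, nodes, coef, gamma=1.0):
    # each discriminant vector is sum_j coef[j, k] * phi(nodes[j]); projecting a
    # sample therefore needs only len(nodes) kernel evaluations per feature
    return rbf(Xnew, nodes, gamma) @ coef
```

With, say, 5 nodes standing in for 5,000 training samples, the per-sample extraction cost drops by three orders of magnitude while the feature dimension (number of coefficient columns) is unchanged.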

9.
In this paper, a kernelized version of clustering-based discriminant analysis, which we name KCDA, is proposed. The main idea is to first map the original data into a high-dimensional space, and then to perform clustering-based discriminant analysis in the feature space. The kernel fuzzy c-means algorithm is used to cluster each class. A group of tests on two UCI standard benchmarks shows that the proposed method is very promising.

10.
This work proposes a method to decompose the kernel within-class eigenspace into two subspaces: a reliable subspace spanned mainly by the facial variation, and an unreliable subspace due to the limited number of training samples. A weighting function is proposed to circumvent undue scaling of eigenvectors corresponding to the unreliable small and zero eigenvalues. Eigenfeatures are then extracted by discriminant evaluation in the whole kernel space. These efforts facilitate a discriminative and stable low-dimensional feature representation of the face image. Experimental results on the FERET, ORL and GT databases show that our approach consistently outperforms other kernel-based face recognition methods.

11.
12.
A reformative kernel algorithm for Fisher discriminant analysis is proposed, which can deal with two-class problems as well as problems with more than two classes. The novel algorithm supposes that, in feature space, a discriminant vector can be approximated by a linear combination of a subset of the training samples, called “significant nodes”. Once the “significant nodes” are found, the novel kernel Fisher discriminant algorithm is superior to the naive one in classification efficiency. In this paper, a recursive algorithm for selecting the “significant nodes” is developed in detail. Experiments show that the novel algorithm is effective and much more efficient in classification.

13.
A complete fuzzy discriminant analysis approach for face recognition
In this paper, studies are made on the essence of the fuzzy linear discriminant analysis (F-LDA) algorithm and the fuzzy support vector machine (FSVM) classifier. As a kernel-based learning machine, FSVM is represented with a fuzzy membership function while realizing the same classification results as conventional pair-wise classification. It outperforms other learning machines especially when unclassifiable regions remain in those conventional classifiers. However, a serious drawback of FSVM is that its computation requirement increases rapidly with the number of classes and the training sample size. To address this problem, an improved FSVM method that combines the advantages of FSVM and decision trees, called DT-FSVM, is first proposed. Furthermore, in the feature extraction process, a reformative F-LDA algorithm based on fuzzy k-nearest neighbors (FKNN) is implemented to obtain the distribution information of each original sample, represented as fuzzy membership grades, which is incorporated into the redefinition of the scatter matrices. In particular, considering that outlier samples in the patterns may adversely influence the classification result, we develop a novel F-LDA algorithm using a relaxed normalized condition in the definition of the fuzzy membership function. Thus, the classification limitation due to outlier samples is effectively alleviated. Finally, by making full use of fuzzy set theory, a complete F-LDA (CF-LDA) framework is developed by combining the reformative F-LDA (RF-LDA) feature extraction method and the DT-FSVM classifier. This hybrid fuzzy algorithm is applied to the face recognition problem; extensive experimental studies conducted on the ORL and NUST603 face image databases demonstrate the effectiveness of the proposed algorithm.

14.
A kernel clustering algorithm based on fuzzy one-class support vector machines
A fuzzy concept is introduced in place of the distance-rejection measure, and a fuzzy membership function with support-vector properties is defined to describe the degree to which a training point belongs to a cluster. The contribution weight of marginal points to the cluster center is penalized, which suppresses drift of the cluster center and guarantees the robustness of the algorithm while avoiding a complex parameter-search process. Simulation results show that, under the same initial conditions, the improved algorithm handles irregularly distributed data more efficiently than the original algorithm.

15.
Linear discriminant analysis (LDA) is a widely used technique for pattern classification. It seeks the linear projection of the data to a low-dimensional subspace where the data features can be modelled with maximal discriminative power. The main computation in LDA is the dot product between an LDA base vector and the data point, which involves costly element-wise floating-point multiplications. In this paper, we present a fast linear discriminant analysis method called binary LDA (B-LDA), which possesses the desirable property that the subspace projection operation can be computed very efficiently. We investigate the LDA-guided non-orthogonal binary subspace method to find the binary LDA bases, each of which is a linear combination of a small number of Haar-like box functions. We also show that B-LDA base vectors are nearly orthogonal to each other. As a result, in the non-orthogonal vector decomposition process, the computationally intensive pseudo-inverse projection operator can be approximated by the direct dot product without causing significant distance distortion. This direct dot product projection can be computed as a linear combination of the dot products with a small number of Haar-like box functions, which can be efficiently evaluated using the integral image. The proposed approach is applied to face recognition on the ORL and FERET datasets. Experiments show that the discriminative power of binary LDA is preserved and the projection computation is significantly reduced.
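The integral-image trick that makes the Haar-like basis cheap is standard and can be sketched in a few lines: after one pass over the image, the sum over any axis-aligned box costs four table lookups, independent of the box size.

```python
import numpy as np

def integral_image(img):
    # S[i, j] = sum of img[:i, :j]; an extra zero row/column simplifies indexing
    S = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
    S[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    return S

def box_sum(S, top, left, h, w):
    # sum over img[top:top+h, left:left+w] in O(1) via four lookups
    return (S[top + h, left + w] - S[top, left + w]
            - S[top + h, left] + S[top, left])
```

A dot product with a B-LDA base then reduces to a small weighted sum of such box sums, which is why the projection cost drops so sharply compared with a dense floating-point dot product.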

16.
Fault detection and diagnosis (FDD) in chemical process systems is an important tool for effective process monitoring to ensure the safety of a process. Multi-scale classification offers various advantages for monitoring chemical processes, which are generally driven by events in different time and frequency domains. However, there are issues when dealing with highly interrelated, complex, and noisy databases with large dimensionality. Therefore, a new method for the FDD framework is proposed based on wavelet analysis, kernel Fisher discriminant analysis (KFDA), and support vector machine (SVM) classifiers. The main objective of this work was to combine the advantages of these tools to enhance the performance of diagnosis on a chemical process system. Initially, a discrete wavelet transform (DWT) was applied to extract the dynamics of the process at different scales. The wavelet coefficients obtained during the analysis were reconstructed using the inverse discrete wavelet transform (IDWT), then fed into the KFDA to produce discriminant vectors. Finally, the discriminant vectors were used as inputs for the SVM classification task. The performance of the proposed multi-scale KFDA-SVM method for fault classification and diagnosis was analysed and compared using a simulated Tennessee Eastman process as a benchmark. The results showed the improvement of the proposed multi-scale KFDA-SVM framework, with an average classification accuracy of 96.79% on the Tennessee Eastman faults, over the multi-scale KFDA-GMM (84.94%) and the established independent component analysis-SVM method (95.78%).
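The abstract does not specify the wavelet family or decomposition depth, so as a minimal stand-in for the DWT/IDWT step, a one-level Haar transform illustrates the extract-then-reconstruct idea: split a signal into approximation and detail coefficients, process them per scale, and reconstruct.

```python
import numpy as np

def haar_dwt(x):
    # one-level Haar DWT: approximation (a) and detail (d) coefficients
    x = np.asarray(x, float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def haar_idwt(a, d):
    # inverse transform: perfect reconstruction for even-length signals
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x
```

In the paper's pipeline the per-scale reconstructions (rather than the raw signal) are what get passed to KFDA, so each scale contributes its own discriminant information before the SVM stage.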

17.
In this paper, we present a new approach for fingerprint classification based on the discrete Fourier transform (DFT) and nonlinear discriminant analysis. Utilizing the DFT and directional filters, a reliable and efficient directional image is constructed from each fingerprint image, and then nonlinear discriminant analysis is applied to the constructed directional images, reducing the dimension dramatically and extracting the discriminant features. The proposed method explores the capability of DFT and directional filtering in dealing with low-quality images and the effectiveness of the nonlinear feature extraction method in fingerprint classification. Experimental results demonstrate competitive performance compared with other published results.

18.
In this paper, we propose a new discriminant analysis using composite features for pattern classification. A composite feature consists of a number of primitive features, each of which corresponds to an input variable. The covariance of composite features is obtained from the inner product of composite features and can be considered as a generalized form of the covariance of primitive features. It contains information on statistical dependency among multiple primitive features. A discriminant analysis (C-LDA) using the covariance of composite features is a generalization of the linear discriminant analysis (LDA). Unlike LDA, the number of extracted features can be larger than the number of classes in C-LDA, which is a desirable property especially for binary classification problems. Experimental results on several data sets indicate that C-LDA provides better classification results than other methods based on primitive features.

19.
A nonlinear classification method, Fisher discriminant analysis based on nonlinear mapping (NM-FDA), is proposed. First, basis vectors are extracted; then, using the Nyström method with the basis vectors as training samples, the nonlinear mapping of unknown form is approximated by a nonlinear mapping of known form, which transforms variables from the nonlinear input space into a linear feature subspace. Finally, linear Fisher discriminant analysis is performed on the mapped data. Experiments on seven standard data sets show that NM-FDA has strong classification ability.
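The Nyström step described here, turning the implicit kernel map into an explicit finite-dimensional one built from a set of basis points, can be sketched as follows. The basis-vector selection procedure is not reproduced; the landmark points, the RBF kernel, and the function names are illustrative assumptions.

```python
import numpy as np

def rbf(A, B, gamma=1.0):
    # Gaussian RBF kernel matrix between row-sample arrays A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_features(X, landmarks, gamma=1.0, eps=1e-10):
    """Explicit map z(x) with z(x).z(y) ~ k(x,y), built from m landmark points."""
    W = rbf(landmarks, landmarks, gamma)        # m x m landmark Gram matrix
    C = rbf(X, landmarks, gamma)                # n x m cross-kernel
    vals, vecs = np.linalg.eigh(W)
    vals = np.maximum(vals, eps)                # guard near-singular W
    W_inv_sqrt = vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T
    return C @ W_inv_sqrt                       # n x m explicit features
```

Ordinary linear Fisher discriminant analysis can then run on the returned features, which is exactly the structure the abstract describes: an approximate explicit map followed by linear FDA.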

20.
An optimized scheme for kernel Fisher discriminant analysis is proposed, with separate algorithms for two-class problems and for problems with more than two classes; the scheme offers a clear advantage in classification efficiency. In its implementation, “significant” training samples are first selected from the full training set, so that the classification of a test sample depends only on the kernel functions between the test sample and the “significant” training samples. A recursive algorithm for selecting the “significant” training samples is also designed to reduce the computational complexity. Applying the algorithm to a face image database and benchmark data sets yields good experimental results.
