Similar Documents
 20 similar documents found (search time: 453 ms)
1.
A Nonlinear Blind Signal Separation Algorithm Based on Signal Sparsity and Kernel Functions   Cited: 1 (self-citations: 0, other citations: 1)
By incorporating kernel functions, this paper extends a sparsity-based linear blind separation method to nonlinearly mixed signals, yielding a blind separation algorithm for nonlinear mixtures. The algorithm first maps the mixed signals into a high-dimensional kernel feature space; it then constructs a set of orthogonal basis vectors in that space and projects the signals onto the parameter space spanned by this basis, thereby converting the nonlinear blind separation problem into a linear blind separation problem in the parameter space. Finally, the sparsity-based linear blind separation method is applied in the parameter space to separate the signals. The algorithm converges with high accuracy and good stability. Simulation results show that it is effective and achieves good separation performance.

2.
A Nonlinear Fault Diagnosis Method Based on Kernel Canonical Variate Analysis   Cited: 1 (self-citations: 1, other citations: 0)
邓晓刚, 田学民 《控制与决策》(Control and Decision), 2006, 21(10): 1109-1113
A nonlinear process fault diagnosis method based on kernel canonical variate analysis (KCVA) is proposed. The method uses a kernel function to map the nonlinear space into a high-dimensional linear space, avoiding explicit data processing in the high-dimensional space and the use of nonlinear mapping functions. Canonical variate analysis (CVA) is then performed in the linear space to identify a state-space model and extract state information from the data. Three monitoring statistics (Tr^2, Ts^2, Q) are used for fault detection, while contribution plots are used to isolate the faulty variables and determine the cause of the fault. Simulation results on a CSTR system show that KCVA detects faults more sensitively than principal component analysis (PCA) and CVA and monitors process changes more effectively.

3.
This paper proposes a kernel-function-based blind source separation algorithm for heterogeneous mixtures, called KFBSS. By introducing a nonlinear kernel function and a smoothing parameter h, the algorithm applies a nonlinear kernel mapping to the separated signals, optimizes the smoothing parameter h, and simultaneously updates the mixing/unmixing matrix; through iterative learning it performs blind source separation of the mixed signals. Simulation results show that, compared with the EASI algorithm, the whitening algorithm, and the natural gradient algorithm, the proposed method separates homogeneous or heterogeneous mixtures more effectively, converges faster, and adapts to nonstationary environments, demonstrating practical applicability.

4.
In this paper we present a new distance metric that incorporates the distance variation in a cluster to regularize the distance between a data point and the cluster centroid. It is then applied to the conventional fuzzy C-means (FCM) clustering in data space and the kernel fuzzy C-means (KFCM) clustering in a high-dimensional feature space. Experiments on two-dimensional artificial data sets, real data sets from public data libraries and color image segmentation have shown that the proposed FCM and KFCM with the new distance metric generally have better performance on non-spherically distributed data with uneven density for linear and nonlinear separation.

5.
李军, 郭琳 《控制与决策》(Control and Decision), 2013, 28(7): 972-977
Exploiting the nonlinear mapping capability of kernel learning, a kernel independent component analysis algorithm based on wavelet-kernel generalized variance, WKGV-KICA, is proposed. The wavelet kernel function is approximately orthogonal and well suited to local analysis of signals. Linked to mutual information, the kernel generalized variance is used as the contrast function to measure statistical independence, which yields desirable mathematical properties. The algorithm is applied to a wide range of blind source separation problems and compared with existing algorithms. Experimental results show that, under the same conditions, WKGV-KICA achieves higher separation accuracy and better overall performance.

6.
In practical applications of objectionable-text recognition, most texts overlap or are even intermixed with one another, and this nonlinear inseparability makes recognition difficult. An SVM can, through a nonlinear transformation, turn the problem in the original space into a linear problem in some high-dimensional space, and choosing a suitable kernel function is the key to the SVM. A single kernel cannot handle both isolated objectionable words and word combinations, which limits precision and also hurts recall. For this specific application, a new combined kernel function is proposed that, in accordance with Mercer's theorem, combines a linear kernel with a polynomial kernel; the combined kernel inherits the advantages of both and can recognize isolated objectionable words as well as word combinations. In simulation experiments the linear kernel, the homogeneous polynomial kernel, and the combined kernel were evaluated, and the results show that the combined kernel achieves satisfactory precision and recall.
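The combination this abstract describes can be illustrated in a few lines. The sketch below is our own, with equal weights, coef0 = 1, and a degree-2 polynomial (the paper does not state its exact weighting or parameters); it builds a weighted sum of a linear and a polynomial kernel and numerically checks that the resulting Gram matrix is positive semidefinite, which is what Mercer's theorem guarantees for a non-negative combination of valid kernels.

```python
import numpy as np

def linear_kernel(X, Y):
    # k_lin(x, y) = <x, y>
    return X @ Y.T

def polynomial_kernel(X, Y, degree=2, coef0=1.0):
    # k_poly(x, y) = (<x, y> + coef0) ** degree
    return (X @ Y.T + coef0) ** degree

def combined_kernel(X, Y, alpha=0.5, degree=2, coef0=1.0):
    # A non-negative weighted sum of Mercer kernels is again a Mercer kernel,
    # so this combination is a valid SVM kernel (alpha is illustrative).
    return alpha * linear_kernel(X, Y) + (1.0 - alpha) * polynomial_kernel(X, Y, degree, coef0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(6, 4))
    K = combined_kernel(X, X)
    # eigenvalues of the Gram matrix should be (numerically) non-negative
    print(np.linalg.eigvalsh(K).round(8))
```

Such a callable can also be passed directly to an SVM implementation that accepts custom kernels (e.g. scikit-learn's SVC(kernel=combined_kernel)).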

7.
To address the facts that multiway principal component analysis (MPCA) cannot capture the nonlinear relationships among the variables of complex nonlinear systems, and that the confidence limit of the T^2 statistic is determined under the assumption that the principal component scores are normally distributed, a nonlinear MPCA online fault monitoring method based on a self-organizing neural network and kernel density estimation is proposed. The method uses a self-organizing neural network to extract the nonlinear feature information among variables, and a kernel probability density function to estimate the confidence limits of the nonlinear principal components. The method was applied to online fault monitoring of a fed-batch β-mannanase fermentation process; the results show that nonlinear principal components capture more variable information than the same number of linear principal components, and that estimating confidence limits by kernel density estimation monitors faults more accurately than parametric estimation.

8.
吕冰, 王士同 《计算机应用》(Journal of Computer Applications), 2006, 26(11): 2781-2783
A kernel-based algorithm, K1PMDA, for finding the optimal solution of multivariate discriminant analysis is proposed and applied to face recognition. Linear face recognition suffers from two prominent problems: (1) when illumination, expression, and pose vary considerably, face image classification is complex and nonlinear; (2) the small-sample-size problem, i.e. when the number of training samples is smaller than the dimensionality of the sample feature space, the within-class scatter matrix becomes singular. The former is handled by using kernel techniques to extract nonlinear features of the face image samples; the latter by a perturbation algorithm that adds a perturbation parameter. Experiments on the ORL, Yale Group B, and UMIST face databases show that the algorithm is feasible and efficient.

9.
Independent component analysis based on nonparametric density estimation   Cited: 12 (self-citations: 0, other citations: 12)
In this paper, we introduce a novel independent component analysis (ICA) algorithm, which is truly blind to the particular underlying distribution of the mixed signals. Using a nonparametric kernel density estimation technique, the algorithm performs simultaneously the estimation of the unknown probability density functions of the source signals and the estimation of the unmixing matrix. Following the proposed approach, the blind signal separation framework can be posed as a nonlinear optimization problem, where a closed form expression of the cost function is available, and only the elements of the unmixing matrix appear as unknowns. We conducted a series of Monte Carlo simulations, involving linear mixtures of various source signals with different statistical characteristics and sample sizes. The new algorithm not only consistently outperformed all state-of-the-art ICA methods, but also demonstrated the following properties: 1) Only a flexible model, capable of learning the source statistics, can consistently achieve an accurate separation of all the mixed signals. 2) Adopting a suitably designed optimization framework, it is possible to derive a flexible ICA algorithm that matches the stability and convergence properties of conventional algorithms. 3) A nonparametric approach does not necessarily require large sample sizes in order to outperform methods with fixed or partially adaptive contrast functions.
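As a rough illustration of the density-estimation idea in this abstract, the sketch below (our own; the function names and all settings are assumptions) separates a two-source linear mixture: the data are whitened so the unmixing matrix reduces to a rotation, the marginal densities of the recovered components are estimated with Gaussian kernel density estimates, and a simple grid search over the rotation angle minimizes the resulting negative log-likelihood in place of the paper's general nonlinear optimization.

```python
import numpy as np
from scipy.stats import gaussian_kde

def whiten(X):
    # zero-mean, unit-covariance version of the 2 x N mixture matrix X
    Xc = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(Xc))
    return E @ np.diag(1.0 / np.sqrt(d)) @ E.T @ Xc

def kde_neg_loglik(Y):
    # average negative log-likelihood of each recovered component under its own KDE
    return -sum(np.log(gaussian_kde(y)(y)).mean() for y in Y)

def separate(X, n_angles=180):
    Z = whiten(X)
    best_theta, best_cost = 0.0, np.inf
    for theta in np.linspace(0.0, np.pi / 2, n_angles):
        W = np.array([[np.cos(theta), np.sin(theta)],
                      [-np.sin(theta), np.cos(theta)]])
        cost = kde_neg_loglik(W @ Z)  # log|det W| = 0 for a rotation, so it drops out
        if cost < best_cost:
            best_theta, best_cost = theta, cost
    W = np.array([[np.cos(best_theta), np.sin(best_theta)],
                  [-np.sin(best_theta), np.cos(best_theta)]])
    return W @ Z

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t = np.linspace(0, 8 * np.pi, 2000)
    S = np.vstack([np.sign(np.sin(3 * t)), rng.uniform(-1, 1, t.size)])  # two sources
    X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ S                           # linear mixture
    Y = separate(X)
    print(Y.shape)
```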

10.
Linear regression uses the least-squares algorithm to solve the linear regression equation. Linear regression classification (LRC) shows good classification performance on face image data. However, when the regression axes of class-specific samples intersect, LRC cannot reliably classify the samples distributed around the intersections. Moreover, LRC does not perform well under severe lighting variations. This paper proposes a new classification method, kernel linear regression classification (KLRC), based on LRC and the kernel trick. KLRC is a nonlinear extension of LRC and can offset these drawbacks. KLRC implicitly maps the data into a high-dimensional kernel space by using the nonlinear mapping determined by a kernel function. Through this mapping, KLRC is able to make the data more linearly separable and can perform well for face recognition with varying lighting. For comparison, experiments are conducted on three standard databases under several evaluation protocols. The proposed methodology not only outperforms LRC but also performs better than typical kernel methods such as kernel linear discriminant analysis and kernel principal component analysis.
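A small sketch of the class-wise reconstruction idea behind KLRC (our own illustration; the RBF kernel, its width, and the ridge term reg are assumptions rather than the paper's choices): for each class, the test point is reconstructed in the kernel-induced feature space from that class's training samples by kernel least squares, and the label with the smallest reconstruction error is returned.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # k(x, y) = exp(-gamma * ||x - y||^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def klrc_predict(X_train, y_train, X_test, gamma=0.5, reg=1e-6):
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        x = x[None, :]
        errs = []
        for c in classes:
            Xc = X_train[y_train == c]
            Kc = rbf_kernel(Xc, Xc, gamma) + reg * np.eye(len(Xc))  # small ridge for stability
            kx = rbf_kernel(Xc, x, gamma)                           # shape (n_c, 1)
            beta = np.linalg.solve(Kc, kx)                          # kernel least-squares coefficients
            # ||phi(x) - Phi_c beta||^2 = k(x, x) - 2 beta'kx + beta'Kc beta
            err = rbf_kernel(x, x, gamma)[0, 0] - 2 * (beta.T @ kx)[0, 0] + (beta.T @ Kc @ beta)[0, 0]
            errs.append(err)
        preds.append(classes[int(np.argmin(errs))])
    return np.array(preds)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X0, X1 = rng.normal(0, 1, (30, 2)), rng.normal(3, 1, (30, 2))
    X_tr = np.vstack([X0, X1]); y_tr = np.array([0] * 30 + [1] * 30)
    print(klrc_predict(X_tr, y_tr, np.array([[0.1, 0.2], [2.9, 3.1]])))  # expect [0 1]
```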

11.
Linear subspace analysis methods have been successfully applied to extract features for face recognition. But they are inadequate to represent the complex and nonlinear variations of real face images, such as illumination, facial expression and pose variations, because of their linear properties. In this paper, a nonlinear subspace analysis method, Kernel-based Nonlinear Discriminant Analysis (KNDA), is presented for face recognition, which combines the nonlinear kernel trick with the linear subspace analysis method, Fisher Linear Discriminant Analysis (FLDA). First, the kernel trick is used to project the input data into an implicit feature space, then FLDA is performed in this feature space. Thus nonlinear discriminant features of the input data are yielded. In addition, in order to reduce the computational complexity, a geometry-based feature vectors selection scheme is adopted. Another similar nonlinear subspace analysis is Kernel-based Principal Component Analysis (KPCA), which combines the kernel trick with linear Principal Component Analysis (PCA). Experiments are performed with the polynomial kernel, and KNDA is compared with KPCA and FLDA. Extensive experimental results show that KNDA can give a higher recognition rate than KPCA and FLDA.

12.
UDP has been successfully applied in many fields, finding a subspace that maximizes the ratio of the nonlocal scatter to the local scatter. But UDP cannot represent nonlinear structure well because it is linear in nature. Kernel methods, by contrast, can discover the nonlinear structure of the images. To improve the performance of UDP, kernel UDP (a nonlinear version of UDP) is proposed in this paper for face feature extraction and face recognition via kernel tricks. We formulate the kernel UDP theory and develop a two-stage method to extract kernel UDP features, namely weighted Kernel PCA plus UDP. The experimental results on the FERET and ORL databases show that the proposed kernel UDP is effective.

13.
Using the kernel trick idea and the kernels-as-features idea, we can construct two kinds of nonlinear feature spaces, where linear feature extraction algorithms can be employed to extract nonlinear features. In this correspondence, we study the relationship between the two kernel ideas applied to certain feature extraction algorithms such as linear discriminant analysis, principal component analysis, and canonical correlation analysis. We provide a rigorous theoretical analysis and show that they are equivalent up to different scalings on each feature. These results provide a better understanding of the kernel method.
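To make the two constructions concrete, here is a small sketch (our own illustration using an RBF kernel and scikit-learn; none of these choices come from the paper): the kernel-trick route runs KernelPCA directly, while the kernels-as-features route builds the empirical kernel map k(x, x_i) explicitly and applies ordinary linear PCA to those features. According to the paper, the two routes agree up to a per-feature scaling.

```python
import numpy as np
from sklearn.decomposition import PCA, KernelPCA
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))

# kernel-trick route: nonlinear PCA via the implicit feature map
Z_trick = KernelPCA(n_components=2, kernel="rbf", gamma=0.2).fit_transform(X)

# kernels-as-features route: represent each sample by its kernel values against
# all training samples, then apply a linear method (plain PCA) to those features
K = rbf_kernel(X, X, gamma=0.2)
Z_feat = PCA(n_components=2).fit_transform(K)

print(Z_trick.shape, Z_feat.shape)  # both (100, 2)
```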

14.
Improving the Performance of Kernel Principal Component Analysis with a Combined Kernel Function   Cited: 11 (self-citations: 2, other citations: 11)
To improve the recognition rate in image classification, a new kernel function, a combined kernel, is proposed on the basis of an analysis of the conditions a kernel function must satisfy and of the characteristics of different kernels in kernel-based learning algorithms, and is applied to kernel principal component analysis (KPCA) for image feature extraction. Because the new kernel can extract both global and local features, it improves the performance of KPCA in image feature extraction. To verify its effectiveness, KPCA with the new kernel was first used to extract features from handwritten digits, face images and other images, and a linear support vector machine (SVM) was then used for recognition. The experimental results show that, in terms of recognition rate, the features extracted with the combined kernel are of higher quality than those extracted with the original kernels.
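A hedged sketch of the pipeline this abstract describes, combined-kernel KPCA for feature extraction followed by a linear SVM, on scikit-learn's digits data; the equal 0.5/0.5 weighting, the polynomial degree, the RBF width, and the number of components are placeholders rather than the paper's settings, and the polynomial-plus-RBF pair is only one common way to combine a global kernel with a local one.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import KernelPCA
from sklearn.metrics.pairwise import polynomial_kernel, rbf_kernel
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

def combined_kernel(A, B):
    # illustrative global (polynomial) + local (RBF) combination
    return 0.5 * polynomial_kernel(A, B, degree=2) + 0.5 * rbf_kernel(A, B, gamma=0.01)

X, y = load_digits(return_X_y=True)
X = X / 16.0  # scale pixel values to [0, 1] so both kernels operate on a comparable range
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# KPCA on the precomputed combined Gram matrix
kpca = KernelPCA(n_components=40, kernel="precomputed")
Z_tr = kpca.fit_transform(combined_kernel(X_tr, X_tr))
Z_te = kpca.transform(combined_kernel(X_te, X_tr))

# linear SVM on the extracted nonlinear features
clf = LinearSVC(max_iter=10000).fit(Z_tr, y_tr)
print("test accuracy:", round(clf.score(Z_te, y_te), 3))
```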

15.
In the last decades, functional magnetic resonance imaging (fMRI) has been introduced into clinical practice. As a consequence of this advanced noninvasive medical imaging technique, the analysis and visualization of medical image time-series data poses a new challenge to both research and medical application. But often, the model data for a regression or generalized linear model-based analysis are not available. Hence exploratory data-driven techniques, i.e. blind source separation (BSS) methods, are very popular in fMRI data analysis since they are neither based on explicit signal models nor on a priori knowledge of the underlying physiological process. Independent component analysis (ICA) represents a main BSS method which searches for stochastically independent signals within the multivariate observations. In this paper, we introduce a new kernel-based nonlinear ICA method and compare it to standard BSS techniques. This kernel nonlinear ICA (kICA) overcomes the restriction to linear mixing processes usually encountered with ICA. Dimension reduction is an important preprocessing step for this nonlinear technique and is performed in a novel way: a genetic algorithm is designed which determines the optimal number of basis vectors for a reduced-order feature space representation, posed as an optimization problem over the condition number of the resulting basis. For the fMRI data, a comparative quantitative evaluation is performed between kICA with different kernels, nonnegative matrix factorization (NMF) and other BSS algorithms. The comparative results are evaluated by task-related activation maps, associated time courses and an ROC study. The comparison is performed on fMRI data from experiments with 10 subjects. The external stimulus was a visual pattern presentation in a block design. The most important results of this paper are that kICA and sparse NMF (sNMF) are able to identify signal components with high correlation to the fMRI stimulus, and that kICA with a Gaussian kernel is comparable to standard ICA algorithms while yielding more spatially focused results.

16.
A Combined Kernel Function for Support Vector Machines   Cited: 11 (self-citations: 0, other citations: 11)
张冰, 孔锐 《计算机应用》(Journal of Computer Applications), 2007, 27(1): 44-46
The kernel function is the core of the support vector machine: different kernels produce different classification results, and kernels are also one of the harder parts of SVM theory to understand. By introducing a kernel function, an SVM can easily realize a nonlinear algorithm. This paper first examines the nature of kernel functions and the relationship between a kernel and the space it maps into, then presents theorems and methods for constructing kernels, and explains that kernels fall into two broad classes, local kernels and global kernels, pointing out their differences and respective strengths. Finally, a new kernel, a combined kernel function, is proposed and applied to an SVM; face recognition experiments verify its effectiveness.

17.
In practical domain-specific text classification, large numbers of samples are intermixed and cannot be represented linearly. Introducing a kernel function into the SVM effectively handles nonlinear classification, and different kernels yield different SVMs with different recognition performance, so choosing a suitable kernel and optimizing its parameters is the key to the SVM. Based on the properties of single kernels, this paper linearly weights a polynomial kernel and a radial basis function kernel to construct a combined kernel with both good generalization ability and good learning ability. Simulation results show that, with properly chosen parameters, the combined-kernel SVM clearly outperforms the linear, polynomial, and radial basis function kernels in macro-averaged precision, macro-averaged recall, and macro-averaged overall classification rate, while balancing precision and recall.
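A minimal sketch of the linearly weighted polynomial-plus-RBF kernel described above, passed to scikit-learn's SVC as a callable; the weight w, the degree, and gamma are placeholders that would in practice be tuned (e.g. by cross-validated grid search) as the abstract suggests, and the synthetic data merely stand in for text-classification features.

```python
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import polynomial_kernel, rbf_kernel
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def mixed_kernel(A, B, w=0.6, degree=2, gamma=0.1):
    # linear weighting of a polynomial kernel and an RBF kernel; the abstract
    # credits the combination with both good learning and good generalization
    return w * polynomial_kernel(A, B, degree=degree) + (1 - w) * rbf_kernel(A, B, gamma=gamma)

X, y = make_classification(n_samples=300, n_features=20, random_state=0)
clf = SVC(kernel=mixed_kernel)  # SVC accepts a callable that returns the Gram matrix
print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(3))
```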

18.
Nonlinear classification has been a non-trivial task in machine learning for the past decades. In recent years, kernel machines have successfully generalized inner-product based linear classifiers to nonlinear ones by transforming data into some high- or infinite-dimensional feature space. However, due to their implicit space transformation and unobservable latent feature space, it is hard to have an intuitive understanding of their working mechanism. In this paper, we propose a comprehensible framework for nonlinear classifier design, called Manifold Mapping Machine (M3). M3 can generalize any linear classifier to a nonlinear one by transforming data into some low-dimensional feature space explicitly. To demonstrate the effectiveness of the M3 framework, we further present an algorithmic implementation of M3 named Supervised Spectral Space Classifier (S3C). Compared with kernel classifiers, S3C can achieve similar or even better data separation by mapping data into the low-dimensional spectral space, allowing both its mapped data and the new feature space to be examined directly. Moreover, with discriminative information integrated into the spectral space transformation, the classification performance of S3C is more robust than that of kernel classifiers. Experimental results show that S3C is superior to other state-of-the-art nonlinear classifiers on both synthetic and real-world data sets.

19.
Foley-Sammon optimal discriminant vectors using kernel approach   Cited: 4 (self-citations: 0, other citations: 4)
A new nonlinear feature extraction method called kernel Foley-Sammon optimal discriminant vectors (KFSODVs) is presented in this paper. This new method extends the well-known Foley-Sammon optimal discriminant vectors (FSODVs) from the linear domain to a nonlinear domain via the kernel trick that has been used in support vector machines (SVMs) and other commonly used kernel-based learning algorithms. The proposed method also provides an effective technique to solve the so-called small sample size (SSS) problem which exists in many classification problems such as face recognition. We give the derivation of the KFSODVs and conduct experiments on both simulated and real data sets to confirm that the KFSODV method is superior to previous commonly used kernel-based learning algorithms in terms of discrimination performance.

20.
Using the kernelization principle from statistical learning theory, many linear feature extraction algorithms can be generalized to nonlinear ones. A kernelized optimal transformation and cluster centers algorithm is proposed: the data are mapped into a high-dimensional kernel space by a nonlinear transformation, and the optimal transformation algorithm is applied there to extract nonlinear features of the original data, while the solution process relies on the kernel function and thus avoids the explicit form of the complicated nonlinear mapping. The new algorithm extracts robust nonlinear discriminant features and thereby handles pattern classification of data with complex distributions. Extensive numerical experiments show that the new algorithm is more effective than the conventional optimal transformation and cluster centers algorithm, and even outperforms classical kernel Fisher discriminant analysis.
