20 similar documents were retrieved (search time: 0 ms)
1.
A new method for representing the statistical variation of FET equivalent circuit parameters (ECPs) is presented. This method utilizes a statistical technique known as principal components and provides an efficient means of statistically representing the means, standard deviations, and correlations of the FET ECPs. The technique can easily be implemented in commercial CAD simulators, resulting in FET variation simulations that are more accurate than existing methods. Appropriate statistical tests for determining equivalence between simulated and measured FET parameter distributions are also discussed. Both the modeling methodology and the statistical testing were demonstrated using both scattering and noise parameters for 300 μm low-noise GaAs FETs.
2.
Fuzzy principal component analysis and its Kernel-based model. Cited by 1 (self-citations: 0, other citations: 1)
Wu Xiaohong, Zhou Jianjiang. Journal of Electronics (China), 2007, 24(6): 772-775
Principal Component Analysis (PCA) is one of the most important feature extraction methods, and Kernel Principal Component Analysis (KPCA) is a nonlinear extension of PCA based on kernel methods. In the real world, an input sample may not be fully assigned to one class and may partially belong to other classes. Based on the theory of fuzzy sets, this paper presents Fuzzy Principal Component Analysis (FPCA) and its nonlinear extension, Kernel-based Fuzzy Principal Component Analysis (KFPCA). The experimental results indicate that the proposed algorithms perform well.
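The abstract does not spell out the fuzzy weighting scheme; the sketch below illustrates one common way to fold fuzzy membership degrees into PCA, namely a membership-weighted mean and covariance. The fuzzifier exponent `m` and the weighting itself are assumptions for illustration, not the paper's exact FPCA formulation.

```python
import numpy as np

def fuzzy_pca(X, memberships, n_components=2, m=2.0):
    """Membership-weighted PCA (illustrative sketch, not the paper's exact FPCA).

    X           : (n_samples, n_features) data matrix
    memberships : (n_samples,) fuzzy membership degrees in [0, 1]
    m           : fuzzifier exponent (assumed, as in fuzzy c-means)
    """
    w = memberships ** m
    mu = (w[:, None] * X).sum(axis=0) / w.sum()        # weighted mean
    Xc = X - mu
    cov = (w[:, None] * Xc).T @ Xc / w.sum()            # weighted covariance
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1][:n_components]    # keep the leading components
    return eigvecs[:, order], mu

# Toy usage: project data with soft class memberships onto two fuzzy components
X = np.random.rand(50, 10)
u = np.random.rand(50)
components, mean = fuzzy_pca(X, u, n_components=2)
Z = (X - mean) @ components
```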
3.
Bias-dependent linear scalable millimeter-wave FET model. Cited by 1 (self-citations: 0, other citations: 1)
This paper describes a measurement-based, bias-dependent linear equivalent-circuit field-effect-transistor/high-electron-mobility-transistor model that is accurate to at least 100 GHz and scalable up to 12 parallel gate fingers and from 100 to 1000 μm total gate width. A new and accurate technique for extracting the Z-shell parameters has been developed, and the scaling rules for all the parasitic elements have been determined. The intrinsic equivalent-circuit element values are determined at each bias point in Vgs-Vds space and interpolated by splines between points.
4.
The analysis of dynamic fluorescence diffuse optical tomography (D-FDOT) is important both for drug delivery research and for medical diagnosis and treatment. The low spatial resolution and complex kinetics, however, limit the ability of FDOT to resolve drug distributions within small animals. Principal component analysis (PCA) provides the capability of detecting and visualizing functional structures with different kinetic patterns from D-FDOT images. A particular challenge in using PCA is reducing the level of noise in D-FDOT images. This is particularly relevant in drug studies, where the time-varying fluorophore concentration (drug concentration) causes the reconstructed images to contain more noise and therefore affects the performance of PCA. In this paper, a new linear correction method is proposed for modeling these time-varying fluorescence measurements before performing PCA. To evaluate the performance of the new method in resolving drug biodistribution, the metabolic processes of indocyanine green within a mouse are dynamically simulated and used as the input data for PCA. Simulation results suggest that the principal component (PC) images generated using the new method improve the SNR and discrimination capability, compared to the PC images generated from the uncorrected D-FDOT images.
5.
Palmprint recognition based on kernel principal component analysis and Fisher linear discriminant. Cited by 3 (self-citations: 0, other citations: 3)
A palmprint recognition method combining kernel principal component analysis (KPCA) and Fisher linear discriminant (FLD) is proposed. KPCA is applied to each palmprint image for dimensionality reduction, and the two-dimensional image matrix is then converted into a one-dimensional image vector. The data matrix formed by all image vectors from the PolyU palmprint database is used as the input to FLD for feature extraction, and palmprints are matched by the cosine distance between feature vectors. Experimental results show that, compared with conventional PCA+FLD, the proposed method achieves a lower equal error rate (EER) for different numbers of features, with shorter feature extraction time and faster running speed. Among the three kernel functions tested, the RBF kernel gives the best recognition performance, with a minimum equal error rate of 0.
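As a rough illustration of the pipeline described above (KPCA for dimensionality reduction, FLD for feature extraction, cosine-distance matching), here is a minimal sketch using scikit-learn. The data, the parameter values (number of components, RBF gamma), and the use of whole-image vectors rather than per-image KPCA are placeholder assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics.pairwise import cosine_similarity

# Placeholder data: flattened palmprint images and identity labels
rng = np.random.default_rng(0)
X_train = rng.random((100, 64 * 64))
y_train = np.repeat(np.arange(10), 10)
X_test = rng.random((10, 64 * 64))

# KPCA with an RBF kernel for dimensionality reduction (gamma is illustrative)
kpca = KernelPCA(n_components=50, kernel="rbf", gamma=1e-3)
F_train = kpca.fit_transform(X_train)
F_test = kpca.transform(X_test)

# Fisher linear discriminant (FLD/LDA) on the KPCA features
fld = LinearDiscriminantAnalysis(n_components=9)
G_train = fld.fit_transform(F_train, y_train)
G_test = fld.transform(F_test)

# Match each test palmprint to the training sample with the highest cosine similarity
scores = cosine_similarity(G_test, G_train)
predicted_identity = y_train[scores.argmax(axis=1)]
```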
6.
Xifa Duan, Zheng Tian, Mingtao Ding, Wei Zhao. AEUE-International Journal of Electronics and Communications, 2013, 67(1): 20-28
For pre- and post-earthquake remote-sensing images, registration is a challenging task due to the possible deformations of the objects to be registered. To overcome this problem, a registration method based on robust weighted kernel principal component analysis is proposed to precisely register the variform objects. First, a robust weighted kernel principal component analysis (RWKPCA) method is developed to capture the common robust kernel principal components (RKPCs) of the variform objects. Second, a registration approach is derived from the projection onto the RKPCs. Finally, two experiments are conducted on SAR image registration for the Wenchuan earthquake of May 12, 2008; the results show that the method is very effective in capturing structural patterns and generalizes well for registration.
7.
Although existing state-of-the-art denoising algorithms, such as BM3D, LPG-PCA and DDF, obtain remarkable results, these methods are not good at preserving details at high noise levels and sometimes even introduce non-existent artifacts. To improve the performance of these denoising methods at high noise levels, a generic denoising framework based on guided principal component analysis (GPCA) is proposed in this paper. The proposed framework can be split into two stages. First, a statistical test is used to generate an initial denoised image through back projection, where the statistical test detects the significantly relevant information between the denoised image and the corresponding residual image. Second, similar image patches are collected to form patch groups, and local bases are learned from each patch group by principal component analysis. Experimental results on natural images contaminated with Gaussian and non-Gaussian noise verify the effectiveness of the proposed framework.
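The second stage, learning a local PCA basis per group of similar patches and suppressing weak components, lends itself to a short sketch. The energy threshold below is a common heuristic and an assumption here, not the thresholding rule used in the paper.

```python
import numpy as np

def denoise_patch_group(patches, noise_var):
    """PCA shrinkage of one group of similar patches (illustrative sketch).

    patches   : (n_patches, patch_size) stack of similar, vectorized patches
    noise_var : estimated noise variance of the image
    """
    mean = patches.mean(axis=0)
    centered = patches - mean
    # Local basis learned from the patch group via SVD (equivalent to PCA)
    U, s, Vt = np.linalg.svd(centered, full_matrices=False)
    energy = (s ** 2) / patches.shape[0]
    keep = energy > noise_var          # heuristic: drop components at the noise floor
    return (U[:, keep] * s[keep]) @ Vt[keep] + mean

# Toy usage on a synthetic group of noisy, nearly identical patches
rng = np.random.default_rng(0)
clean = np.tile(rng.random(64), (30, 1))
noisy = clean + 0.1 * rng.standard_normal((30, 64))
denoised = denoise_patch_group(noisy, noise_var=0.1 ** 2)
```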
8.
Incremental kernel principal component analysis. Cited by 3 (self-citations: 0, other citations: 3)
Kernel principal component analysis (KPCA) has been applied in numerous image-related machine learning applications, and it has exhibited superior performance over previous approaches such as PCA. However, the standard implementation of KPCA scales badly with the problem size, making computation for large problems infeasible. Also, the "batch" nature of the standard KPCA computation method does not allow for applications that require online processing. This has somewhat restricted the domains in which KPCA can potentially be applied. This paper introduces an incremental computation algorithm for KPCA to address these two problems. The basis of the proposed solution lies in computing incremental linear PCA in the kernel-induced feature space and constructing reduced-set expansions to maintain constant update speed and memory usage. We also provide experimental results that demonstrate the effectiveness of the approach.
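The paper's reduced-set expansions in the kernel-induced feature space are not reproduced here; the sketch below only shows the incremental linear PCA building block on which the method is described as resting, using scikit-learn's IncrementalPCA as a stand-in.

```python
import numpy as np
from sklearn.decomposition import IncrementalPCA

# Placeholder data stream arriving in mini-batches
rng = np.random.default_rng(1)
stream = [rng.standard_normal((50, 30)) for _ in range(10)]

ipca = IncrementalPCA(n_components=5)
for batch in stream:
    ipca.partial_fit(batch)            # update the subspace with each new batch

# Project the latest batch onto the incrementally learned components
projected = ipca.transform(stream[-1])
```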
9.
For the DataCastle student grade-ranking prediction task, which predicts a student's ranking in the next semester from past in-school records, this paper combines existing principal component analysis methods with a multivariate linear regression model and proposes a PCA regression prediction method based on a data truncation transform. Compared with other methods, the truncation-based PCA regression method predicts students' next-semester grades better, reaching a prediction accuracy of 78.57%, outperforming the compared models and ranking in the top 10% on the final leaderboard; it can therefore serve as a useful tool for other predictive analysis problems.
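The abstract does not define the truncation transform; the sketch below assumes a simple percentile clipping before PCA and linear regression, purely to illustrate the shape of the pipeline. The data, thresholds, and number of components are placeholders.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Placeholder student records: 150 students, 20 features, next-semester rank as target
rng = np.random.default_rng(2)
X = rng.normal(size=(150, 20))
rank = np.argsort(X[:, 0]).argsort().astype(float)

# "Truncation" step: clip extreme feature values (percentile thresholds are illustrative)
lo, hi = np.percentile(X, [5, 95], axis=0)
X_trunc = np.clip(X, lo, hi)

# PCA on the truncated data, then a multivariate linear regression on the components
pcs = PCA(n_components=5).fit_transform(X_trunc)
model = LinearRegression().fit(pcs, rank)
predicted_rank = model.predict(pcs)
```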
10.
A support vector machine regression prediction model based on principal component analysis. Cited by 5 (self-citations: 0, other citations: 5)
First, principal component analysis is used to reduce the dimensionality of the sample data and a multivariate regression prediction model is built on the principal components; then, the support vector machine method is used to determine the coefficients of the regression model; finally, an example shows that the model achieves high prediction accuracy.
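A minimal scikit-learn sketch of the described pipeline, PCA for dimensionality reduction followed by support-vector regression; the data, the number of components, and the SVR hyperparameters are placeholder assumptions.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVR

# Placeholder sample data: 80 observations, 12 correlated predictors, one target
rng = np.random.default_rng(3)
X = rng.normal(size=(80, 12))
y = X[:, :3].sum(axis=1) + 0.1 * rng.normal(size=80)

# Reduce dimensionality with PCA, then fit the regression with a support vector machine
model = make_pipeline(StandardScaler(), PCA(n_components=3), SVR(kernel="rbf", C=10.0))
model.fit(X, y)
y_pred = model.predict(X)
```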
11.
For objects undergoing complex deformation in pre- and post-earthquake synthetic aperture radar (SAR) images, a registration method based on robust weighted kernel principal component analysis (KPCA) is proposed. First, a robust weighted KPCA (RWKPCA) method is developed that not only obtains the common robust kernel principal components (RKPCs) of the deformed objects before and after the earthquake but also serves as an outlier-detection criterion. Second, a similarity measure between the pre- and post-earthquake features of the deformed objects is defined from their projections onto the common RKPCs. Finally, the deformed objects are precisely registered using this feature similarity measure. SAR images acquired before and after the Wenchuan earthquake of May 12, 2008 are registered and the method is compared with existing approaches; the results show that it effectively obtains the common RKPCs of the deformed objects and achieves good registration results.
12.
Texture classification with kernel principal component analysis. Cited by 1 (self-citations: 0, other citations: 1)
Kernel principal component analysis (KPCA) is presented as a mechanism for extracting textural information. Using the polynomial kernel, higher-order correlations of input pixels can easily be used as features for classification. As a result, supervised texture classification can be performed using a neural network.
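A short sketch of that idea: polynomial-kernel KPCA features feeding a small neural network classifier. The patch size, kernel degree, number of components, and network shape are assumptions for illustration.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.neural_network import MLPClassifier

# Placeholder texture data: flattened 16x16 pixel windows with class labels
rng = np.random.default_rng(4)
patches = rng.random((200, 256))
labels = rng.integers(0, 4, size=200)

# Polynomial kernel exposes higher-order pixel correlations as features
kpca = KernelPCA(n_components=32, kernel="poly", degree=3)
features = kpca.fit_transform(patches)

# Supervised texture classification with a small neural network
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500)
clf.fit(features, labels)
predicted = clf.predict(features)
```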
13.
Principal component analysis (PCA) can be used to encode video sequences at extremely low bit rates, e.g. 34.6 dB (PSNR) at 4.2 kbit/s. In this coding, the same eigenvectors are used for encoding and decoding. A coding scheme is introduced in which eigenvectors for only part of the video frame are used for encoding, while the eigenvectors for the entire frame are used for decoding. This is called asymmetric PCA coding. It reduces the complexity of encoding by approximately five times and at the same time increases the reconstruction quality for the facial part of the video by 0.4 dB (PSNR).
14.
Two stage principal component analysis of color. Cited by 1 (self-citations: 0, other citations: 1)
15.
New low-loss dielectric materials for dielectric resonator oscillators (DROs) have been developed. The new barium titanate material has a positive and constant stability factor (3 to 10 ppm/K) and quality factors of about 2500. The achieved frequency drift of GaAs FET DROs using dielectric resonators made from the new material over −30 to 80°C is of the order of 150 to 200 kHz at 10.9 GHz.
16.
A new unsupervised algorithm is proposed that performs competitive principal component analysis (PCA) of a time series. A set of expert PCA networks compete, through the mixture of experts (MOE) formalism, on the basis of their ability to reconstruct the original signal. The resulting network finds an optimal projection of the input onto a reduced-dimensional space as a function of the input and, hence, of time. As a byproduct, the time series is both segmented and identified according to stationary regions. Examples showing the performance of the algorithm are included.
17.
In this paper, a novel hardware attack based on principal component analysis (PCA) is proposed to efficiently break a leakage power analysis (LPA)-resistant cryptographic circuit (CC). Although the added false keys that mask the secret key of the LPA-resistant CC are secure and effective against regular LPA attacks, they may be precisely modeled by eigenvalues and eigenvectors under PCA. After performing the proposed PCA on the LPA-resistant CC, all the added false keys can be removed to expose the corresponding secret key. As shown in the results, only 2000 plaintexts are sufficient to crack an LPA-resistant CC using the proposed PCA-assisted LPA attack.
18.
IEEE Transactions on Electron Devices, 1981, 28(5): 511-517
It is the purpose of this paper to develop a theory upon which the design of low-noise FET amplifiers can be based. This is not a fundamental model of the noise mechanisms in GaAs FET's, but rather an endeavor to relate physically measurable device capacitances and resistances to the device noise figure and optimum noise source impedance. It will be shown that the noise performance of an FET can be adequately described by two uncorrelated noise sources. One, at the input of the FET, is the thermal noise generated in the various resistances in the gate-source loop. This noise source is frequency dependent, and it can be calculated from the equivalent circuit of the FET. The second noise source, at the output of the FET, is frequency independent and not recognizably related to any measured parameters. This output noise is a function of drain current and voltage. The decomposition of the FET noise into two uncorrelated sources simplifies the design of broad-band low-noise amplifiers. Once the equivalent circuit of a device and its noise figure at one frequency are known, the optimum noise source impedance and noise figure over a broad range of frequencies may be calculated. For the device designer, this model may also be helpful in balancing input-output noise tradeoffs.
19.
This paper proposes a new approach for watermark extraction using a support vector machine (SVM) with principal component analysis (PCA)-based feature reduction. In this method, the original cover image is decomposed up to three levels using the lifting wavelet transform (LWT), and the lowpass subband is selected for data hiding. The lowpass subband is divided into small blocks, and a binary watermark is embedded into the original cover image by quantizing the two maximum coefficients of each block. In order to extract watermark bits with maximum correlation, an SVM-based binary classification approach is incorporated. The training and testing patterns are constructed by employing a reduced set of features along with the block coefficients. First, different features are obtained by evaluating the statistical parameters of each block's coefficients, and then PCA is utilized to reduce this feature set. As far as security is concerned, randomization of coefficients, blocks, and watermark bits enhances the security of the system. Furthermore, the energy-compaction property of the LWT increases the robustness in comparison to the conventional wavelet transform. A comparison of the proposed method with some recent techniques shows remarkable improvement in terms of robustness and security of the watermark.