Handbook of Chemometrics and Qualimetrics: Part A
Authors:Eric R. Ziegel
Affiliation:Amoco Corporation
Abstract:Both principal components analysis (PCA) and orthogonal regression deal with finding a p-dimensional linear manifold minimizing a scale of the orthogonal distances of the m-dimensional data points to the manifold. The main conceptual difference is that in PCA p is estimated from the data, to attain a small proportion of unexplained variability, whereas in orthogonal regression p equals m − 1. The two main approaches to robust PCA are using the eigenvectors of a robust covariance matrix and searching for the projections that maximize or minimize a robust (univariate) dispersion measure. This article is more akin to the second approach, but rather than finding the components one by one, we directly undertake the problem of finding, for a given p, a p-dimensional linear manifold minimizing a robust scale of the orthogonal distances of the data points to the manifold. The scale may be either a smooth M-scale or a “trimmed” scale. An iterative algorithm is developed that is shown to converge to a local minimum. A strategy based on random search is used to approximate a global minimum. The procedure is shown to be faster than other high-breakdown-point competitors, especially for large m. The case p = m − 1 yields orthogonal regression. For PCA, a computationally efficient method to choose p is given. Comparisons based on both simulated and real data show that the proposed procedure is more robust than its competitors.
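The following is a minimal sketch, not the article's algorithm: it illustrates the general idea of fitting a p-dimensional linear manifold by minimizing a hard-trimmed scale of orthogonal distances, using a simple alternating fit-and-trim loop. The function name, the trimming fraction alpha, and the single random start are assumptions for demonstration; the article uses a smooth M-scale option, a dedicated iterative algorithm, and a random-search strategy for the global minimum.

```python
# Illustrative sketch only (assumed variant): robust PCA by minimizing a
# trimmed scale of orthogonal distances to a p-dimensional linear manifold.
import numpy as np

def trimmed_orthogonal_pca(X, p, alpha=0.5, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    n, m = X.shape
    keep = int(np.ceil((1 - alpha) * n))           # points kept by the trimmed scale
    idx = rng.choice(n, size=keep, replace=False)  # one random start (stand-in for random search)
    for _ in range(n_iter):
        center = X[idx].mean(axis=0)
        # Principal subspace of the currently kept points
        _, _, Vt = np.linalg.svd(X[idx] - center, full_matrices=False)
        B = Vt[:p]                                 # basis of the p-dimensional manifold
        resid = (X - center) - (X - center) @ B.T @ B
        d2 = np.einsum('ij,ij->i', resid, resid)   # squared orthogonal distances
        new_idx = np.argsort(d2)[:keep]            # keep the points closest to the manifold
        if np.array_equal(np.sort(new_idx), np.sort(idx)):
            break                                  # local minimum of the trimmed scale
        idx = new_idx
    scale = np.sqrt(d2[new_idx].mean())            # trimmed scale of orthogonal distances
    return center, B, scale

# Example: 200 points near a 2-dimensional plane in R^5, plus 20 gross outliers
rng = np.random.default_rng(1)
Z = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 5)) + 0.1 * rng.normal(size=(200, 5))
X = np.vstack([Z, 10 + rng.normal(size=(20, 5))])
center, B, scale = trimmed_orthogonal_pca(X, p=2)
print(scale)
```

With p = m − 1 the same fit describes a hyperplane, i.e. the orthogonal-regression case mentioned in the abstract.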
Keywords:High breakdown point; M-scale