Found 10 similar documents (search time: 187 ms)
1.
J. Muñoz-Pérez O. D. de Cózar-Macías E. B. Blázquez-Parra I. Ladrón de Guevara-López 《Journal of Mathematical Imaging and Vision》2014,49(2):492-509
Geometric fitting arises in many fields of science, engineering and astronomy. In particular, ellipses are among the most commonly used geometric features in digital image analysis and visual pattern recognition. Most geometric and algebraic methods are sensitive to noise and outlier points, so their results are often unacceptable. In this paper, a robust geometric multicriteria method based on the mean absolute geometric error and the eccentricity is proposed for fitting an ellipse to a set of points. It is well known that the least mean absolute error criterion leads to robust estimates. Experimental results on real and synthetic data show that the proposed algorithm is robust to outliers; moreover, it allows outliers to be identified and removed.
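The least-absolute-error idea behind this abstract can be sketched in a few lines. The snippet below is not the authors' multicriteria algorithm: it assumes a simplified axis-aligned, centred ellipse A·x² + B·y² = 1 and minimises the mean absolute *algebraic* (not geometric) error via iteratively reweighted least squares, which already exhibits the robustness to outliers the abstract describes.

```python
import numpy as np

def fit_ellipse_lad(x, y, n_iter=50, eps=1e-8):
    """Fit an axis-aligned, centred ellipse A*x^2 + B*y^2 = 1 by
    approximately minimising the mean absolute algebraic error,
    using iteratively reweighted least squares (IRLS)."""
    M = np.column_stack([x ** 2, y ** 2])
    w = np.ones(len(x))
    for _ in range(n_iter):
        sw = np.sqrt(w)
        coef, *_ = np.linalg.lstsq(M * sw[:, None], sw, rcond=None)
        r = M @ coef - 1.0
        w = 1.0 / np.maximum(np.abs(r), eps)   # L1 reweighting
    A, B = coef
    return 1.0 / np.sqrt(A), 1.0 / np.sqrt(B)  # semi-axes (a, b)

# Synthetic ellipse with semi-axes a=3, b=2 and 5% gross outliers.
rng = np.random.default_rng(0)
th = rng.uniform(0.0, 2.0 * np.pi, 200)
x, y = 3.0 * np.cos(th), 2.0 * np.sin(th)
x[:10] += 5.0
a_hat, b_hat = fit_ellipse_lad(x, y)
```

Because the L1 reweighting drives inlier weights up and outlier weights down, the recovered semi-axes stay close to (3, 2) despite the shifted points; an ordinary least-squares fit on the same data would be pulled toward the outliers.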
2.
Using symmetry in robust model fitting (Cited by: 1; self-citations: 0; citations by others: 1)
The pattern recognition and computer vision communities often employ robust methods for model fitting. In particular, high-breakdown-point methods such as least median of squares (LMedS) and least trimmed squares (LTS) are often used when the data are contaminated with outliers. However, although the breakdown point of these methods can be as high as 50% (they can tolerate up to 50% contamination), they can break down at unexpectedly lower percentages when the outliers are clustered. In this paper, we demonstrate the fragility of LMedS and LTS and analyze why these methods fail when a large percentage of clustered outliers exists in the data. We adapt the concept of “symmetry distance” to formulate an improved regression method, the least trimmed symmetry distance (LTSD). Experimental results show that LTSD outperforms LMedS and LTS in the presence of a large percentage of clustered outliers and large inlier variance.
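For reference, here is a minimal sketch of plain LMedS line fitting, the baseline this abstract improves upon (the paper's LTSD itself is not reproduced here). It samples minimal two-point models and keeps the one with the smallest median squared residual; with uniformly scattered outliers it works well, while clustered outliers are exactly the case where the abstract shows it can fail.

```python
import numpy as np

def lmeds_line(x, y, n_trials=500, seed=1):
    """Least Median of Squares line fit y = a*x + b: sample minimal
    two-point subsets and keep the model minimising the median
    squared residual over all points."""
    rng = np.random.default_rng(seed)
    best, best_med = None, np.inf
    for _ in range(n_trials):
        i, j = rng.choice(len(x), 2, replace=False)
        if x[i] == x[j]:
            continue                      # vertical pair, skip
        a = (y[j] - y[i]) / (x[j] - x[i])
        b = y[i] - a * x[i]
        med = np.median((y - (a * x + b)) ** 2)
        if med < best_med:
            best_med, best = med, (a, b)
    return best

# Line y = 2x + 1 with 30% gross (non-clustered) outliers.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, 100)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.05, 100)
y[:30] = rng.uniform(25.0, 40.0, 30)
a_hat, b_hat = lmeds_line(x, y)
```

With 30% scattered contamination the median residual is still computed on inliers, so the fit recovers (a, b) ≈ (2, 1); replacing the scattered outliers with a tight clump is the failure mode the paper analyzes.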
3.
Wang, Hanzi; Mirota, Daniel; Hager, Gregory D. 《IEEE transactions on pattern analysis and machine intelligence》2010,32(1):178-184
In this paper, we present a new Adaptive-Scale Kernel Consensus (ASKC) robust estimator as a generalization of popular state-of-the-art robust estimators such as RANdom SAmple Consensus (RANSAC), Adaptive Scale Sample Consensus (ASSC), and the Maximum Kernel Density Estimator (MKDE). The ASKC framework is grounded in, and unifies these robust estimators through, nonparametric kernel density estimation theory. In particular, we show that each of these methods is a special case of ASKC with a specific kernel. Like these methods, ASKC can tolerate more than 50 percent outliers, but it can also automatically estimate the scale of the inliers. We apply ASKC to two important areas in computer vision, robust motion estimation and pose estimation, and show comparative results on both synthetic and real data.
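The kernel-consensus idea can be illustrated with a toy version: score each hypothesised model by the kernel density of its residuals at zero and keep the highest-scoring one. The sketch below is a fixed-bandwidth simplification, not ASKC itself (which estimates the bandwidth/scale adaptively), but it shows how a density score, unlike an inlier count with a hard threshold, can tolerate more than 50% outliers.

```python
import numpy as np

def kernel_consensus_line(x, y, h=0.5, n_trials=500, seed=2):
    """Score each hypothesised line y = a*x + b by the Gaussian kernel
    density of its residuals at zero; keep the best-scoring model
    (a fixed-bandwidth simplification of kernel consensus)."""
    rng = np.random.default_rng(seed)
    best, best_score = None, -np.inf
    for _ in range(n_trials):
        i, j = rng.choice(len(x), 2, replace=False)
        if x[i] == x[j]:
            continue
        a = (y[j] - y[i]) / (x[j] - x[i])
        b = y[i] - a * x[i]
        r = y - (a * x + b)
        score = np.mean(np.exp(-0.5 * (r / h) ** 2))  # density at r = 0
        if score > best_score:
            best_score, best = score, (a, b)
    return best

# 55% gross outliers: more than half of the data is contaminated.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, 100)
y = -x + 3.0 + rng.normal(0.0, 0.05, 100)
y[:55] = rng.uniform(-10.0, 10.0, 55)
a_hat, b_hat = kernel_consensus_line(x, y)
```

The 45 inliers form the densest residual mode, so the true line y = −x + 3 wins the score even though outliers are the majority; a median-based estimator would break down here.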
4.
A Framework for Robust Subspace Learning (Cited by: 8; self-citations: 0; citations by others: 8)
De la Torre, Fernando; Black, Michael J. 《International Journal of Computer Vision》2003,54(1-3):117-142
Many computer vision, signal processing and statistical problems can be posed as problems of learning low-dimensional linear or multilinear models. These models have been widely used to represent shape, appearance, motion, etc., in computer vision applications. Methods for learning linear models can be seen as a special case of subspace fitting. One drawback of previous learning methods is that they are based on least-squares estimation and hence fail to account for outliers, which are common in realistic training sets. We review previous approaches for making linear learning methods robust to outliers and present a new method that uses an intra-sample outlier process to account for pixel outliers. We develop the theory of Robust Subspace Learning (RSL) for linear models within a continuous optimization framework based on robust M-estimation. The framework applies to a variety of linear learning problems in computer vision, including eigen-analysis and structure from motion. Several synthetic and natural examples are used to develop and illustrate the theory and applications of robust subspace learning in computer vision.
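A one-dimensional instance of robust subspace fitting via M-estimation can be sketched as weighted PCA iterated with robust weights. The snippet below is a toy stand-in for RSL, not the paper's intra-sample outlier-process formulation: it estimates a single principal direction, reweighting each sample by the Geman-McClure IRLS weight of its squared distance to the current line.

```python
import numpy as np

def robust_direction(X, sigma=0.5, n_iter=30):
    """First principal direction via IRLS: weighted PCA in which each
    sample's weight follows the Geman-McClure rho of its squared
    distance to the current line (a toy 1-D robust subspace fit)."""
    w = np.ones(len(X))
    for _ in range(n_iter):
        mu = (w[:, None] * X).sum(0) / w.sum()
        Xc = X - mu
        C = (w[:, None] * Xc).T @ Xc / w.sum()
        v = np.linalg.eigh(C)[1][:, -1]          # top eigenvector
        r2 = (Xc ** 2).sum(1) - (Xc @ v) ** 2    # squared distance to line
        w = sigma ** 2 / (sigma ** 2 + r2) ** 2  # Geman-McClure IRLS weight
    return mu, v

# Points on the line spanned by (1, 1), plus a clustered outlier clump.
rng = np.random.default_rng(0)
t = rng.uniform(-10.0, 10.0, 200)
inliers = np.column_stack([t, t]) + rng.normal(0.0, 0.1, (200, 2))
clump = np.array([8.0, -8.0]) + rng.normal(0.0, 0.2, (20, 2))
mu, v = robust_direction(np.vstack([inliers, clump]))
cos_sim = abs(v @ np.array([1.0, 1.0])) / np.sqrt(2.0)
```

Plain PCA on this data is tilted toward the clump; under the IRLS weights the clump's influence decays after the first iteration and the recovered direction is nearly parallel to (1, 1).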
5.
6.
When fitting models to data containing multiple structures, such as fitting surface patches to data taken from a neighborhood that includes a range discontinuity, robust estimators must tolerate both gross outliers and pseudo outliers. Pseudo outliers are outliers to the structure of interest but inliers to a different structure; they differ from gross outliers in their coherence. Such data occur frequently in computer vision problems, including motion estimation, model fitting, and range data analysis. The focus of this paper is the problem of fitting surfaces near discontinuities in range data. To characterize the performance of least median of squares, least trimmed squares, M-estimators, Hough transforms, RANSAC, and MINPRAN on this type of data, the “pseudo outlier bias” metric is developed using techniques from the robust statistics literature, and it is used to study the error in robust fits caused by distributions modeling various types of discontinuities. The results show each robust estimator to be biased at small but substantial discontinuities. They also show the circumstances under which different estimators are most effective. Most importantly, the results imply that present estimators should be used with care, and that new estimators should be developed.
7.
Robust distances are mainly used for the purpose of detecting multivariate outliers. The precise definition of cut-off values for formal outlier testing assumes that the “good” part of the data comes from a multivariate normal population. Robust distances also provide valuable information on the units not declared to be outliers and, under mild regularity conditions, they can be used to test the postulated hypothesis of multivariate normality of the uncontaminated data. This approach is not influenced by nasty outliers and thus provides a robust alternative to classical tests for multivariate normality relying on Mahalanobis distances. One major advantage of the suggested procedure is that it takes into account the effect induced by trimming of outliers in several ways. First, it is shown that stochastic trimming is an important ingredient for the purpose of obtaining a reliable estimate of the number of “good” observations. Second, trimming must be allowed for in the empirical distribution of the robust distances when comparing them to their nominal distribution. Finally, alternative trimming rules can be exploited by controlling alternative error rates, such as the False Discovery Rate. Numerical evidence based on simulated and real data shows that the proposed method performs well in a variety of situations of practical interest. It is thus a valuable companion to the existing outlier detection tools for the robust analysis of complex multivariate data structures.
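The basic robust-distance workflow can be sketched as follows. This is a crude stand-in, not the paper's calibrated procedure: it uses a single trimming pass instead of a high-breakdown estimator such as MCD, and its chi-square cutoff deliberately ignores the trimming corrections the abstract argues are needed (for 2 degrees of freedom the 0.975 chi-square quantile has the closed form −2 ln 0.025).

```python
import numpy as np

def robust_distances(X, trim=0.25):
    """Squared Mahalanobis-type distances from a trimmed estimate:
    rank points by classical distance, drop the `trim` fraction with
    the largest distances, re-estimate location/scatter, recompute
    (a crude stand-in for MCD-based robust distances)."""
    def sq_dist(X, mu, S):
        Xc = X - mu
        return np.einsum('ij,jk,ik->i', Xc, np.linalg.inv(S), Xc)

    d2 = sq_dist(X, X.mean(0), np.cov(X.T))
    keep = d2 <= np.quantile(d2, 1.0 - trim)
    return sq_dist(X, X[keep].mean(0), np.cov(X[keep].T))

# 300 bivariate normal points, 20 of them shifted far away.
rng = np.random.default_rng(3)
X = rng.normal(0.0, 1.0, (300, 2))
X[:20] += 6.0
d2 = robust_distances(X)
cutoff = -2.0 * np.log(0.025)   # chi^2 0.975 quantile, 2 d.f. (closed form)
flags = d2 > cutoff
```

All 20 contaminated points are flagged, but the trimmed scatter estimate is shrunken, so the false-positive rate on the clean points noticeably exceeds the nominal 2.5% — precisely the calibration effect of trimming that the paper corrects for.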
8.
We present a robust framework for extracting lines of curvature from point clouds. First, we show a novel approach to denoising the input point cloud using robust statistical estimates of surface normal and curvature which automatically rejects outliers and corrects points by energy minimization. Then the lines of curvature are constructed on the point cloud with controllable density. Our approach is applicable to surfaces of arbitrary genus, with or without boundaries, and is statistically robust to noise and outliers while preserving sharp surface features. We show our approach to be effective over a range of synthetic and real-world input datasets with varying amounts of noise and outliers. The extraction of curvature information can benefit many applications in CAD, computer vision and graphics for point cloud shape analysis, recognition and segmentation. Here, we show the possibility of using the lines of curvature for feature-preserving mesh construction directly from noisy point clouds.
9.
The problem of selecting variables or features in a regression model in the presence of both additive (vertical) and leverage outliers is addressed. Since variable selection and the detection of anomalous data are not separable problems, the focus is on methods that select variables and outliers simultaneously. For selection, the fast forward selection algorithm, least angle regression (LARS), is used, but it is not robust. To achieve robustness to additive outliers, a dummy variable identity matrix is appended to the design matrix allowing both real variables and additive outliers to be in the selection set. For leverage outliers, these selection methods are used on samples of elemental sets in a manner similar to that used in high breakdown robust estimation. These results are compared to several other selection methods of varying computational complexity and robustness. The extension of these methods to situations where the number of variables exceeds the number of observations is discussed. 
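The dummy-identity trick can be demonstrated with a simple greedy forward selection (a simplified stand-in for LARS): append the n×n identity to the design matrix, so that each observation's dummy column can enter the model and absorb an additive outlier. With unit-norm columns, the dummy for a vertically shifted observation becomes the most correlated column once the real variables are in.

```python
import numpy as np

def forward_select(X, y, k):
    """Greedy forward selection on unit-norm columns (a simplified
    stand-in for LARS): repeatedly add the column most correlated
    with the current residual, refitting by least squares."""
    Xn = X / np.linalg.norm(X, axis=0)
    active, r = [], y.copy()
    for _ in range(k):
        c = np.abs(Xn.T @ r)
        c[active] = -np.inf          # never re-select a column
        active.append(int(np.argmax(c)))
        beta, *_ = np.linalg.lstsq(Xn[:, active], y, rcond=None)
        r = y - Xn[:, active] @ beta
    return active

rng = np.random.default_rng(4)
n, p = 60, 5
X = rng.normal(0.0, 1.0, (n, p))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(0.0, 0.1, n)
y[7] += 15.0                          # one additive (vertical) outlier
Xa = np.column_stack([X, np.eye(n)])  # append dummy identity columns
sel = forward_select(Xa, y, k=3)
```

Selection picks the two true variables (columns 0 and 1) plus the dummy column for observation 7 (index p + 7 = 12), so the outlier is flagged rather than contaminating the coefficient estimates.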