Nonparametric supervised learning by linear interpolation with maximum entropy

Authors: Maya R. Gupta; Robert M. Gray; Richard A. Olshen

Affiliation: Dept. of Electrical Engineering, University of Washington, Seattle, WA, USA

Abstract: Nonparametric neighborhood methods for learning entail estimation of class-conditional probabilities based on relative frequencies of samples that are "near-neighbors" of a test point. We propose and explore the behavior of a learning algorithm that uses linear interpolation and the principle of maximum entropy (LIME). We consider some theoretical properties of the LIME algorithm: LIME weights have exponential form; the estimates are consistent; and the estimates are robust to additive noise. In relation to bias reduction, we show that near-neighbors contain a test point in their convex hull asymptotically. The common linear interpolation solution used for regression on grids or look-up tables is shown to solve a related maximum entropy problem. LIME simulation results support use of the method, and performance on a pipeline integrity classification problem demonstrates that the proposed algorithm has practical value.
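
The abstract's description, maximum-entropy interpolation weights of exponential form over near-neighbors, summed per class to estimate class-conditional probabilities, can be illustrated in code. The sketch below is a minimal illustration under stated assumptions (NumPy/SciPy available, weights found by minimizing the convex dual with BFGS); the names `maxent_weights` and `lime_classify` and the toy data are hypothetical and not the authors' reference implementation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

def maxent_weights(neighbors, x):
    # Solve: maximize entropy -sum_i w_i log w_i subject to
    #   sum_i w_i = 1,  sum_i w_i * neighbors[i] = x,  w_i >= 0.
    # The maximizer has exponential form w_i proportional to exp(lam . neighbors[i]);
    # lam is found by minimizing the convex dual
    #   g(lam) = log sum_i exp(lam . neighbors[i]) - lam . x,
    # which has a finite minimizer when x lies in the convex hull of the neighbors.
    def dual(lam):
        return logsumexp(neighbors @ lam) - lam @ x

    def dual_grad(lam):
        scores = neighbors @ lam
        w = np.exp(scores - logsumexp(scores))
        return neighbors.T @ w - x

    res = minimize(dual, np.zeros_like(x), jac=dual_grad, method="BFGS")
    scores = neighbors @ res.x
    return np.exp(scores - logsumexp(scores))

def lime_classify(neighbors, labels, x):
    # Class-conditional probability estimates: sum the interpolation
    # weights of the neighbors belonging to each class, then pick the argmax.
    w = maxent_weights(neighbors, x)
    classes = np.unique(labels)
    probs = np.array([w[labels == c].sum() for c in classes])
    return classes[np.argmax(probs)], dict(zip(classes, probs))

# Toy example: four labeled neighbors whose convex hull contains the test point.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([0, 0, 1, 1])
print(lime_classify(X, y, np.array([0.3, 0.6])))
```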
|