Learning nonlinear manifolds based on mixtures of localized linear manifolds under a self-organizing framework |
| |
Authors: | Huicheng, Wei, Qionghai, Sanqing, Zhe-Ming |
| |
Affiliation: | a) School of Information Science and Technology, Sun Yat-sen University, 510275 Guangzhou, China; b) Department of Automation, Tsinghua University, 100084 Beijing, China; c) School of Biomedical Engineering, Drexel University, PA 19104, USA; d) School of Aeronautics and Astronautics, Zhejiang University, 310058 Hangzhou, China |
| |
Abstract: | This paper presents a neural model that learns low-dimensional nonlinear manifolds embedded in a higher-dimensional data space as mixtures of local linear manifolds under a self-organizing framework. Compared with similar networks, the local linear manifolds learned by our network represent local data distributions in a more localized way, thanks to a new distortion measure that removes the confusion between sub-models present in many related mixture models. Each neuron in the network asymptotically learns a mean vector and a principal subspace of the data in its local region. It is proved that the learning objective of each sub-model has no local extrema. Experiments show that the new mixture model adapts to nonlinear manifolds of various data distributions better than similar models. The online-learning property of the model is desirable when the data set is very large, when computational efficiency is paramount, or when data arrive sequentially. We further show an application of this model to the recognition of handwritten digit images based on mixtures of local linear manifolds. |
| |
Keywords: | Self-organizing network; Manifold learning; Principal subspace; Mixture of experts; Dimension reduction |
|