Article ID: 412996 · Journal: Neurocomputing · Published Year: 2009 · Pages: 13 · File Type: PDF
Abstract

This paper presents a neural model that learns low-dimensional nonlinear manifolds embedded in a higher-dimensional data space as mixtures of local linear manifolds under a self-organizing framework. Compared with similar networks, the local linear manifolds learned by our network represent local data distributions in a more localized way, thanks to a new distortion measure that removes the confusion between sub-models found in many comparable mixture models. Each neuron in the network asymptotically learns a mean vector and a principal subspace of the data in its local region, and it is proved that each sub-model has no local extrema. Experiments show that the new mixture model adapts better to nonlinear manifolds of various data distributions than similar models. Its online-learning property is desirable when the data set is very large, when computational efficiency is paramount, or when data arrive sequentially. We further apply the model to the recognition of handwritten digit images based on mixtures of local linear manifolds.
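The abstract does not specify the paper's distortion measure or update rules, but the general scheme it describes (each unit keeps a local mean and principal subspace, a winner is chosen by a distortion measure, and the winner is updated online) can be sketched as follows. This is an illustrative sketch only: the class name `LocalLinearManifoldMixture`, the reconstruction-error distortion, and the Oja-style subspace update with re-orthonormalization are assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

class LocalLinearManifoldMixture:
    """Illustrative online mixture of local linear manifolds (assumed scheme,
    not the paper's exact algorithm)."""

    def __init__(self, n_units, dim, subspace_dim, lr=0.05):
        self.means = rng.normal(size=(n_units, dim))
        # One random orthonormal basis (dim x subspace_dim) per unit.
        self.bases = np.stack(
            [np.linalg.qr(rng.normal(size=(dim, subspace_dim)))[0]
             for _ in range(n_units)]
        )
        self.lr = lr

    def distortion(self, x):
        # Assumed distortion: squared residual of projecting (x - mean)
        # onto each unit's local subspace.
        d = x - self.means                              # (n_units, dim)
        coeffs = np.einsum('nd,ndk->nk', d, self.bases)
        recon = np.einsum('nk,ndk->nd', coeffs, self.bases)
        return np.sum((d - recon) ** 2, axis=1)

    def update(self, x):
        j = int(np.argmin(self.distortion(x)))          # winner-take-all
        d = x - self.means[j]
        self.means[j] += self.lr * d                    # running-mean update
        # Oja-style subspace update, then re-orthonormalize the basis.
        y = self.bases[j].T @ d
        self.bases[j] += self.lr * np.outer(d, y)
        self.bases[j], _ = np.linalg.qr(self.bases[j])
        return j
```

Training amounts to streaming samples through `update`, which makes the memory cost independent of the data set size; this is the online-learning property the abstract highlights for large or sequentially arriving data.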

Related Topics: Physical Sciences and Engineering › Computer Science › Artificial Intelligence