Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
410855 | Neurocomputing | 2011 | 11 Pages | |
Abstract
We present an extension of the Exploratory Observation Machine (XOM) for structure-preserving dimensionality reduction. By minimizing the Kullback–Leibler divergence between neighborhood functions in data space and image space, this Neighbor Embedding XOM (NE-XOM) links the fast sequential online learning known from topology-preserving mappings with principled direct divergence optimization approaches. We quantitatively evaluate our method on real-world data using multiple embedding quality measures. In this comparison, NE-XOM offers a competitive trade-off between high embedding quality and low computational expense, which motivates its further use in real-world settings throughout science and engineering.
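The cost function described in the abstract can be illustrated with a minimal sketch: define row-normalized Gaussian neighborhood distributions in the high-dimensional data space and the low-dimensional image space, then measure their Kullback–Leibler divergence. This is a generic neighbor-embedding objective in the spirit of the abstract, not the authors' NE-XOM implementation; the function names, fixed bandwidths, and NumPy formulation are illustrative assumptions.

```python
import numpy as np

def neighborhood_probs(X, sigma=1.0):
    """Row-normalized Gaussian neighborhood function over the points in X.

    Illustrative assumption: a single fixed bandwidth sigma for all points.
    """
    # Pairwise squared Euclidean distances.
    D = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    P = np.exp(-D / (2.0 * sigma ** 2))
    np.fill_diagonal(P, 0.0)  # a point is not its own neighbor
    return P / P.sum(axis=1, keepdims=True)

def kl_neighborhood_divergence(X, Y, sigma_x=1.0, sigma_y=1.0):
    """KL divergence between neighborhood functions in data space (X)
    and image space (Y), summed over all points."""
    eps = 1e-12  # numerical guard against log(0)
    P = neighborhood_probs(X, sigma_x)
    Q = neighborhood_probs(Y, sigma_y)
    return np.sum(P * np.log((P + eps) / (Q + eps)))
```

A gradient-based or online update of the image coordinates `Y` to decrease this divergence is what a neighbor-embedding method optimizes; the divergence is zero when the two neighborhood structures coincide.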
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Kerstin Bunte, Barbara Hammer, Thomas Villmann, Michael Biehl, Axel Wismüller