Article ID: 410528
Journal: Neurocomputing
Published Year: 2009
Pages: 7 Pages
File Type: PDF
Abstract

Nonlinear dimensionality reduction is a challenging problem encountered in many areas of high-dimensional data analysis, including machine learning, pattern recognition, scientific visualization, and neural computation. Based on different geometric intuitions about manifolds, maximum variance unfolding (MVU) and Laplacian eigenmaps are designed to capture different aspects of a dataset. In this paper, combining the ideas of MVU and Laplacian eigenmaps, we propose a new nonlinear dimensionality reduction method called distinguishing variance embedding (DVE). DVE unfolds the dataset by maximizing the global variance subject to the proximity-preservation constraint originating from Laplacian eigenmaps. We illustrate the algorithm on easily visualized examples of curves and surfaces, as well as on real images of rotating objects, faces, and handwritten digits.
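The abstract describes DVE only at a high level: maximize global variance while preserving local proximity relations encoded in a neighborhood graph, as in Laplacian eigenmaps. The snippet below is a minimal, hypothetical sketch of how such a trade-off could be realized as a spectral method, assuming the objective relaxes to a generalized eigenproblem that pairs the centering matrix (global variance term) with the graph Laplacian (local proximity term). The function name dve_sketch, the use of a k-nearest-neighbor connectivity graph, and the ridge regularizer are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import kneighbors_graph


def dve_sketch(X, n_components=2, n_neighbors=8, reg=1e-6):
    """Hypothetical DVE-style spectral embedding (not the paper's exact method).

    Assumes the problem relaxes to: maximize tr(Y^T H Y) (global variance)
    relative to tr(Y^T L Y) (local distortion), i.e. solve H y = lambda L y.
    """
    n = X.shape[0]

    # k-nearest-neighbor graph, symmetrized, as in Laplacian eigenmaps.
    W = kneighbors_graph(X, n_neighbors, mode='connectivity').toarray()
    W = np.maximum(W, W.T)

    # Unnormalized graph Laplacian L = D - W encodes local proximity relations.
    L = np.diag(W.sum(axis=1)) - W

    # Centering matrix H = I - (1/n) 11^T encodes the global variance term.
    H = np.eye(n) - np.ones((n, n)) / n

    # Generalized eigenproblem H y = lambda (L + reg*I) y; the small ridge
    # keeps the (singular) Laplacian positive definite for the solver.
    eigvals, eigvecs = eigh(H, L + reg * np.eye(n))

    # Eigenvectors with the largest eigenvalues maximize variance per unit
    # of local distortion; their rows give the low-dimensional coordinates.
    order = np.argsort(eigvals)[::-1][:n_components]
    return eigvecs[:, order]


if __name__ == "__main__":
    # Toy example: a noisy circle embedded in 3-D, unfolded to 2-D.
    rng = np.random.default_rng(0)
    t = rng.uniform(0, 2 * np.pi, 200)
    X = np.c_[np.cos(t), np.sin(t), 0.05 * rng.standard_normal(200)]
    Y = dve_sketch(X, n_components=2)
    print(Y.shape)  # (200, 2)
```

As with Laplacian eigenmaps, the embedding here comes directly from graph eigenvectors rather than a linear projection of the input, which is what allows curved manifolds such as the "easily visualized examples of curves and surfaces" mentioned in the abstract to be unfolded.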

Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence
Authors