Article ID: 9653129
Journal: Neural Networks
Published Year: 2005
Pages: 9
File Type: PDF
Abstract
We introduce spectral gradient descent, a method for improving iterative dimensionality reduction techniques. The method uses information contained in the leading eigenvalues of a data affinity matrix to modify the steps taken during a gradient-based optimization procedure. We show that the approach speeds up the optimization and helps dimensionality reduction methods find better local minima of their objective functions. We also interpret the approach in terms of the power method for finding the leading eigenvalues of a symmetric matrix, and verify its usefulness in simple experiments.
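For reference, the power method cited in the abstract is the standard iteration v ← Av / ||Av||, whose iterates converge to the leading eigenvector of a symmetric matrix A when its leading eigenvalue is strictly dominant. The sketch below is illustrative only, not the authors' implementation; the function name `power_method` and its parameters are hypothetical, and the Gaussian affinity matrix is just one common choice of symmetric A.

```python
import numpy as np

def power_method(A, num_iters=100, tol=1e-10, rng=None):
    """Estimate the leading eigenvalue and eigenvector of a symmetric
    matrix A (e.g. a data affinity matrix) by power iteration.
    Illustrative sketch; names and defaults are hypothetical."""
    rng = np.random.default_rng() if rng is None else rng
    v = rng.standard_normal(A.shape[0])
    v /= np.linalg.norm(v)            # random unit starting vector
    lam = 0.0
    for _ in range(num_iters):
        w = A @ v                     # one multiplication by A
        lam_new = v @ w               # Rayleigh quotient v^T A v
        v = w / np.linalg.norm(w)     # renormalize the iterate
        if abs(lam_new - lam) < tol:  # stop once the estimate settles
            return lam_new, v
        lam = lam_new
    return lam, v

# Example: leading eigenpair of a small Gaussian affinity matrix.
X = np.random.default_rng(0).standard_normal((20, 3))
D2 = np.square(np.linalg.norm(X[:, None] - X[None, :], axis=-1))
A = np.exp(-D2)
lam, v = power_method(A)
```

Each multiplication by A shrinks the components of v along non-leading eigenvectors geometrically, at a rate set by the ratio of the second-largest to the largest eigenvalue magnitude, which is why the iterate settles onto the leading eigenvector.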
Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence
Authors