Article ID: 408094
Journal: Neurocomputing
Published Year: 2012
Pages: 11
File Type: PDF
Abstract

Canonical correlation analysis (CCA) is a well-known technique for finding correlations between two sets of multi-dimensional variables. It projects both sets into a lower-dimensional space in which they are maximally correlated, and one popular use of CCA is dimensionality reduction. CCA can be regarded as a linear subspace approach for one view of an object set (e.g. X) that is directed by another view of the same object set (e.g. Y). However, if the correlations between X and Y are nonlinear, CCA may fail to reveal the latent structure of X. In this paper, we propose a new nonlinear dimensionality reduction algorithm, called local canonical correlation analysis alignment (LCCA). In LCCA, CCA is applied to patches of the object set to obtain the local low-dimensional coordinates of Xp (a patch of X); these local coordinates are then aligned to obtain the global low-dimensional embedding of X. Furthermore, to handle the out-of-sample problem, a linear version of LCCA (LLCCA) is also developed. Unlike LCCA, LLCCA is applicable not only to training samples but also to test samples. Experiments on data visualization and pose estimation show that LCCA and LLCCA outperform the related algorithms.

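The abstract's two-stage recipe (CCA on local patches of X directed by Y, then alignment of the local coordinates into a global embedding) can be illustrated with a short sketch. The code below is not the authors' LCCA implementation: the k-nearest-neighbor patch construction, the ridge term reg that keeps the per-patch CCA well conditioned, and the LTSA-style alignment matrix are all assumptions made for illustration, since the abstract does not specify these details.

    # A rough sketch of "CCA on patches + alignment", under the assumptions above.
    import numpy as np

    def cca_coords(Xp, Yp, d, reg=1e-3):
        """Regularized linear CCA on one patch; returns the d-dimensional
        CCA coordinates of the rows of Xp, directed by Yp."""
        Xc = Xp - Xp.mean(0)
        Yc = Yp - Yp.mean(0)
        k = Xp.shape[0]
        Cxx = Xc.T @ Xc / k + reg * np.eye(Xp.shape[1])
        Cyy = Yc.T @ Yc / k + reg * np.eye(Yp.shape[1])
        Cxy = Xc.T @ Yc / k
        Lx, Ly = np.linalg.cholesky(Cxx), np.linalg.cholesky(Cyy)
        # SVD of the whitened cross-covariance gives the canonical directions.
        K = np.linalg.solve(Lx, np.linalg.solve(Ly, Cxy.T).T)
        U, _, _ = np.linalg.svd(K, full_matrices=False)
        Wx = np.linalg.solve(Lx.T, U[:, :d])   # canonical directions for X
        return Xc @ Wx                         # local coordinates of the patch

    def local_cca_alignment(X, Y, d, k=12):
        """Two-stage embedding: CCA on k-NN patches of X, followed by an
        LTSA-style global alignment of the local coordinates (assumed here)."""
        n = X.shape[0]
        dist = np.linalg.norm(X[:, None] - X[None, :], axis=2)
        nbrs = np.argsort(dist, axis=1)[:, :k]   # each row: a patch of X
        B = np.zeros((n, n))                     # alignment matrix
        for i in range(n):
            idx = nbrs[i]
            theta = cca_coords(X[idx], Y[idx], d)  # local low-dim coordinates
            theta -= theta.mean(0)
            Q, _ = np.linalg.qr(theta)             # orthonormal basis, k x d
            G = np.hstack([np.ones((k, 1)) / np.sqrt(k), Q])
            B[np.ix_(idx, idx)] += np.eye(k) - G @ G.T
        # Global embedding: eigenvectors of B for the 2nd..(d+1)-th smallest eigenvalues.
        vals, vecs = np.linalg.eigh(B)
        return vecs[:, 1:d + 1]

Given paired views X (n x p) and Y (n x q), local_cca_alignment(X, Y, d) returns an n x d embedding of X in which each patch's local coordinates are directed by the corresponding patch of Y, mirroring the "guided" role the abstract attributes to CCA-based reduction.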
Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence
Authors