Article ID: 405491
Journal: Neural Networks
Published Year: 2012
Pages: 10
File Type: PDF
Abstract

Canonical correlation analysis (CCA) is a classical dimensionality reduction technique for two sets of variables that iteratively finds projection directions with maximum correlation. Although CCA remains in active use in many practical application areas, modern real-world data often contain more complicated nonlinear correlations that classical CCA cannot properly capture. In this paper, we therefore propose an extension of CCA that effectively captures such complicated nonlinear correlations through statistical dependency maximization. The proposed method, which we call least-squares canonical dependency analysis (LSCDA), is based on a squared-loss variant of mutual information, and it has various useful properties beyond its ability to capture higher-order correlations: for example, it can simultaneously find multiple projection directions (i.e., subspaces), it does not involve density estimation, and it is equipped with a model selection strategy. We demonstrate the usefulness of LSCDA through experiments on artificial and real-world datasets.
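For reference, the squared-loss variant of mutual information underlying LSCDA is, in this line of work, the Pearson divergence between the joint density and the product of the marginals. The sketch below states this quantity and the resulting projection-search objective; the densities and the projection matrices $U$ and $V$ are notation introduced here for illustration, not taken from this page.

% Squared-loss mutual information (SMI) between random variables X and Y;
% SMI(X, Y) = 0 if and only if X and Y are statistically independent.
\mathrm{SMI}(X, Y)
  = \frac{1}{2} \iint p_x(x)\, p_y(y)
    \left( \frac{p_{xy}(x, y)}{p_x(x)\, p_y(y)} - 1 \right)^{2}
    \mathrm{d}x\, \mathrm{d}y

% Assumed LSCDA-style objective: find linear projections U, V that maximize
% an estimate of the dependency between the projected variables, where
% \widehat{\mathrm{SMI}} denotes a density-ratio-based estimator that avoids
% explicit density estimation.
(\hat{U}, \hat{V})
  = \mathop{\mathrm{arg\,max}}_{U,\, V}
    \widehat{\mathrm{SMI}}\bigl( U^{\top} X,\; V^{\top} Y \bigr)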

Related Topics
Physical Sciences and Engineering › Computer Science › Artificial Intelligence
Authors