Article ID: 535103
Journal: Pattern Recognition Letters
Published Year: 2016
Pages: 8
File Type: PDF
Abstract

• A general method for approximating the eigen-decomposition of a kernel matrix.
• A randomized method for approximating the eigen-decomposition.
• A link between eigen-decomposition in a subspace and dot-product preservation.
• A novel link between the empirical kernel map and the kernel matrix.
• The proposed method can be used with any kernel function.

Kernel principal component analysis (KPCA) is a popular extension of classical PCA that allows non-linear subspace projection. It is based on the eigen-decomposition of the kernel matrix, and this decomposition is the main computational bottleneck of KPCA. In this paper, we show that the decomposition can instead be carried out by analyzing the covariance matrix obtained from the empirical kernel map. The computational cost can be reduced further by combining the empirical kernel map with random projection. Experimental results show that the proposed method accurately approximates the eigenvalues and eigenvectors of the original kernel matrix.
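The abstract only sketches the approach, so the following NumPy sketch illustrates one standard way such an approximation can be realized: a randomized range-finder applied to the kernel matrix, whose rows are exactly the empirical kernel maps of the training samples. This is an assumption-laden illustration, not the authors' implementation; the names approx_eig and rbf_kernel and the parameters gamma and d are hypothetical.

```python
import numpy as np

def rbf_kernel(X, gamma=0.1):
    # Pairwise squared Euclidean distances, then the RBF kernel.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def approx_eig(K, d, seed=0):
    """Approximate the top-d eigenpairs of a symmetric PSD kernel
    matrix K by randomly projecting its rows (the empirical kernel
    maps) into a d-dimensional subspace and solving the exact
    eigen-problem there. A sketch of a randomized range-finder,
    not the paper's algorithm."""
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    Omega = rng.standard_normal((n, d))  # random projection matrix
    Y = K @ Omega                        # n x d sketch of the range of K
    Q, _ = np.linalg.qr(Y)               # orthonormal basis for the sketch
    B = Q.T @ K @ Q                      # small d x d projected matrix
    w, V = np.linalg.eigh(B)             # exact decomposition in the subspace
    idx = np.argsort(w)[::-1]            # sort eigenvalues descending
    return w[idx], Q @ V[:, idx]         # lift eigenvectors back to R^n

# Toy comparison against the exact eigenvalues of the kernel matrix.
X = np.random.default_rng(1).standard_normal((500, 10))
K = rbf_kernel(X)
w_approx, U_approx = approx_eig(K, d=20)
w_exact = np.linalg.eigvalsh(K)[::-1][:5]
print("top-5 relative error:", np.abs(w_approx[:5] - w_exact) / w_exact)
```

Because kernel spectra typically decay quickly, the leading eigenpairs recovered from the small d x d problem tend to match the exact ones closely, which is consistent with the approximation quality the abstract reports.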

Related Topics
Physical Sciences and Engineering › Computer Science › Computer Vision and Pattern Recognition