Article Code | Journal Code | Publication Year | English Article | Full-Text Version |
---|---|---|---|---|
535341 | 870341 | 2014 | 7-page PDF | Free download |

• We incorporate label information into the kernel PCA denoising procedure through a semi-supervised generalization of kernel PCA.
• We provide an efficient fixed-point iteration for the pre-image problem based on a graph regularized kernel.
• We demonstrate that accounting for the intrinsic manifold structure improves denoising by exploiting unlabeled data points (a standard warped-kernel construction is sketched after this list).
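The "graph regularized kernel" in the highlights refers to warping the RKHS with a graph Laplacian built over the labeled and unlabeled points. A common construction from the semi-supervised learning literature is sketched below; the symbols M, γ, and the choice of Laplacian are assumptions for illustration, not necessarily the exact choices made in the paper.

```latex
% Warped RKHS kernel over n labeled + unlabeled points x_1, ..., x_n
% (standard semi-supervised construction; the specific M is an assumption).
\[
  \tilde{k}(x, z) \;=\; k(x, z) \;-\; \mathbf{k}_x^{\top}\,(I + M K)^{-1} M\,\mathbf{k}_z ,
\]
\[
  \mathbf{k}_x = \bigl(k(x, x_1), \dots, k(x, x_n)\bigr)^{\top}, \qquad
  M = \gamma L \quad \text{with } L \text{ the graph Laplacian of a neighborhood graph.}
\]
% Kernel PCA and the pre-image iteration can then be run with \tilde{k} in
% place of k, so that unlabeled points shape the denoising basis.
```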
Kernel Principal Component Analysis (kernel PCA) has proven to be a powerful tool for nonlinear feature extraction, and is often applied as a pre-processing step for classification algorithms. In denoising applications, kernel PCA provides the basis for dimensionality reduction, prior to solving the so-called pre-image problem, in which denoised feature-space points are mapped back into input space. This problem is inherently ill-posed due to the non-bijective feature-space mapping. We present a semi-supervised denoising scheme based on kernel PCA and the pre-image problem, where class labels on a subset of the data points are used to improve the denoising. Moreover, by warping the Reproducing Kernel Hilbert Space (RKHS), we also account for the intrinsic manifold structure, yielding a kernel PCA basis that also benefits from unlabeled data points. Our two main contributions are: (1) a generalization of kernel PCA that incorporates a loss term, leading to an iterative algorithm for finding orthonormal components biased by the class labels, and (2) a fixed-point iteration for solving the pre-image problem based on a manifold-warped RKHS. We demonstrate the viability of the proposed methods on both synthetic data and images from The Amsterdam Library of Object Images (Geusebroek et al., 2005) [7].
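To make the pipeline concrete, below is a minimal sketch of the standard (unsupervised) kernel PCA denoising baseline that the paper builds on: project a noisy point onto the leading kernel principal components and recover an approximate pre-image with the classic Gaussian-kernel fixed-point iteration. The function name kpca_denoise and the parameters sigma, q, and n_iter are illustrative, kernel centering is omitted for brevity, and the paper's label-biased components and warped kernel are not reproduced here.

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    # Gaussian kernel matrix between the rows of A and the rows of B.
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-d2 / (2.0 * sigma**2))

def kpca_denoise(X, x, sigma=1.0, q=4, n_iter=100):
    """Project x onto the leading q kernel principal components of X and
    recover an approximate pre-image via the Gaussian-kernel fixed point."""
    K = rbf_kernel(X, X, sigma)
    # Eigendecomposition of the (uncentered, for brevity) kernel matrix;
    # eigh returns eigenvalues in ascending order, so flip to descending.
    lam, A = np.linalg.eigh(K)
    lam, A = lam[::-1][:q], A[:, ::-1][:, :q]
    A = A / np.sqrt(lam)                      # unit-norm components in feature space
    k_x = rbf_kernel(X, x[None, :], sigma).ravel()
    beta = A.T @ k_x                          # coordinates on the q components
    gamma = A @ beta                          # expansion weights of the denoised point
    z = x.copy()                              # start the pre-image at the noisy input
    for _ in range(n_iter):
        w = gamma * rbf_kernel(X, z[None, :], sigma).ravel()
        s = w.sum()
        if abs(s) < 1e-12:                    # guard against a degenerate update
            break
        z = (w @ X) / s                       # fixed-point update in input space
    return z

# Illustrative usage on a noisy circle (purely synthetic data).
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 2.0 * np.pi, size=200)
X = np.column_stack([np.cos(t), np.sin(t)])
x_noisy = X[0] + 0.3 * rng.standard_normal(2)
print(kpca_denoise(X, x_noisy, sigma=0.5, q=4))
```

In the paper's semi-supervised setting, the components (here the columns of A) would instead be found by the label-biased iterative algorithm, and the kernel would be replaced by its graph-warped counterpart; the fixed-point update itself keeps the same form.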
Journal: Pattern Recognition Letters - Volume 49, 1 November 2014, Pages 114–120