Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
412774 | Neurocomputing | 2010 | 10 Pages |
This paper develops a new framework for kernelizing Mahalanobis distance learners. The new KPCA-trick framework offers several practical advantages over the classical kernel-trick framework: no mathematical derivation or reprogramming is required to obtain a kernel implementation; a way to speed up an algorithm is provided at no extra cost; and troublesome problems such as singularity are avoided. Rigorous representer theorems in countably infinite-dimensional spaces are given to validate our framework. Furthermore, unlike previous works, which always apply brute-force methods to select a kernel, we derive a kernel-alignment formula based on quadratic programming that can efficiently construct an appropriate kernel for a given dataset.
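The core idea of the KPCA trick can be sketched as follows: embed the data into a finite-dimensional space via kernel PCA, then run any existing linear Mahalanobis learner on the embedded coordinates unchanged. The sketch below is illustrative only, not the paper's implementation; the RBF kernel, the component count, and the use of the inverse empirical covariance as a stand-in metric learner are all assumptions for demonstration.

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    # Pairwise RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kpca_embed(K, n_components):
    # Double-center the kernel matrix, then project onto the
    # leading eigenvectors scaled by sqrt(eigenvalue) to get
    # explicit finite-dimensional KPCA coordinates.
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    w, V = np.linalg.eigh(Kc)
    idx = np.argsort(w)[::-1][:n_components]
    w, V = w[idx], V[:, idx]
    return V * np.sqrt(np.maximum(w, 0.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = kpca_embed(rbf_kernel(X), n_components=10)

# Any linear Mahalanobis distance learner can now be applied to Z
# without modification. As a hypothetical stand-in learner, we take
# the metric M to be the pseudo-inverse of the empirical covariance.
M = np.linalg.pinv(np.cov(Z, rowvar=False))

def mahalanobis(a, b):
    d = a - b
    return float(np.sqrt(d @ M @ d))

print(mahalanobis(Z[0], Z[1]))
```

Because the KPCA coordinates are explicit vectors, speeding up the downstream learner is as simple as reducing `n_components`, and the pseudo-inverse sidesteps the singularity issue the abstract mentions.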