Article ID: 402115
Journal: Knowledge-Based Systems
Published Year: 2016
Pages: 12 Pages
File Type: PDF
Abstract

Multiple Kernel Learning (MKL) is flexible in dealing with problems that involve multiple and heterogeneous data sources. However, its applicability is restricted by the requirement of an inner-product formulation, since kernelizing algorithms that do not satisfy this formulation is difficult. To overcome this problem, Multiple Empirical Kernel Learning (MEKL) has been proposed, which explicitly maps input samples into feature spaces in which the mapped feature vectors are given explicitly. Most existing MEKL methods optimize the learning framework by minimizing the empirical risk, the regularization risk, and a loss term over the multiple feature spaces. Because little attention is paid to preserving the local structure among training samples, the learned classifier may lack the locality-preserving property, which can lead to unfavorable performance. Inspired by Locality Preserving Projection (LPP), which seeks an optimal projection that preserves the local structure of the input samples, we introduce a locality preserving constraint into the learning framework and propose a novel Multiple Empirical Kernel Learning with Locality Preserving Constraint (MEKL-LPC). In terms of Rademacher complexity, MEKL-LPC has a lower generalization error bound than both the Modification of Ho–Kashyap algorithm with Squared approximation of the misclassification error (MHKS) and Multi-Kernel MHKS (MultiK-MHKS). Experiments on several real-world datasets demonstrate that MEKL-LPC outperforms the compared algorithms. The contributions of this work are: (i) integrating, for the first time, a locality preserving constraint into MEKL, and (ii) proposing an algorithm with a lower generalization error bound, namely MEKL-LPC.

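The paper's optimization details are not part of this record, but the two ingredients the abstract combines can be illustrated concretely. The sketch below, in plain NumPy, shows (a) an empirical kernel map, which turns an implicit kernel into explicit feature vectors, and (b) an LPP-style graph Laplacian whose quadratic form penalizes outputs that break local neighborhood structure. All function names, the RBF kernel choice, and parameters such as gamma and n_neighbors are illustrative assumptions, not the authors' implementation.

import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel values k(a, b) = exp(-gamma * ||a - b||^2).
    d2 = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

def empirical_kernel_map(X_train, X, gamma=1.0):
    # Explicit empirical kernel mapping: with the eigendecomposition
    # K = Q diag(w) Q^T of the training kernel matrix, each sample x is
    # mapped to diag(w)^(-1/2) Q^T [k(x, x_1), ..., k(x, x_n)]^T, so the
    # mapped feature vectors are given explicitly. Near-zero eigenvalues
    # are dropped for numerical stability.
    K = rbf_kernel(X_train, X_train, gamma)
    w, Q = np.linalg.eigh(K)
    keep = w > 1e-10
    return rbf_kernel(X, X_train, gamma) @ Q[:, keep] / np.sqrt(w[keep])

def graph_laplacian(X, n_neighbors=5, t=1.0):
    # LPP-style locality graph: heat-kernel weights on the k nearest
    # neighbors, symmetrized, with unnormalized Laplacian L = D - W.
    d2 = (X**2).sum(1)[:, None] + (X**2).sum(1)[None, :] - 2.0 * X @ X.T
    W = np.zeros_like(d2)
    for i in range(X.shape[0]):
        nn = np.argsort(d2[i])[1:n_neighbors + 1]  # skip the sample itself
        W[i, nn] = np.exp(-d2[i, nn] / t)
    W = np.maximum(W, W.T)
    return np.diag(W.sum(axis=1)) - W

# Usage: map training samples explicitly, then measure how strongly a
# candidate projection violates neighborhood structure via f^T L f.
X = np.random.randn(100, 10)
Phi = empirical_kernel_map(X, X, gamma=0.1)
L = graph_laplacian(X, n_neighbors=5, t=1.0)
v = np.random.randn(Phi.shape[1])   # a candidate projection vector
f = Phi @ v                         # outputs in the empirical feature space
locality_penalty = f @ L @ f        # small when nearby samples get similar outputs

In the spirit of MEKL-LPC, a penalty of this form would be added, per empirical feature space, to the empirical risk and regularization terms of an MHKS-style objective; the exact weighting and optimization follow the paper itself.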
Related Topics
Physical Sciences and Engineering › Computer Science › Artificial Intelligence
Authors