Article ID: 408018
Journal: Neurocomputing
Published Year: 2011
Pages: 9 Pages
File Type: PDF
Abstract

We propose Kernel Self-optimized Locality Preserving Discriminant Analysis (KSLPDA) for feature extraction and recognition. The KSLPDA procedure is divided into two stages: the first solves for the optimal expansion of a data-dependent kernel with the proposed kernel self-optimization method, and the second seeks the optimal projection matrix for dimensionality reduction. Since the optimal parameters of the data-dependent kernel are obtained automatically by solving a constrained optimization problem based on the maximum margin criterion and the Fisher criterion in the empirical feature space, KSLPDA performs well on feature extraction for classification. Comparative experiments show that KSLPDA outperforms PCA, LDA, LPP, supervised LPP, and kernel supervised LPP.
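The abstract only outlines the two-stage procedure, so the sketch below is a rough, hypothetical illustration of such a pipeline rather than the paper's actual algorithm. It assumes a conformal data-dependent kernel of the form k(x, y) = q(x) q(y) k0(x, y), a common choice in the kernel-optimization literature; it replaces the paper's constrained optimization with a simple random search over the expansion coefficients of q; and it uses a generic supervised, locality-preserving projection in the optimized kernel space. All function names, parameters, and criteria here are illustrative assumptions, not the authors' formulation.

```python
# Hypothetical two-stage sketch: (1) self-optimize a data-dependent kernel,
# (2) compute a locality-preserving discriminant projection in that kernel space.
# None of the formulas below are taken from the paper; they stand in for the
# abstract's description only.
import numpy as np
from scipy.linalg import eigh

def rbf_kernel(X, Y, gamma):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def data_dependent_kernel(K0, Q):
    # Conformal transformation: K(i, j) = q(x_i) * q(x_j) * K0(i, j)
    return (Q[:, None] * Q[None, :]) * K0

def fisher_scalar(K, y):
    # Rough class-separability measure computed directly from the kernel
    # (empirical feature space); purely illustrative.
    J_b, J_w = 0.0, 0.0
    mean_all = K.mean()
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        Kc = K[np.ix_(idx, idx)]
        J_b += len(idx) * (Kc.mean() - mean_all) ** 2
        J_w += len(idx) * Kc.var()
    return J_b / (J_w + 1e-12)

def optimize_kernel(X, y, gamma0=0.5, gamma1=0.1, n_candidates=200, seed=0):
    """Stage 1 (stand-in): pick expansion coefficients b of
    q(x) = b0 + sum_i b_i * k1(x, x_i) that maximize the Fisher scalar.
    Random search replaces the paper's constrained optimization."""
    rng = np.random.default_rng(seed)
    K0 = rbf_kernel(X, X, gamma0)
    K1 = rbf_kernel(X, X, gamma1)          # basis kernel for q(x)
    best_b, best_J = None, -np.inf
    for _ in range(n_candidates):
        b = rng.normal(size=X.shape[0] + 1)
        Q = b[0] + K1 @ b[1:]
        J = fisher_scalar(data_dependent_kernel(K0, Q), y)
        if J > best_J:
            best_J, best_b = J, b
    Q = best_b[0] + K1 @ best_b[1:]
    return data_dependent_kernel(K0, Q)

def kernel_lpp_projection(K, y, n_components=2, t=1.0):
    """Stage 2 (stand-in): supervised locality-preserving projection solved
    in the optimized kernel space via K L K a = lambda K D K a."""
    n = K.shape[0]
    # Heat-kernel affinities from feature-space distances, zeroed across classes.
    d2 = K.diagonal()[:, None] + K.diagonal()[None, :] - 2 * K
    W = np.exp(-d2 / t)
    W[y[:, None] != y[None, :]] = 0.0
    D = np.diag(W.sum(1))
    L = D - W
    A = K @ L @ K + 1e-6 * np.eye(n)
    B = K @ D @ K + 1e-6 * np.eye(n)
    vals, vecs = eigh(A, B)                # ascending generalized eigenvalues
    return vecs[:, :n_components]          # smallest ones preserve locality

# Usage (hypothetical): alphas = kernel_lpp_projection(optimize_kernel(X, y), y)
# New points would be projected as K_new @ alphas after applying the same kernel map.
```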

Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence