Article ID: 564322
Journal: Signal Processing
Published Year: 2010
Pages: 8
File Type: PDF
Abstract

Kernel discriminant analysis (KDA) is an effective statistical method for dimensionality reduction and feature extraction. However, traditional KDA methods suffer from the small sample size problem, and they rely on the Fisher criterion, which is not optimal with respect to classification rate. This paper presents a variant of KDA that addresses both shortcomings in an efficient and cost-effective manner. The key to the approach is to use a simultaneous diagonalization technique for optimization while employing a modified Fisher criterion that is more closely related to classification error. Extensive experiments on a face recognition task show that the proposed method is an effective nonlinear feature extractor.
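For orientation, the sketch below shows a generic multi-class kernel Fisher discriminant of the kind the paper builds on: it forms between-class and within-class scatter matrices in the kernel-induced feature space and solves a generalized eigenproblem. It is not the paper's specific variant (no simultaneous diagonalization or modified criterion); the RBF kernel, the regularization constant, and all function names are illustrative assumptions.

    # Minimal sketch of a generic multi-class kernel Fisher discriminant (KDA).
    # Not the paper's method; kernel choice, reg constant, and names are illustrative.
    import numpy as np
    from scipy.linalg import eigh

    def rbf_kernel(A, B, gamma=0.1):
        # Gaussian RBF kernel matrix between rows of A and rows of B.
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
        return np.exp(-gamma * d2)

    def kda_fit(X, y, gamma=0.1, reg=1e-6):
        # Returns projection coefficients alpha (n_samples x n_components).
        n = X.shape[0]
        classes = np.unique(y)
        K = rbf_kernel(X, X, gamma)            # n x n kernel matrix
        m_star = K.mean(axis=1)                # overall kernel mean
        M = np.zeros((n, n))                   # between-class scatter (kernelized)
        N = np.zeros((n, n))                   # within-class scatter (kernelized)
        for c in classes:
            idx = np.where(y == c)[0]
            K_c = K[:, idx]                    # columns belonging to class c
            m_c = K_c.mean(axis=1)
            diff = m_c - m_star
            M += len(idx) * np.outer(diff, diff)
            centering = np.eye(len(idx)) - np.full((len(idx), len(idx)), 1.0 / len(idx))
            N += K_c @ centering @ K_c.T
        # Regularizing N is one common way to cope with the small sample size
        # problem; then solve the generalized eigenproblem M a = lambda (N + reg*I) a.
        vals, vecs = eigh(M, N + reg * np.eye(n))
        order = np.argsort(vals)[::-1][: len(classes) - 1]
        return vecs[:, order]

    def kda_transform(X_train, X_new, alpha, gamma=0.1):
        # Project new samples onto the learned nonlinear discriminant directions.
        return rbf_kernel(X_new, X_train, gamma) @ alpha

The proposed method differs from this baseline chiefly in how the eigenproblem is handled (simultaneous diagonalization) and in the criterion being optimized, which the paper modifies to track classification error more closely.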

Related Topics
Physical Sciences and Engineering › Computer Science › Signal Processing
Authors