Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
531568 | Pattern Recognition | 2008 | 15 | |
Linear dimensionality reduction (LDR) techniques are important in pattern recognition because of their linear time complexity and simplicity. In this paper, we present a novel LDR technique that, though linear, aims to maximize the Chernoff distance in the transformed space, thereby increasing class separability in that space. We present the corresponding criterion, which is maximized via a gradient-based algorithm, and provide convergence and initialization proofs. We performed a comprehensive performance analysis of our method combined with two well-known classifiers, linear and quadratic, on synthetic and real-life data, and compared it with other LDR techniques. The results on synthetic and standard real-life data sets show that the proposed criterion outperforms the other LDR techniques when combined with both linear and quadratic classifiers.
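To illustrate the general idea behind the abstract (not the paper's exact criterion, gradient update, or initialization), the sketch below projects two Gaussian classes onto a lower-dimensional subspace and maximizes the Chernoff distance of the projected class-conditional Gaussians. It assumes a two-class problem, a fixed Chernoff parameter beta = 0.5, and uses SciPy's BFGS with numerical gradients in place of the authors' gradient-based algorithm; names such as `chernoff_ldr` are hypothetical.

```python
# Illustrative sketch only: two-class Chernoff-based linear dimensionality reduction.
# Assumptions (not from the paper): beta fixed at 0.5, BFGS with numerical gradients,
# and a random initialization followed by QR orthonormalization.
import numpy as np
from scipy.optimize import minimize

def chernoff_distance(m1, S1, m2, S2, beta=0.5):
    """Chernoff distance between N(m1, S1) and N(m2, S2) for a fixed beta."""
    Sb = beta * S1 + (1.0 - beta) * S2
    diff = m1 - m2
    quad = 0.5 * beta * (1.0 - beta) * diff @ np.linalg.solve(Sb, diff)
    _, logdet_Sb = np.linalg.slogdet(Sb)
    _, logdet_S1 = np.linalg.slogdet(S1)
    _, logdet_S2 = np.linalg.slogdet(S2)
    return quad + 0.5 * (logdet_Sb - beta * logdet_S1 - (1.0 - beta) * logdet_S2)

def projected_chernoff(A_flat, d, p, m1, S1, m2, S2, beta=0.5):
    """Chernoff distance after projecting both Gaussians with a d x p matrix A."""
    A = A_flat.reshape(d, p)
    return chernoff_distance(A.T @ m1, A.T @ S1 @ A, A.T @ m2, A.T @ S2 @ A, beta)

def chernoff_ldr(m1, S1, m2, S2, p, beta=0.5, seed=0):
    """Find a d x p projection that maximizes the projected Chernoff distance."""
    d = m1.size
    A0 = np.random.default_rng(seed).standard_normal((d, p))
    # Minimize the negative distance; BFGS approximates the gradient numerically.
    res = minimize(lambda a: -projected_chernoff(a, d, p, m1, S1, m2, S2, beta),
                   A0.ravel(), method="BFGS")
    A, _ = np.linalg.qr(res.x.reshape(d, p))  # orthonormalize the columns
    return A

if __name__ == "__main__":
    # Two synthetic 5-dimensional Gaussian classes reduced to a 2-dimensional subspace.
    rng = np.random.default_rng(1)
    m1, m2 = np.zeros(5), np.array([1.0, 0.5, 0.0, 0.0, 0.0])
    B1, B2 = rng.standard_normal((5, 5)), rng.standard_normal((5, 5))
    S1, S2 = B1 @ B1.T + np.eye(5), B2 @ B2.T + np.eye(5)
    A = chernoff_ldr(m1, S1, m2, S2, p=2)
    print("Projected Chernoff distance:",
          chernoff_distance(A.T @ m1, A.T @ S1 @ A, A.T @ m2, A.T @ S2 @ A))
```

Note that the projected Chernoff distance is invariant to an invertible change of basis within the subspace, so the optimization effectively searches over subspaces; after optimization, a classifier (linear or quadratic) would be trained on the data projected with the returned matrix.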