Article ID: 534983
Journal: Pattern Recognition Letters
Published Year: 2008
Pages: 7
File Type: PDF
Abstract

We present a theoretical analysis comparing two linear dimensionality reduction (LDR) techniques for the two-class case: a homoscedastic scheme, Fisher's discriminant (FD), and a heteroscedastic scheme, the Loog–Duin (LD) criterion. We formalize the necessary and sufficient conditions under which the FD and LD criteria are maximized by the same linear transformation. To derive these conditions, we first show that the two criteria preserve the same maximum values after a diagonalization process is applied. We then derive the necessary and sufficient conditions for several cases, including coincident covariance matrices, coincident prior probabilities, and the case in which one of the covariance matrices is the identity. Finally, we show empirically that these conditions are statistically related to the classification error of a post-processing one-dimensional quadratic classifier and to the Chernoff distance in the transformed space.
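To make the setting concrete, here is a minimal sketch of the homoscedastic side of the comparison: the classical two-class Fisher's discriminant, whose criterion is maximized in closed form by w ∝ S_W⁻¹(m₁ − m₂). The synthetic Gaussian data and parameter values are illustrative assumptions, not taken from the paper; the LD criterion, which additionally accounts for differing class covariances via a Chernoff-like term, is not reproduced here.

```python
import numpy as np

def fisher_direction(X1, X2):
    """One-dimensional FD transform for two classes.

    Maximizes the Fisher criterion (w^T S_B w) / (w^T S_W w);
    the closed-form maximizer is w proportional to S_W^{-1} (m1 - m2).
    """
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Within-class scatter: sum of the per-class covariance matrices.
    S_W = np.cov(X1, rowvar=False) + np.cov(X2, rowvar=False)
    w = np.linalg.solve(S_W, m1 - m2)
    return w / np.linalg.norm(w)

# Hypothetical heteroscedastic Gaussian classes for illustration.
rng = np.random.default_rng(0)
X1 = rng.normal(loc=[0.0, 0.0], scale=[1.0, 0.5], size=(400, 2))
X2 = rng.normal(loc=[2.0, 1.0], scale=[0.5, 1.0], size=(400, 2))

w = fisher_direction(X1, X2)
# Projecting onto w separates the two class means along one dimension.
z1, z2 = X1 @ w, X2 @ w
separation = abs(z1.mean() - z2.mean())
```

After this projection, a one-dimensional quadratic classifier can be applied in the transformed space, which is the post-processing step whose error the abstract relates to the derived conditions.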

Related Topics
Physical Sciences and Engineering › Computer Science › Computer Vision and Pattern Recognition