Article ID: 411964
Journal: Neurocomputing
Published Year: 2015
Pages: 9
File Type: PDF
Abstract

Compared to linear discriminant analysis (LDA), its orthogonalized version is a more effective statistical learning tool for dimension reduction, as it better separates data points from different classes in the lower-dimensional subspace. However, existing orthogonalized LDA techniques suffer from various drawbacks, including expensive computing time. This paper develops an efficient orthogonal dimension reduction approach, referred to as fast orthogonal linear discriminant analysis (FOLDA), which builds on existing orthogonal linear discriminant analysis (OLDA) algorithms. Unlike previous efforts, the new approach applies QR decomposition and regression to solve for a new orthogonal projection vector at each iteration, leading to a far cheaper computational cost. FOLDA achieves recognition rates comparable to existing OLDA algorithms because it incorporates the ideas underlying those methods. Experimental results on image databases, including MNIST, COIL20, MPEG-7 and OUTEX, demonstrate the effectiveness and efficiency of our algorithm.
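To illustrate the general flavor of combining regression with a QR step for orthogonal discriminant directions, the sketch below is a minimal, hypothetical example and not the paper's FOLDA algorithm: it regresses a centered class-indicator matrix on the centered data and then orthonormalizes the resulting directions with a QR decomposition. The function name orthogonal_lda_sketch and the synthetic data are assumptions for illustration only.

import numpy as np

def orthogonal_lda_sketch(X, y):
    """Illustrative sketch (not the paper's FOLDA): regression + QR.
    X: (n_samples, n_features) data, y: integer class labels.
    Returns an orthonormal projection matrix W of shape (n_features, c - 1)."""
    classes = np.unique(y)
    Xc = X - X.mean(axis=0)                       # center the data
    # centered class-indicator target matrix, one column per class
    Y = (y[:, None] == classes[None, :]).astype(float)
    Y -= Y.mean(axis=0)
    # regression step: least-squares solution B of Xc @ B = Y
    B, *_ = np.linalg.lstsq(Xc, Y, rcond=None)
    # QR step: orthonormalize the regression directions
    Q, _ = np.linalg.qr(B)
    return Q[:, :len(classes) - 1]                # at most c - 1 discriminants

# usage: project synthetic 3-class data onto the orthogonal discriminant subspace
rng = np.random.default_rng(0)
X = rng.normal(size=(90, 20)) + np.repeat(np.eye(3, 20) * 4.0, 30, axis=0)
y = np.repeat(np.arange(3), 30)
W = orthogonal_lda_sketch(X, y)
Z = X @ W                                         # reduced representation
print(W.shape, Z.shape)                           # (20, 2) (90, 2)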

Related Topics
Physical Sciences and Engineering › Computer Science › Artificial Intelligence
Authors