Article ID: 534801
Journal: Pattern Recognition Letters
Published Year: 2011
Pages: 7
File Type: PDF
Abstract

There is great interest in dimensionality-reduction techniques for tackling the problem of high-dimensional pattern classification. This paper addresses supervised learning of a linear dimensionality-reduction mapping suited to classification problems. The proposed optimization procedure minimizes an estimate of the nearest-neighbor classifier's error probability, learning both a linear projection and a small set of prototypes that support the class boundaries. The learned classifier is computationally very efficient, classifying much faster than state-of-the-art classifiers such as SVMs while achieving competitive recognition accuracy. The approach has been assessed through a series of experiments, showing uniformly good behavior and results competitive with recently proposed supervised dimensionality-reduction techniques.
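The decision rule described above, classifying by the nearest prototype after a learned linear projection, can be sketched as follows. This is an illustrative reconstruction, not the paper's code; the names `B`, `prototypes`, and `classify` are assumptions, and the projection matrix and prototypes are taken as already learned offline.

```python
import numpy as np

def classify(x, B, prototypes, labels):
    """Assign x the label of its nearest prototype in the projected subspace.

    B          : (D, d) learned linear projection matrix, d << D.
    prototypes : (M, D) small set of learned class-boundary prototypes.
    labels     : (M,)   class label of each prototype.
    Names and shapes are illustrative, not the paper's notation.
    """
    z = B.T @ x                          # project the input to d dimensions
    P = prototypes @ B                   # project prototypes (cacheable offline)
    dists = np.linalg.norm(P - z, axis=1)
    return labels[np.argmin(dists)]
```

Because only M projected prototypes are compared (rather than, say, all support vectors or all training samples), classification cost is O(Dd + Md), which is what makes the learned classifier fast at test time.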

Research highlights
► A more elegant formulation of the LDPP algorithm is introduced, leading to a more efficient and easily parallelizable implementation.
► LDPP is modified to ensure that the resulting projection matrix is orthonormal; experimental results confirm that this benefits recognition performance.
► The LDPP approach behaves considerably well across a wide range of problems, achieving very competitive results for supervised dimensionality reduction, comparable to state-of-the-art techniques.
► On high-dimensional problems, unlike other techniques, LDPP obtains competitive recognition performance when applied to the original feature space, without resorting to PCA preprocessing.
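The second highlight constrains the projection matrix to be orthonormal. One common way to maintain such a constraint during iterative optimization (an assumption here, not necessarily the paper's exact procedure) is to re-orthonormalize the matrix after each gradient step, e.g. via a QR decomposition:

```python
import numpy as np

def orthonormalize(B):
    """Return an orthonormal-column matrix spanning the same subspace as B.

    Illustrative sketch: after a gradient update, B generally loses
    orthonormality; projecting it back via QR restores B.T @ B = I.
    """
    Q, R = np.linalg.qr(B)
    # flip column signs so each column keeps the direction of the original B
    return Q * np.sign(np.diag(R))
```

Orthonormal columns guarantee that distances in the projected space are not distorted by an arbitrary rescaling of the projection directions, which is a plausible reason the constraint helps nearest-prototype recognition.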

Related Topics
Physical Sciences and Engineering Computer Science Computer Vision and Pattern Recognition