Article ID: 530269
Journal: Pattern Recognition
Published Year: 2015
Pages: 13
File Type: PDF
Abstract

• We propose the Iterative Nearest Neighbors (INN) representation and a refined variant.
• INN performs on par with or better than MP and OMP on the sparse signal recovery task.
• We derive INN-based dimensionality reduction and classification methods.
• We use face (AR), traffic sign (GTSRB), scene (Scene-15), and object class (PASCAL VOC) benchmarks.
• INN performs on par with SR while its running times are closer to those of NN, for low-dimensional data.

Representing data as a linear combination of a set of selected known samples is of interest for various machine learning applications such as dimensionality reduction and classification. k-Nearest Neighbors (kNN) and its variants are still among the best-known and most often used techniques. Some popular richer representations are Sparse Representation (SR), based on solving an l1-regularized least squares formulation; Collaborative Representation (CR), based on l2-regularized least squares; and Locally Linear Embedding (LLE), based on an l1-constrained least squares problem. We propose a novel sparse representation, the Iterative Nearest Neighbors (INN). It combines the power of SR and LLE with the computational simplicity of kNN. We empirically validate our representation in terms of sparse support signal recovery and compare it with Matching Pursuit (MP) and Orthogonal Matching Pursuit (OMP), two similar iterative methods. We also test our method in terms of dimensionality reduction and classification, using standard benchmarks for faces (AR), traffic signs (GTSRB), and objects (PASCAL VOC 2007). INN compares favorably to NN, MP, and OMP, and performs on par with CR and SR, while being orders of magnitude faster than the latter. On the downside, INN does not scale well to higher-dimensional data.
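As background for the greedy baselines named above, Matching Pursuit builds a sparse representation by repeatedly selecting the dictionary atom most correlated with the current residual and subtracting its contribution. The sketch below is a generic textbook MP, not the authors' INN method; the function name and the toy data are illustrative assumptions.

```python
import numpy as np

def matching_pursuit(x, D, n_iter=10):
    """Greedy sparse coding: at each step, pick the dictionary atom
    (a column of D, assumed unit-norm) most correlated with the
    residual, and subtract its contribution."""
    residual = x.astype(float).copy()
    coeffs = np.zeros(D.shape[1])
    for _ in range(n_iter):
        corr = D.T @ residual          # correlations with all atoms
        k = int(np.argmax(np.abs(corr)))  # best-matching atom
        coeffs[k] += corr[k]           # accumulate its coefficient
        residual -= corr[k] * D[:, k]  # remove its contribution
    return coeffs, residual

# Toy sparse-recovery check: a signal built from two atoms of a
# random unit-norm dictionary (hypothetical sizes, for illustration).
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 256))
D /= np.linalg.norm(D, axis=0)         # normalize atoms to unit norm
x = 2.0 * D[:, 3] - 1.5 * D[:, 100]
coeffs, residual = matching_pursuit(x, D, n_iter=30)
```

OMP differs in that, after each atom selection, it re-solves a least squares problem over all atoms chosen so far, which typically yields faster convergence at higher per-iteration cost.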

Related Topics
Physical Sciences and Engineering Computer Science Computer Vision and Pattern Recognition
Authors