Article ID: 848990
Journal: Optik - International Journal for Light and Electron Optics
Published Year: 2014
Pages: 7
File Type: PDF
Abstract

An improvement to the nearest neighbor classifier (INNC) has shown excellent performance on some classification tasks. However, it is not clearly understood why INNC obtains good performance or what its underlying classification mechanism is. Moreover, INNC cannot classify low-dimensional data well, nor certain high-dimensional data in which sample vectors belong to different class distributions but share the same vector direction. To solve these problems, this paper proposes a novel classification method, named the kernel representation-based nearest neighbor classifier (KRNNC), which not only remedies the drawback of INNC on low-dimensional data but also obtains competitive classification results on high-dimensional data. We reveal the underlying classification mechanism of KRNNC in detail, which can also be regarded as a theoretical supplement to INNC. We first implicitly map all samples into a kernel feature space using a nonlinear mapping associated with a kernel function. Then, we represent a test sample as a linear combination of all training samples and use the representation ability of each class to perform classification. From the way it classifies test samples, KRNNC can be regarded as a nonlinear extension of INNC. Extensive experimental studies on benchmark datasets and face image databases show the effectiveness of KRNNC.
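For concreteness, below is a minimal sketch of this kind of kernel representation-based classification, not the authors' reference implementation. It assumes an RBF kernel, a ridge-regularized least-squares representation of the test sample over all training samples, and classification by the smallest class-wise reconstruction residual in the kernel feature space; the function names and the parameters gamma and lam are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel values k(a, b) = exp(-gamma * ||a - b||^2).
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def krnnc_predict(X_train, y_train, X_test, gamma=1.0, lam=1e-3):
    """Represent phi(test) as a linear combination of all phi(train) and
    assign the class whose training samples give the smallest feature-space
    reconstruction residual, computed entirely through kernel values."""
    y_train = np.asarray(y_train)
    K = rbf_kernel(X_train, X_train, gamma)        # Gram matrix of training set
    inv = np.linalg.inv(K + lam * np.eye(len(K)))  # ridge-regularized solver
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        k_y = rbf_kernel(X_train, x[None, :], gamma).ravel()  # k(x_i, y)
        alpha = inv @ k_y                       # representation coefficients
        k_yy = 1.0                              # RBF kernel: k(y, y) = exp(0)
        best, best_r = None, np.inf
        for c in classes:
            a_c = np.where(y_train == c, alpha, 0.0)  # keep class-c coefficients
            # ||phi(y) - sum_i a_c[i] phi(x_i)||^2 expanded via the kernel trick
            r = k_yy - 2.0 * a_c @ k_y + a_c @ K @ a_c
            if r < best_r:
                best, best_r = c, r
        preds.append(best)
    return np.array(preds)
```

Because both the coefficients and the residuals are expressed through kernel evaluations, the nonlinear mapping never needs to be computed explicitly, which is what lets this scheme extend the linear representation-based classification of INNC to a kernel feature space.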

Related Topics
Physical Sciences and Engineering > Engineering > Engineering (General)
Authors
, , , , ,