Article ID: 535003
Journal: Pattern Recognition Letters
Published Year: 2016
Pages: 7 Pages
File Type: PDF
Abstract

• Our method finds neighbors without any parameter.
• Our method can handle manifold clusters and noise.
• The neighbor count of each data point is dynamic.
• Most clusters and outliers are easy to identify by their natural neighbors.
• The Natural Neighbor Eigenvalue is a better choice of k in traditional KNN.

K-nearest neighbor (KNN) and reverse k-nearest neighbor (RkNN) form the basis of many well-established, high-performance pattern-recognition techniques, but both are sensitive to the choice of their parameter k. Essentially, the challenge is to detect the neighborhood structure of diverse data sets without prior knowledge of the data's characteristics. In this paper, a novel nearest-neighbor concept, named the natural neighbor (NaN), is proposed. In contrast to KNN and RkNN, it is a scale-free neighbor and better reflects the characteristics of the data. This article discusses the theoretical model of the natural neighbor and its applications in different fields, and we demonstrate the improvements obtained with the proposed neighborhood on both synthetic and real-world data sets.
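To make the parameter-free idea concrete, here is a minimal sketch of one plausible reading of the natural-neighbor search described above: the neighborhood size k is grown round by round, each round letting every point claim its k-th nearest neighbor, and the search stops once every point has been claimed by at least one other point (i.e. has a reverse neighbor). The function name, the brute-force distance computation, and the simple noise-free stopping rule are assumptions for illustration, not the authors' exact algorithm; the k reached at the stopping point plays the role of the Natural Neighbor Eigenvalue mentioned in the highlights.

```python
import numpy as np

def natural_neighbor_search(X):
    """Sketch of a parameter-free neighbor search (hypothetical helper).

    Grows k until every point is contained in some other point's
    k-nearest-neighbor set, then returns that k and the per-point
    reverse-neighbor counts.
    """
    n = len(X)
    # Brute-force pairwise Euclidean distances.
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Neighbor indices sorted by distance; column 0 (the point itself) is dropped.
    order = np.argsort(dist, axis=1)[:, 1:]
    reverse_count = np.zeros(n, dtype=int)  # times each point is claimed
    for k in range(1, n):
        # Every point claims its k-th nearest neighbor this round.
        np.add.at(reverse_count, order[:, k - 1], 1)
        if np.all(reverse_count > 0):
            # Stable state: every point is somebody's neighbor.
            return k, reverse_count
    return n - 1, reverse_count

# Tiny example: two compact groups of points.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [10., 10.], [10., 11.]])
lam, counts = natural_neighbor_search(X)  # lam could then seed k for KNN
```

Points with a reverse-neighbor count of zero at termination (impossible under this noise-free stopping rule, but possible under relaxed variants that stop when the count of unclaimed points stabilizes) would be natural outlier candidates, consistent with the highlight that outliers are easy to identify by their natural neighbors.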

Graphical abstract (figure)

Related Topics
Physical Sciences and Engineering › Computer Science › Computer Vision and Pattern Recognition