Article ID: 411469
Journal: Neurocomputing
Published Year: 2016
Pages: 6
File Type: PDF
Abstract

k nearest neighbors (kNN) is an efficient lazy learning algorithm that has been applied successfully in many real-world applications. It is natural to scale the kNN method to large-scale datasets. In this paper, we propose to first run k-means clustering to separate the whole dataset into several parts and then to perform kNN classification within each part. We conduct sets of experiments on big data and on medical imaging data. The experimental results show that the proposed kNN classification works well in terms of both accuracy and efficiency.
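A minimal sketch of the idea described in the abstract, not the authors' implementation: k-means first partitions the training data into clusters, and each query point is then classified by kNN using only the points in the cluster whose centroid is nearest to it. The dataset, the parameter values (number of clusters, k), and the centroid-routing rule are assumptions made for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Synthetic data stands in for the paper's datasets (an assumption).
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

n_clusters, k = 10, 5  # illustrative values, not taken from the paper

# Step 1: k-means separates the training set into several parts.
kmeans = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X_train)

# Step 2: fit one kNN classifier per cluster on that cluster's points only.
knn_per_cluster = {}
for c in range(n_clusters):
    mask = kmeans.labels_ == c
    n_neighbors = min(k, int(mask.sum()))  # guard against clusters smaller than k
    knn_per_cluster[c] = KNeighborsClassifier(n_neighbors=n_neighbors).fit(
        X_train[mask], y_train[mask]
    )

# Route each test point to its nearest centroid, then classify inside that cluster.
test_clusters = kmeans.predict(X_test)
y_pred = np.array(
    [knn_per_cluster[c].predict(x.reshape(1, -1))[0]
     for c, x in zip(test_clusters, X_test)]
)
print("accuracy:", (y_pred == y_test).mean())
```

The efficiency gain comes from each kNN search scanning only one cluster instead of the whole training set; accuracy depends on how well the k-means partition keeps true neighbors in the same cluster.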

Related Topics
Physical Sciences and Engineering · Computer Science · Artificial Intelligence