Though the k-nearest neighbor (k-NN) pattern classifier is an effective learning algorithm, it can result in large model sizes. To compensate, a number of variant algorithms have been developed that condense the model size of the k-NN classifier at the expense of accuracy. To increase the accuracy of these condensed models, we present a direct boosting algorithm for the k-NN classifier that creates an ensemble of models with locally modified distance weighting. An empirical study conducted on 10 standard databases from the UCI repository shows that this new Boosted k-NN algorithm increases generalization accuracy on the majority of the datasets and never performs worse than standard k-NN.
► We incorporate boosting directly into the kNN algorithm.
► This provides a mechanism for naturally building ensembles of NN-based learners.
► The approach is shown to improve on traditional kNN in many instances (and to never be worse).
► The approach naturally makes kNN more robust to class imbalance.
► The approach can be practically scaled without adversely affecting performance.
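To make the idea in the abstract concrete, the following is a minimal Python sketch of one plausible reading of "boosting incorporated directly into kNN": an AdaBoost-style ensemble in which per-instance boosting weights scale each neighbor's distance-weighted vote, so the effective vote is modified locally around hard-to-classify points. This is an illustrative reconstruction, not the paper's algorithm; the class name `BoostedKNN` and the parameters `k` and `n_rounds` are assumptions made here for the example.

```python
# Hedged sketch: AdaBoost-style ensemble of distance-weighted k-NN voters.
# Per-instance boosting weights w scale each training point's vote, loosely
# mimicking "locally modified distance weighting". Not the authors' method.
import numpy as np

class BoostedKNN:
    def __init__(self, k=3, n_rounds=10):
        self.k = k
        self.n_rounds = n_rounds

    def fit(self, X, y):
        self.X, self.y = np.asarray(X, float), np.asarray(y)
        n = len(self.y)
        w = np.full(n, 1.0 / n)      # per-instance boosting weights
        self.vote_weights = []       # snapshot of w for each round
        self.alphas = []             # per-round classifier weights
        for _ in range(self.n_rounds):
            # Evaluate the current weighted voter on the training set,
            # excluding each point from its own neighborhood.
            pred = self._predict_with(w, self.X, exclude_self=True)
            err = float(np.sum(w * (pred != self.y)))
            err = min(max(err, 1e-10), 1 - 1e-10)
            alpha = 0.5 * np.log((1 - err) / err)
            self.vote_weights.append(w.copy())
            self.alphas.append(alpha)
            # Standard AdaBoost re-weighting: emphasize misclassified points.
            w = w * np.exp(alpha * (pred != self.y))
            w /= w.sum()
        return self

    def _predict_with(self, w, Xq, exclude_self=False):
        # Distance-weighted k-NN vote; each neighbor's vote is scaled by
        # its boosting weight w (the "local" modification in this sketch).
        d = np.linalg.norm(Xq[:, None, :] - self.X[None, :, :], axis=2)
        if exclude_self:
            np.fill_diagonal(d, np.inf)
        idx = np.argsort(d, axis=1)[:, : self.k]
        classes = np.unique(self.y)
        out = np.empty(len(Xq), dtype=self.y.dtype)
        for i, nbrs in enumerate(idx):
            votes = w[nbrs] / (d[i, nbrs] + 1e-10)
            scores = [votes[self.y[nbrs] == c].sum() for c in classes]
            out[i] = classes[int(np.argmax(scores))]
        return out

    def predict(self, Xq):
        # Weighted majority vote across all boosting rounds.
        Xq = np.asarray(Xq, float)
        classes = np.unique(self.y)
        agg = np.zeros((len(Xq), len(classes)))
        for alpha, w in zip(self.alphas, self.vote_weights):
            pred = self._predict_with(w, Xq)
            for j, c in enumerate(classes):
                agg[:, j] += alpha * (pred == c)
        return classes[np.argmax(agg, axis=1)]
```

Usage would look like `BoostedKNN(k=3, n_rounds=5).fit(X_train, y_train).predict(X_test)`. Note that this sketch uses a brute-force pairwise distance matrix; the scalability claim in the highlights would, in practice, call for a spatial index or other approximate nearest-neighbor structure instead.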
Journal: Pattern Recognition Letters - Volume 33, Issue 1, 1 January 2012, Pages 92–102