Article code: 535851
Journal code: 870392
Publication year: 2012
English article: 7-page PDF
Full-text version: free download
English Title of the ISI Article
New rank methods for reducing the size of the training set using the nearest neighbor rule
Related Topics
Engineering and Basic Sciences; Computer Engineering; Computer Vision and Pattern Recognition
English Abstract

This paper proposes new rank methods for selecting the best prototypes from a training set, so that the set's size can be fixed by an external parameter while the classification accuracy is maintained. Traditional methods that filter the training set in a classification task, such as editing and condensing, apply rules to the set in order to remove outliers or to keep the prototypes that help in classification. In our approach, new voting methods are proposed to compute, for each prototype, the probability that it helps to classify a new sample correctly. This probability is the key to sorting the training set, so a relevance factor between 0 and 1 is used to select, for each class, the best candidates whose accumulated probabilities remain below that parameter. This approach makes it possible to select the number of prototypes necessary to maintain, or even increase, the classification accuracy. The results obtained on several high-dimensional databases show that these methods maintain the final error rate while reducing the size of the training set.
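
The following is a minimal, hypothetical Python sketch of the kind of rank-based selection the abstract describes. The prototype probabilities are estimated here with a simple leave-one-out 1-NN voting rule, which is an assumption for illustration rather than the authors' exact method; prototypes are then sorted by probability within each class and kept until their accumulated probability reaches the relevance factor r.

```python
import numpy as np

def rank_select(X, y, r=0.5):
    """Select prototype indices for a relevance factor r in [0, 1] (illustrative sketch)."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    n = len(X)
    votes = np.zeros(n)
    # Leave-one-out voting (assumed rule): each sample gives one vote to its
    # nearest neighbor when that neighbor belongs to the same class.
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf
        j = int(np.argmin(d))
        if y[j] == y[i]:
            votes[j] += 1.0
    selected = []
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        p = votes[idx]
        if p.sum() == 0:
            selected.append(idx[0])       # keep at least one prototype per class
            continue
        p = p / p.sum()                   # per-class prototype probabilities
        acc = 0.0
        for k in np.argsort(-p):          # best-ranked candidates first
            selected.append(idx[k])
            acc += p[k]
            if acc >= r:                  # stop once the accumulated probability reaches r
                break
    return np.array(sorted(selected))
```

For example, keep = rank_select(X_train, y_train, r=0.3) yields a reduced set X_train[keep], y_train[keep] that would then replace the full training set in a nearest-neighbor classifier.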


► New rank methods to select the best prototypes from a training set are proposed.
► They compute the prototype probability and help to classify a new sample correctly.
► A relevance factor from 0 to 1 is used to select the best candidates.
► The results show that the final error rate is maintained while the size of the training set is reduced.

Publisher
Database: Elsevier - ScienceDirect (ساینس دایرکت)
Journal: Pattern Recognition Letters - Volume 33, Issue 5, 1 April 2012, Pages 654–660
Authors