Article code: 404690
Journal code: 677442
Publication year: 2008
Length: 8 pages
Full-text version: PDF
English title of the ISI article
Classifier performance estimation under the constraint of a finite sample size: Resampling schemes applied to neural network classifiers
Related subjects
Engineering and Basic Sciences · Computer Engineering · Artificial Intelligence
English abstract

In a practical classifier design problem the sample size is limited, and the available finite sample needs to be used both to design a classifier and to predict the classifier’s performance for the true population. Since a larger sample is more representative of the population, it is advantageous to design the classifier with all the available cases, and to use a resampling technique for performance prediction. We conducted a Monte Carlo simulation study to compare the ability of different resampling techniques in predicting the performance of a neural network (NN) classifier designed with the available sample. We used the area under the receiver operating characteristic curve as the performance index for the NN classifier. We investigated resampling techniques based on the cross-validation, the leave-one-out method, and three different types of bootstrapping, namely, the ordinary, .632, and .632+ bootstrap. Our results indicated that, under the study conditions, there can be a large difference in the accuracy of the prediction obtained from different resampling methods, especially when the feature space dimensionality is relatively large and the sample size is small. Although this investigation is performed under some specific conditions, it reveals important trends for the problem of classifier performance prediction under the constraint of a limited data set.
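The sketch below illustrates the kind of resampling estimate the abstract compares: a .632 bootstrap estimate of the AUC of a small neural network classifier trained on a limited sample. It is a minimal illustration only; the simulated Gaussian data, the MLPClassifier settings, and the number of bootstrap replicates are assumptions for demonstration and do not reproduce the paper's Monte Carlo design.

```python
# Illustrative .632 bootstrap AUC estimate for a small NN classifier.
# Data, network size, and replicate count are assumed, not the paper's setup.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Simulated two-class Gaussian data (a deliberately small, finite sample).
n, d = 60, 5
X = np.vstack([rng.normal(0.0, 1.0, (n // 2, d)),
               rng.normal(0.8, 1.0, (n // 2, d))])
y = np.repeat([0, 1], n // 2)

def fit_auc(X_tr, y_tr, X_te, y_te):
    """Train an MLP on the training split and return its AUC on the test split."""
    clf = MLPClassifier(hidden_layer_sizes=(5,), max_iter=2000, random_state=0)
    clf.fit(X_tr, y_tr)
    return roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])

# Apparent (resubstitution) AUC: design and test on the full available sample.
auc_app = fit_auc(X, y, X, y)

# Ordinary bootstrap: average AUC on the cases left out of each bootstrap sample.
B, oob_aucs = 100, []
for _ in range(B):
    idx = rng.integers(0, n, n)              # draw n cases with replacement
    oob = np.setdiff1d(np.arange(n), idx)    # out-of-bag (left-out) cases
    if len(np.unique(y[idx])) < 2 or len(np.unique(y[oob])) < 2:
        continue                             # skip replicates missing a class
    oob_aucs.append(fit_auc(X[idx], y[idx], X[oob], y[oob]))
auc_oob = float(np.mean(oob_aucs))

# .632 bootstrap: weighted combination of the apparent and out-of-bag estimates.
auc_632 = 0.368 * auc_app + 0.632 * auc_oob
print(f"apparent AUC={auc_app:.3f}, out-of-bag AUC={auc_oob:.3f}, .632 AUC={auc_632:.3f}")
```

The .632+ variant studied in the paper additionally adjusts the weights for overfitting; the fixed 0.368/0.632 weighting above is the plain .632 estimator.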

Publisher
Database: Elsevier - ScienceDirect
Journal: Neural Networks - Volume 21, Issues 2–3, March–April 2008, Pages 476–483
Authors