Article code | Journal code | Publication year | English article | Full-text version
404900 | 677462 | 2006 | 11-page PDF | Free download
English title of the ISI article
Trainable fusion rules. I. Large sample size case
Related subjects
Engineering and Basic Sciences, Computer Engineering, Artificial Intelligence
Preview of the first page of the article
English abstract

A wide selection of standard statistical pattern classification algorithms can be applied as trainable fusion rules when designing neural network ensembles. A focus of the present two-part paper is finite sample effects: the complexity of base classifiers and fusion rules; the type of outputs provided by the experts to the fusion rule; non-linearity of the fusion rule; degradation of the experts and the fusion rule due to the lack of information in the design set; the adaptation of base classifiers to the training set size, etc. In the first part of the paper, we consider arguments for utilizing continuous outputs of base classifiers versus categorical outputs and conclude that if one succeeds in having a small number of expert networks working perfectly in different parts of the input feature space, then crisp outputs may be preferable to continuous outputs. We then contrast fixed fusion rules with trainable ones and demonstrate situations where weighted average fusion can outperform simple average fusion. We present a review of statistical classification rules, paying special attention to those linear and non-linear rules that are employed rarely but, in our opinion, could be useful in neural network ensembles. We consider ideal and sample-based oracle decision rules and illustrate characteristic features of diverse fusion rules on an artificial two-dimensional (2D) example in which the base classifiers perform well in different regions of the input feature space.

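As a rough illustration of the contrast drawn in the abstract between a fixed simple-average fusion rule and a trainable weighted-average one, the sketch below (not taken from the paper; the dataset, the region split between the two "experts", and the use of logistic regression are assumptions made purely for illustration) trains two base classifiers on different parts of a synthetic 2D feature space and compares averaging their continuous outputs with a fusion rule fitted on those outputs.

```python
# Minimal sketch, assuming a synthetic 2D two-class problem (make_moons) and
# logistic-regression experts; this is NOT the paper's experimental setup.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_moons(n_samples=2000, noise=0.25, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0)

# Two base classifiers ("experts"), each trained on a different region of the
# input feature space so that each performs well only locally.
left = X_train[:, 0] < 0.5
experts = [
    LogisticRegression().fit(X_train[left], y_train[left]),
    LogisticRegression().fit(X_train[~left], y_train[~left]),
]

def continuous_outputs(X):
    # Continuous (soft) expert outputs: estimated posterior for class 1.
    return np.column_stack([e.predict_proba(X)[:, 1] for e in experts])

# Fixed fusion rule: simple (unweighted) average of the soft outputs.
simple_avg_pred = continuous_outputs(X_test).mean(axis=1) > 0.5

# Trainable fusion rule: a linear classifier fitted on the experts' soft
# outputs over the design set, i.e. a weighted-average fusion.
fusion = LogisticRegression().fit(continuous_outputs(X_train), y_train)
weighted_pred = fusion.predict(continuous_outputs(X_test))

print("simple average accuracy :", np.mean(simple_avg_pred == y_test))
print("weighted fusion accuracy:", np.mean(weighted_pred == y_test))
```

Because each expert is reliable only in its own region, the trained fusion can down-weight an expert where it is uninformative, which is the kind of situation in which the abstract notes that weighted average fusion can outperform simple averaging.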
Publisher
Database: Elsevier - ScienceDirect
Journal: Neural Networks - Volume 19, Issue 10, December 2006, Pages 1506–1516
Authors