Article code: 4946462 | Journal code: 1439291 | Publication year: 2016 | English article: 30-page PDF | Full-text version: Free download
English title of the ISI article
A regularized root-quartic mixture of experts for complex classification problems
Related topics
Engineering and Basic Sciences; Computer Engineering; Artificial Intelligence
English abstract
Mixture of experts is a neural-network-based ensemble learning approach consisting of several experts and a gating network. In this paper, we introduce the regularized root-quartic mixture of experts (R-RTQRT-ME), which incorporates a regularization term into the error function to control the complexity of the model and to increase robustness against over-fitting and noise. Averaged over 20 classification benchmark datasets, R-RTQRT-ME performs 1.75%, 2.50%, and 2.29% better than multi-objective regularized negative correlation learning, multi-objective negative correlation learning, and multi-objective neural networks, respectively. The average improvement of R-RTQRT-ME is 1.16%, 2.31%, 3.40%, and 3.39% in comparison with the root-quartic mixture of experts, the mixture of negatively correlated experts, the mixture of experts, and negative correlation learning, respectively. Furthermore, the effect of the regularization penalty term in R-RTQRT-ME on noisy data is analyzed, which shows the robustness of R-RTQRT-ME in these situations.
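Since the abstract only describes the approach at a high level, the following is a minimal sketch of a generic mixture-of-experts classifier with a gating network and a regularization penalty added to the error function. The class names, layer sizes, squared-error base loss, and L2 penalty are illustrative assumptions; they do not reproduce the paper's exact root-quartic error or its specific regularization term.

```python
# Hedged sketch: generic mixture of experts with a gating network and an
# L2 penalty added to the training loss (stand-in for the regularization
# term described in the abstract; NOT the paper's exact R-RTQRT-ME loss).
import torch
import torch.nn as nn

class MixtureOfExperts(nn.Module):
    def __init__(self, in_dim, n_classes, n_experts=4, hidden=16):
        super().__init__()
        # Each expert is a small MLP producing class scores.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(in_dim, hidden), nn.Tanh(),
                          nn.Linear(hidden, n_classes))
            for _ in range(n_experts)
        )
        # The gating network assigns an input-dependent weight to each expert.
        self.gate = nn.Linear(in_dim, n_experts)

    def forward(self, x):
        g = torch.softmax(self.gate(x), dim=-1)               # (B, E)
        outs = torch.stack([e(x) for e in self.experts], 1)   # (B, E, C)
        return (g.unsqueeze(-1) * outs).sum(dim=1)            # (B, C)

def regularized_loss(model, x, y_onehot, lam=1e-3):
    """Base error plus an L2 penalty on all weights; the base error here is
    a plain squared error, assumed for illustration only."""
    pred = torch.softmax(model(x), dim=-1)
    base = ((pred - y_onehot) ** 2).sum(dim=-1).mean()
    penalty = sum((p ** 2).sum() for p in model.parameters())
    return base + lam * penalty

# Usage sketch with random data.
if __name__ == "__main__":
    torch.manual_seed(0)
    model = MixtureOfExperts(in_dim=8, n_classes=3)
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    x = torch.randn(64, 8)
    y = torch.nn.functional.one_hot(torch.randint(0, 3, (64,)), 3).float()
    for _ in range(100):
        opt.zero_grad()
        loss = regularized_loss(model, x, y)
        loss.backward()
        opt.step()
```

The penalty coefficient `lam` plays the role the abstract attributes to the regularization term: larger values shrink the expert and gate weights, trading training accuracy for robustness to over-fitting and noise.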
Publisher
Database: Elsevier - ScienceDirect
Journal: Knowledge-Based Systems - Volume 110, 15 October 2016, Pages 98-109
Authors