Article code: 412072
Journal code: 679608
Publication year: 2015
English article: 8 pages (PDF)
Full-text version: free download
English title of the ISI article
An oscillation bound of the generalization performance of extreme learning machine and corresponding analysis
Persian translation of the title
یک محدوده نوسان از عملکرد تعمیم دستگاه یادگیری افراطی و آنالیز متناظر
Keywords
Extreme learning machine; Oscillation bound of the generalization performance; Theoretical research; Infinite hidden nodes
Related subjects
Engineering and Basic Sciences; Computer Engineering; Artificial Intelligence
English abstract

Extreme Learning Machine (ELM), first proposed by Huang et al. in 2004, performs better than traditional learning machines such as BP networks and SVM in some applications. This paper attempts to give an oscillation bound of the generalization performance of ELM and a reason why ELM is not sensitive to the number of hidden nodes, which are essential open problems posed by Huang et al. in 2011. The derivation of the bound is carried out in the framework of statistical learning theory, under the assumption that the expectation of the ELM kernel exists. It turns out that our bound is consistent with previously reported experimental results on ELM and predicts that overfitting can be avoided even when the number of hidden nodes approaches infinity. The prediction is confirmed by our experiments on 15 data sets using one kind of activation function with every parameter independently drawn from the same Gaussian distribution, which satisfies the assumption above. The experiments also show that, when the number of hidden nodes approaches infinity, the ELM kernel with this activation is insensitive to the kernel parameter.
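
To make the setup described in the abstract concrete, the following is a minimal sketch of an ELM of the kind discussed: hidden-node parameters are drawn i.i.d. from a single Gaussian distribution and the output weights are solved by least squares. The sigmoid activation, the function names, and the synthetic data are illustrative assumptions only; the paper's actual activation function, data sets, and experimental protocol are not reproduced here.

```python
import numpy as np

def elm_fit(X, y, n_hidden=200, sigma=1.0, seed=None):
    """Fit a minimal single-hidden-layer ELM by least squares.

    Hidden-node weights and biases are drawn i.i.d. from N(0, sigma^2),
    mirroring the Gaussian sampling mentioned in the abstract; the sigmoid
    activation is an illustrative choice, not necessarily the paper's.
    """
    rng = np.random.default_rng(seed)
    W = rng.normal(0.0, sigma, size=(X.shape[1], n_hidden))  # random input weights
    b = rng.normal(0.0, sigma, size=n_hidden)                 # random biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))                    # hidden-layer output matrix
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)              # output weights via pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Tiny usage example on synthetic data (hypothetical, not from the paper):
# even with many more hidden nodes than samples, training still goes through.
X = np.random.default_rng(0).normal(size=(100, 5))
y = np.sin(X[:, 0]) + 0.1 * np.random.default_rng(1).normal(size=100)
W, b, beta = elm_fit(X, y, n_hidden=500, seed=42)
print("training MSE:", np.mean((elm_predict(X, W, b, beta) - y) ** 2))
```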

Publisher
Database: Elsevier - ScienceDirect
Journal: Neurocomputing - Volume 151, Part 2, 5 March 2015, Pages 883–890
Authors
, , ,