Article code | Journal code | Publication year | English article | Full-text version
382971 | 660799 | 2016 | 7-page PDF | free download
English Title of the ISI Article
Deep learning with adaptive learning rate using Laplacian score
Persian Translation of the Title
یادگیری عمیق با میزان یادگیری انطباقی با استفاده از نمره لاپلاس
Keywords
Related Subjects
Engineering and Basic Sciences > Computer Engineering > Artificial Intelligence
English Abstract


Highlights:
• An adaptive learning rate is proposed for deep learning with MLP.
• The technique for updating the learning rate is free of hyper-parameters.
• The learning rate is a function of a parameter called the learning parameter.
• The learning parameter is updated based on the error gradient.
• The learning rate is further updated based on the Laplacian score of activation values (the score is recalled below).
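For reference, the Laplacian score referred to in the highlights is the standard score of He et al. (2005); the abstract does not give the paper's exact formulation, so the following is the commonly used definition rather than a quotation from the paper. Given the activation values $f_r \in \mathbb{R}^n$ of neuron $r$ over a mini-batch of $n$ samples, a similarity graph $S$ on the samples, degree matrix $D = \mathrm{diag}(S\mathbf{1})$ and graph Laplacian $L = D - S$:

$$\tilde{f}_r = f_r - \frac{f_r^{\top} D \mathbf{1}}{\mathbf{1}^{\top} D \mathbf{1}}\,\mathbf{1}, \qquad L_r = \frac{\tilde{f}_r^{\top} L\, \tilde{f}_r}{\tilde{f}_r^{\top} D\, \tilde{f}_r}.$$

A smaller $L_r$ means the neuron's activations better preserve the local structure of the mini-batch.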

An attempt has been made to improve the performance of deep learning with the Multilayer Perceptron (MLP). Tuning the learning rate, or finding an optimum learning rate, in an MLP is a major challenge: depending on its value, classification accuracy can vary drastically. This paper takes up that challenge and proposes a new approach that combines an adaptive learning rate with the concept of the Laplacian score for varying the weights. The learning rate is taken as a function of a parameter which is itself updated on the basis of the error gradient computed over mini-batches. The Laplacian score of each neuron is further used to update its incoming weights. This removes the bottleneck of finding the optimum value of the learning rate in deep learning with MLP. On benchmark datasets, this approach is observed to increase classification accuracy compared with the existing benchmark levels achieved by well-known deep learning methods.
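The abstract does not spell out the exact update rules, so the following is a minimal, illustrative NumPy sketch of the general idea only: score each hidden neuron's activation values over a mini-batch with the Laplacian score and let that score scale a per-neuron learning rate. The graph construction (a full RBF kernel rather than a k-nearest-neighbour graph), the score-to-rate mapping in adaptive_rates, and all constants below are assumptions for illustration, not the authors' method.

import numpy as np

def laplacian_scores(A, X, sigma=1.0):
    """Laplacian score (He et al., 2005) of each hidden neuron.

    A : (n, h) activation values of h hidden neurons over a mini-batch of n samples.
    X : (n, d) mini-batch inputs, used to build the similarity graph.
    A lower score means the neuron's activations better preserve the local
    geometry of the mini-batch.
    """
    n = X.shape[0]
    # Similarity graph over the mini-batch (full RBF kernel here; the original
    # Laplacian score paper uses a k-nearest-neighbour graph).
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    S = np.exp(-sq_dists / (2.0 * sigma ** 2))
    D = np.diag(S.sum(axis=1))
    L = D - S
    ones = np.ones(n)
    scores = np.empty(A.shape[1])
    for r in range(A.shape[1]):
        f = A[:, r]
        # Centre the activations with respect to the degree matrix D.
        f_t = f - ((f @ D @ ones) / (ones @ D @ ones)) * ones
        denom = f_t @ D @ f_t
        scores[r] = (f_t @ L @ f_t) / denom if denom > 1e-12 else 0.0
    return scores

def adaptive_rates(scores, base_lr):
    # One possible (assumed) mapping from score to rate: neurons whose
    # activations have a low Laplacian score (locality-preserving, hence
    # informative) receive a larger share of the base learning rate.
    s = scores / (scores.max() + 1e-12)
    return base_lr * (1.0 - s + 1e-3)

# Toy usage: one hidden layer of 10 neurons on a mini-batch of 32 samples.
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 20))            # mini-batch inputs
W = 0.1 * rng.normal(size=(20, 10))      # incoming weights of the hidden layer
A = np.tanh(X @ W)                       # activation values of the neurons
per_neuron_lr = adaptive_rates(laplacian_scores(A, X), base_lr=0.05)
# per_neuron_lr would then scale the gradient of each neuron's incoming weights;
# the base rate itself would come from the paper's "learning parameter", which
# is updated from the mini-batch error gradient (details are not in the abstract).

In practice the per-neuron rates would likely be clipped or normalised, and the O(n²) similarity graph recomputed only every few mini-batches, since it dominates the cost of this scheme.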

Publisher
Database: Elsevier - ScienceDirect
Journal: Expert Systems with Applications - Volume 63, 30 November 2016, Pages 1–7
Authors