Article ID: 403462 · Journal ID: 677236 · Year of publication: 2016 · Length: 11 pages (PDF) · Full text: free download
English title (ISI article)
On the importance of sluggish state memory for learning long term dependency
Keywords
Simple Recurrent Networks; vanishing gradient problem; Echo State Network; grammar prediction task; sluggish state space; internal representations
Related subjects
Engineering and Basic Sciences › Computer Engineering › Artificial Intelligence
English abstract

The vanishing gradients problem inherent in Simple Recurrent Networks (SRN) trained with back-propagation has led to a significant shift towards the use of Long Short-Term Memory (LSTM) and Echo State Networks (ESN), which overcome this problem through second-order error-carousel schemes or different learning algorithms, respectively. This paper re-opens the case for SRN-based approaches by considering a variant, the Multi-recurrent Network (MRN). We show that memory units embedded within its architecture can mitigate the vanishing gradient problem by providing variable sensitivity to recent and more historic information through layer- and self-recurrent links with varied weights, forming a so-called sluggish state-based memory. We demonstrate that an MRN, optimised with noise injection, is able to learn the long-term dependency within a complex grammar induction task, significantly outperforming the SRN, NARX and ESN. Analysis of the internal representations of the networks reveals that the sluggish state-based representations of the MRN are best able to latch on to critical temporal dependencies spanning variable time delays, maintaining distinct and stable representations of all underlying grammar states. Surprisingly, the ESN was unable to fully learn the dependency problem, suggesting the major shift towards this class of models may be premature.
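The sluggish state-based memory the abstract describes can be illustrated with a minimal sketch. This is not the authors' exact MRN architecture; it only shows the general mechanism the abstract names: banks of memory units whose self-recurrent links carry fixed, varied weights, so each bank blends the current hidden activation with its own past state at a different rate and therefore retains information over a different time scale. The class name `SluggishMemory` and the decay rates in `alphas` are illustrative assumptions.

```python
import numpy as np

class SluggishMemory:
    """Illustrative bank of memory units with fixed self-recurrent
    weights (alphas). Each bank blends its previous state with the
    current hidden-layer activation; larger alphas change more slowly,
    giving the 'sluggish' multi-timescale state."""

    def __init__(self, hidden_size, alphas=(0.0, 0.5, 0.75, 0.9)):
        self.alphas = alphas  # one self-recurrent weight per bank (assumed values)
        self.state = [np.zeros(hidden_size) for _ in alphas]

    def step(self, hidden):
        # Per bank: m_t = alpha * m_{t-1} + (1 - alpha) * h_t
        self.state = [a * m + (1.0 - a) * hidden
                      for a, m in zip(self.alphas, self.state)]
        # The concatenated banks would feed back into the hidden layer
        # alongside the regular input at the next time step.
        return np.concatenate(self.state)

# Usage sketch: present a one-hot "event", then silence; banks with
# alpha = 0 forget it immediately, slower banks retain a trace.
mem = SluggishMemory(hidden_size=3)
mem.step(np.array([1.0, 0.0, 0.0]))
for _ in range(5):
    context = mem.step(np.zeros(3))
print(context.reshape(4, 3)[:, 0])  # trace of the event in each bank
```

The varied self-recurrent weights are what give the network "variable sensitivity to recent and more historic information": the fast banks track the current input closely while the slow banks preserve a decaying summary of earlier inputs across variable time delays.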

Publisher
Database: Elsevier - ScienceDirect
Journal: Knowledge-Based Systems - Volume 96, 15 March 2016, Pages 104–114
Authors