Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
8960116 | 1646381 | 2018 | 20-page PDF | Free download |
English title of the ISI article
Parsimonious memory unit for recurrent neural networks with application to natural language processing
Keywords
Related topics
Engineering and basic sciences
Computer engineering
Artificial intelligence

Abstract
Recurrent Neural Networks (RNNs) have received considerable interest from Artificial Intelligence (AI) researchers over the last decade due to their capability to learn complex internal structures that expose relevant information. However, standard RNNs fail to capture long-term dependencies, and gated RNNs such as the Long Short-Term Memory (LSTM) have been proposed to address this drawback. This RNN-based model requires four gates to learn both short- and long-term dependencies for a given sequence of basic elements. Recently, a new family of RNNs called the "Gated Recurrent Unit" (GRU) has been introduced. The GRU contains fewer gates (a reset and an update gate) but groups them without taking into account the latent relations between short- and long-term dependencies; the GRU therefore manages term dependencies identically across all hidden neurons. Moreover, training gated RNNs requires a large amount of data and, despite the advent of GPU cards that speed up learning, the processing time remains costly. This paper proposes a new RNN called the "Parsimonious Memory Unit" (PMU), based on the strong assumption that short- and long-term dependencies are related and that each hidden neuron has to play a different role to better handle term dependencies. Experiments conducted on a small (short-term) spoken-dialogue data set from the DECODA project, a large (long-term) textual document corpus from the 20-Newsgroups, and a language modeling task show that the proposed PMU-RNN reaches similar or even better performance (effectiveness) with less processing time (improved portability), a gain of 50%. Moreover, experiments on the gates' activity show that the proposed PMU manages term dependencies better than the GRU-RNN model.
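To make the gate-count contrast in the abstract concrete, here is a minimal NumPy sketch of a standard GRU step (the well-known reset/update formulation of Cho et al., 2014) next to a hypothetical single-gate unit in which one gate plays both the reset and update roles, i.e. short- and long-term control are tied together. The single-gate variant and all names (`gru_step`, `pmu_like_step`, the parameter keys) are illustrative assumptions in the spirit of the PMU's parsimony, not the paper's actual PMU equations.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, p):
    """One step of a standard GRU cell (Cho et al., 2014)."""
    z = sigmoid(p["Wz"] @ x + p["Uz"] @ h_prev + p["bz"])   # update gate
    r = sigmoid(p["Wr"] @ x + p["Ur"] @ h_prev + p["br"])   # reset gate
    h_cand = np.tanh(p["Wh"] @ x + p["Uh"] @ (r * h_prev) + p["bh"])
    return (1.0 - z) * h_prev + z * h_cand                  # blend old/new state

def pmu_like_step(x, h_prev, p):
    """Hypothetical single-gate unit: one gate serves as both reset and
    update, tying short- and long-term dependency control together.
    NOT the paper's exact PMU formulation, only a gate-sharing sketch."""
    g = sigmoid(p["Wg"] @ x + p["Ug"] @ h_prev + p["bg"])
    h_cand = np.tanh(p["Wh"] @ x + p["Uh"] @ (g * h_prev) + p["bh"])
    return (1.0 - g) * h_prev + g * h_cand

def init(n_in, n_hid, names, seed=0):
    """W* blocks are (n_hid, n_in), U* blocks (n_hid, n_hid), b* vectors (n_hid,)."""
    rng = np.random.default_rng(seed)
    shape = {"W": (n_hid, n_in), "U": (n_hid, n_hid)}
    return {n: (np.zeros(n_hid) if n[0] == "b"
                else rng.normal(0.0, 0.1, shape[n[0]])) for n in names}

# The GRU needs three weight blocks (update, reset, candidate); the
# single-gate unit needs only two, i.e. fewer parameters and fewer
# matrix products per step.
gru_p = init(50, 32, ["Wz", "Uz", "bz", "Wr", "Ur", "br", "Wh", "Uh", "bh"])
pmu_p = init(50, 32, ["Wg", "Ug", "bg", "Wh", "Uh", "bh"])
h = np.zeros(32)
x = np.random.default_rng(1).normal(size=50)
print(gru_step(x, h, gru_p).shape, pmu_like_step(x, h, pmu_p).shape)  # (32,) (32,)
```

Under these assumptions, dropping one gate removes roughly a third of the recurrent weight blocks and matrix products per step, which gives an intuition (though not a derivation) for the kind of processing-time gain the abstract reports.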
Publisher
Database: Elsevier - ScienceDirect
Journal: Neurocomputing - Volume 314, 7 November 2018, Pages 48-64
Authors
Mohamed Morchid