Internal Memory Gate for Recurrent Neural Networks with Application to Spoken Language Understanding

Mohamed Morchid


Long Short-Term Memory (LSTM) Recurrent Neural Networks (RNNs) require four gates to learn both short- and long-term dependencies in a sequence of basic elements. More recently, the “Gated Recurrent Unit” (GRU) was introduced; it requires fewer gates than the LSTM (only reset and update gates) to encode short- and long-term dependencies, and reaches performance equivalent to the LSTM with less processing time during learning. The “Leaky integration Unit” (LU) is a GRU with a single (update) gate that encodes mostly long-term dependencies and learns faster than the LSTM or GRU (fewer operations during learning). This paper proposes a novel RNN cell, called the “Internal Memory Gate” (IMG), that combines the advantages of the LSTM and GRU (short- and long-term dependencies) with that of the LU (fast learning). The effectiveness and robustness of the proposed IMG-RNN are evaluated on a classification task over a small corpus of spoken dialogues from the DECODA project, which allows us to assess each RNN's ability to encode short-term dependencies. The experiments show that IMG-RNNs reach better accuracies, with a gain of 0.4 points over LSTM- and GRU-RNNs and of 0.7 points over the LU-RNN. Moreover, the IMG-RNN requires less processing time than the GRU- or LSTM-RNN, with gains of 19% and 50% respectively.
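As a reference point for the gate counts the abstract compares, here is a minimal NumPy sketch of one step of a standard GRU cell (reset and update gates) next to a single-gate leaky-integration step. The weight names are illustrative; these are the standard formulations from the literature, and the paper's IMG cell itself is not reproduced here since its equations are given only in the full text.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, Wz, Uz, Wr, Ur, Wh, Uh):
    """One step of a standard GRU cell: two gates (update z, reset r)."""
    z = sigmoid(Wz @ x + Uz @ h_prev)               # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev))   # candidate state
    return (1.0 - z) * h_prev + z * h_tilde         # leaky interpolation

def leaky_unit_step(x, h_prev, Wz, Uz, Wh, Uh):
    """Leaky-integration unit: update gate only, no reset gate,
    hence fewer operations per step than the GRU."""
    z = sigmoid(Wz @ x + Uz @ h_prev)               # update gate
    h_tilde = np.tanh(Wh @ x + Uh @ h_prev)         # candidate state
    return (1.0 - z) * h_prev + z * h_tilde
```

The leaky unit drops the reset gate and its matrix products, which is the source of the speed advantage the abstract attributes to the LU (and which the IMG aims to retain while keeping GRU-level accuracy).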


DOI: 10.21437/Interspeech.2017-357

Cite as: Morchid, M. (2017) Internal Memory Gate for Recurrent Neural Networks with Application to Spoken Language Understanding. Proc. Interspeech 2017, 3316-3319, DOI: 10.21437/Interspeech.2017-357.


@inproceedings{Morchid2017,
  author={Mohamed Morchid},
  title={Internal Memory Gate for Recurrent Neural Networks with Application to Spoken Language Understanding},
  year={2017},
  booktitle={Proc. Interspeech 2017},
  pages={3316--3319},
  doi={10.21437/Interspeech.2017-357},
  url={http://dx.doi.org/10.21437/Interspeech.2017-357}
}