LatticeRnn: Recurrent Neural Networks Over Lattices

Faisal Ladhak, Ankur Gandhe, Markus Dreyer, Lambert Mathias, Ariya Rastrow, Björn Hoffmeister


We present a new model called LatticeRnn, which generalizes recurrent neural networks (RNNs) to process weighted lattices as input, instead of sequences. A LatticeRnn can encode the complete structure of a lattice into a dense representation, which makes it suitable for a variety of problems, including rescoring, classifying, parsing, or translating lattices using deep neural networks (DNNs). In this paper, we use LatticeRnns for a classification task: each lattice represents the output from an automatic speech recognition (ASR) component of a spoken language understanding (SLU) system, and we classify the intent of the spoken utterance based on the lattice embedding computed by a LatticeRnn. We show that making decisions based on the full ASR output lattice, as opposed to 1-best or n-best hypotheses, makes SLU systems more robust to ASR errors. Our experiments yield improvements of 13% over a baseline RNN system trained on transcriptions and 10% over an n-best list rescoring system for intent classification.
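As a rough illustration of the idea, here is a minimal sketch (not the authors' implementation) of an RNN forward pass generalized from a sequence to a lattice: states are visited in topological order, an ordinary RNN step is applied along each arc, and a state's incoming arc hidden vectors are pooled, here as a weighted average using the arc weights (e.g., ASR posteriors). All names, shapes, and the pooling choice are illustrative assumptions.

```python
import numpy as np

def lattice_rnn_embed(num_states, arcs_out, embed, Wx, Wh, b):
    """Sketch of a LatticeRnn forward pass over a word lattice.

    States are assumed to be numbered in topological order, with state 0
    the start state and state num_states-1 the final state.
    arcs_out[s] is a list of (dst, word_id, weight) arcs leaving state s.
    embed, Wx, Wh, b are hypothetical model parameters: a word embedding
    matrix and the usual RNN input/recurrent weights and bias.
    """
    d_h = Wh.shape[0]
    h = np.zeros((num_states, d_h))  # pooled hidden vector per lattice state
    w_in = np.zeros(num_states)      # total incoming arc weight per state

    for s in range(num_states):
        # Pool incoming contributions into a weighted average (one plausible
        # pooling choice; the paper's pooling may differ). State 0 has no
        # incoming arcs and keeps the zero initial state.
        if w_in[s] > 0:
            h[s] /= w_in[s]
        for dst, word, w in arcs_out.get(s, []):
            # Standard RNN step along this arc, reading the arc's word.
            h_arc = np.tanh(Wx @ embed[word] + Wh @ h[s] + b)
            h[dst] += w * h_arc      # accumulate, weighted by the arc weight
            w_in[dst] += w

    # The final state's pooled vector serves as the dense lattice embedding,
    # which a downstream classifier (e.g., for intent) can consume.
    return h[num_states - 1] / max(w_in[num_states - 1], 1e-12)
```

For a linear-chain lattice with one arc per state and unit weights, this reduces to a standard RNN over the 1-best word sequence, which is the sense in which the model generalizes RNNs from sequences to lattices.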


DOI: 10.21437/Interspeech.2016-1583

Cite as

Ladhak, F., Gandhe, A., Dreyer, M., Mathias, L., Rastrow, A., Hoffmeister, B. (2016) LatticeRnn: Recurrent Neural Networks Over Lattices. Proc. Interspeech 2016, 695–699.

Bibtex
@inproceedings{Ladhak+2016,
  author={Faisal Ladhak and Ankur Gandhe and Markus Dreyer and Lambert Mathias and Ariya Rastrow and Björn Hoffmeister},
  title={LatticeRnn: Recurrent Neural Networks Over Lattices},
  year={2016},
  booktitle={Interspeech 2016},
  doi={10.21437/Interspeech.2016-1583},
  url={http://dx.doi.org/10.21437/Interspeech.2016-1583},
  pages={695--699}
}