INTERSPEECH 2013
14th Annual Conference of the International Speech Communication Association

Lyon, France
August 25-29, 2013

Prefix Tree Based N-Best List Re-Scoring for Recurrent Neural Network Language Model Used in Speech Recognition System

Yujing Si, Qingqing Zhang, Ta Li, Jielin Pan, Yonghong Yan

Chinese Academy of Sciences, China

The Recurrent Neural Network Language Model (RNNLM) has recently been shown to outperform N-gram Language Models (LMs) as well as many other competing advanced LM techniques. However, training and testing an RNNLM are very time-consuming, so in real-time recognition systems the RNNLM is usually used to re-score only a limited-size n-best list. In this paper, issues of speeding up the RNNLM are explored when it is used to re-rank a large n-best list. A new n-best list re-scoring framework, Prefix Tree based N-best list Rescoring (PTNR), is proposed to eliminate the redundant computations that make re-scoring inefficient. At the same time, the bunch mode technique, widely used for speeding up the training of feed-forward neural network language models, is combined with PTNR to further improve the re-scoring speed. Experimental results showed that our proposed re-scoring approach for the RNNLM was much faster than standard n-best list re-scoring; for a 1000-best list, for example, it was almost 11 times faster.
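
The key observation behind PTNR is that hypotheses in an n-best list typically share long prefixes, so a naive re-scoring pass recomputes the same RNNLM hidden states many times. The Python sketch below illustrates that idea by caching per-prefix hidden states in a trie. It is a minimal illustration under stated assumptions, not the paper's implementation; in particular, the rnnlm.initial_state() / rnnlm.step(state, word) interface (returning the next hidden state and the log-probability of word) is a hypothetical placeholder API.

    import math

    class TrieNode:
        """One node per distinct word prefix across the n-best list."""
        def __init__(self):
            self.children = {}   # word -> TrieNode
            self.state = None    # cached RNNLM hidden state after this prefix
            self.logprob = 0.0   # cumulative log-probability of this prefix

    def rescore_nbest(hypotheses, rnnlm):
        """Score each hypothesis (a list of words); shared prefixes are
        computed only once and reused via the trie."""
        root = TrieNode()
        root.state = rnnlm.initial_state()
        scores = []
        for hyp in hypotheses:
            node = root
            for word in hyp:
                if word not in node.children:      # unseen prefix: one RNNLM step
                    child = TrieNode()
                    child.state, lp = rnnlm.step(node.state, word)
                    child.logprob = node.logprob + lp
                    node.children[word] = child
                node = node.children[word]         # shared prefix: reuse cached result
            scores.append(node.logprob)
        return scores

    # Tiny demo with a stand-in LM (uniform over a 10k-word vocabulary).
    class DummyLM:
        def initial_state(self):
            return ()
        def step(self, state, word):
            return state + (word,), -math.log(10000.0)

    if __name__ == "__main__":
        hyps = [["the", "cat", "sat"], ["the", "cat", "sits"], ["a", "cat", "sat"]]
        print(rescore_nbest(hyps, DummyLM()))   # the prefix "the cat" is scored only once

Because sibling nodes extend the same cached parent state, their forward passes can also be grouped into a single matrix-matrix operation, which is how combining PTNR with bunch mode yields the additional speedup described above.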

Bibliographic reference. Si, Yujing / Zhang, Qingqing / Li, Ta / Pan, Jielin / Yan, Yonghong (2013): "Prefix tree based n-best list re-scoring for recurrent neural network language model used in speech recognition system", In INTERSPEECH-2013, 3419-3423.