TheanoLM — An Extensible Toolkit for Neural Network Language Modeling

Seppo Enarvi, Mikko Kurimo


We present a new tool for training neural network language models (NNLMs), scoring sentences, and generating text. The tool has been written using the Python library Theano, which allows researchers to easily extend it and tune any aspect of the training process. Despite this flexibility, Theano is able to generate extremely fast native code that can utilize a GPU or multiple CPU cores to parallelize the heavy numerical computations. The tool has been evaluated on difficult Finnish and English conversational speech recognition tasks, and significant improvements were obtained over our best back-off n-gram models. The results we obtained in the Finnish task were compared to those from the existing RNNLM and RWTHLM toolkits and found to be as good as or better, while training times were an order of magnitude shorter.


DOI: 10.21437/Interspeech.2016-618

Cite as:

Enarvi, S., Kurimo, M. (2016) TheanoLM — An Extensible Toolkit for Neural Network Language Modeling. Proc. Interspeech 2016, 3052-3056.

Bibtex
@inproceedings{Enarvi+2016,
  author={Seppo Enarvi and Mikko Kurimo},
  title={{TheanoLM} — An Extensible Toolkit for Neural Network Language Modeling},
  booktitle={Interspeech 2016},
  year={2016},
  pages={3052--3056},
  doi={10.21437/Interspeech.2016-618},
  url={http://dx.doi.org/10.21437/Interspeech.2016-618}
}