We present a novel toolkit implementing the long short-term memory (LSTM)
neural network architecture for language modeling. The main goal is to provide
software that is easy to use and allows fast training of standard recurrent
and LSTM neural network language models.
The toolkit obtains state-of-the-art performance on the standard Treebank corpus. To reduce training time, BLAS and related libraries are supported, and multiple word sequences can be evaluated in parallel. In addition, arbitrary word classes can be used to speed up the computation for large vocabularies.
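The class-based speedup mentioned above rests on factoring the output distribution as P(w | h) = P(c(w) | h) · P(w | c(w), h), so that only a softmax over the classes and a softmax over the words inside one class are computed, instead of one over the full vocabulary. The sketch below is a minimal NumPy illustration of this standard factorization, not the toolkit's actual implementation; all variable names (`W_class`, `W_word`, `word2class`, `class_words`) are assumptions for the example.

```python
import numpy as np

def class_factored_logprob(h, W_class, W_word, word2class, class_words, w):
    """Log P(w | h) under the class factorization
    P(w | h) = P(c(w) | h) * P(w | c(w), h).

    Only the class softmax and a softmax restricted to the words of
    c(w) are evaluated, avoiding a softmax over the whole vocabulary.
    (Illustrative sketch; not the rwthlm code.)
    """
    c = word2class[w]
    # softmax over the (small) number of classes
    s = W_class @ h
    p_class = np.exp(s - s.max())
    p_class /= p_class.sum()
    # softmax restricted to the words belonging to class c
    idx = class_words[c]
    t = W_word[idx] @ h
    p_word = np.exp(t - t.max())
    p_word /= p_word.sum()
    return np.log(p_class[c]) + np.log(p_word[idx.index(w)])
```

Because the classes partition the vocabulary, summing P(c(w) | h) · P(w | c(w), h) over all words still yields 1, so the factored model remains a proper distribution while the per-word cost drops from O(|V|) to roughly O(|C| + |V|/|C|) for balanced classes.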
Finally, the software allows easy integration with SRILM, and it supports direct decoding and rescoring of HTK lattices. The toolkit is available for download under an open source license.
Bibliographic reference. Sundermeyer, Martin / Schlüter, Ralf / Ney, Hermann (2014): "rwthlm — the RWTH Aachen University neural network language modeling toolkit", In INTERSPEECH-2014, 2093-2097.