International Workshop on Spoken Language Translation (IWSLT) 2012

Hong Kong
December 6-7, 2012

Continuous Space Language Models using Restricted Boltzmann Machines

Jan Niehues, Alex Waibel

International Center for Advanced Communication Technologies - InterACT, Institute for Anthropomatics, Karlsruhe Institute of Technology, Germany

We present a novel approach to continuous space language models for statistical machine translation using Restricted Boltzmann Machines (RBMs). The probability of an n-gram is calculated from the free energy of the RBM instead of by a feed-forward neural network. The calculation is therefore much faster, and the model can be integrated directly into the translation process instead of being applied only in a re-ranking step.
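As a rough illustration of this idea, the sketch below scores an n-gram by the negative free energy of a binary RBM whose visible layer concatenates one-hot word vectors, using the standard free-energy formula F(v) = -b^T v - sum_j log(1 + exp(c_j + W_j^T v)). All names (free_energy, ngram_score, W, b_vis, b_hid) are illustrative, not taken from the paper.

    import numpy as np

    def free_energy(v, W, b_vis, b_hid):
        # F(v) = -b_vis^T v - sum_j log(1 + exp(b_hid_j + W[:, j]^T v));
        # logaddexp(0, x) computes log(1 + exp(x)) in a numerically stable way.
        hidden_term = np.sum(np.logaddexp(0.0, b_hid + v @ W))
        return -(v @ b_vis) - hidden_term

    def ngram_score(words, vocab_index, W, b_vis, b_hid, vocab_size):
        # Visible layer: concatenated one-hot vectors, one per n-gram position.
        v = np.zeros(len(words) * vocab_size)
        for i, w in enumerate(words):
            v[i * vocab_size + vocab_index[w]] = 1.0
        # log P(v) = -F(v) - log Z, so -F(v) is the language model score
        # up to the partition function Z, which is the same for every n-gram.
        return -free_energy(v, W, b_vis, b_hid)

Because the score avoids the softmax over the whole vocabulary that a feed-forward network output layer requires, it is cheap enough to call inside the decoder.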
    Furthermore, additional word factors can be introduced into the language model in a straightforward way. We observed faster convergence in training when automatically generated word classes were included as an additional word factor.
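One plausible way to realize such a factored input, sketched under the same assumptions as the code above, is to append a one-hot class vector to each one-hot word vector; the names here (encode_factored, class_of) are again hypothetical.

    import numpy as np

    def encode_factored(words, word_index, class_of, V, C):
        # Visible layer = per position [one-hot word ; one-hot class].
        # Since the number of classes C is much smaller than the vocabulary
        # size V, the class block gives the RBM a dense, low-dimensional
        # view of each word, which can speed up convergence.
        v = np.zeros(len(words) * (V + C))
        for i, w in enumerate(words):
            base = i * (V + C)
            v[base + word_index[w]] = 1.0
            v[base + V + class_of[w]] = 1.0
        return v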
    We evaluated the RBM-based language model on the German-to-English and English-to-French translation tasks of TED lectures. Instead of replacing the conventional n-gram-based language model, we trained the RBM-based language model on the more important but smaller in-domain data and combined the two models in a log-linear way. With this approach we achieved improvements of about half a BLEU point on the translation task.
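In the standard log-linear framework of phrase-based statistical machine translation, such a combination amounts to adding the RBM language model as one more feature function alongside the conventional n-gram model, with the decoder selecting

    \hat{e} = \arg\max_{e} \sum_{i} \lambda_i \, h_i(e, f)

where the h_i(e, f) are feature functions (here including the log scores of both language models) and the weights \lambda_i are typically tuned on development data, e.g. with MERT. The abstract does not specify the full feature set used in the paper; this is only the generic form of a log-linear combination.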


Bibliographic reference.  Niehues, Jan / Waibel, Alex (2012): "Continuous space language models using restricted Boltzmann machines", in Proceedings of IWSLT 2012, 164-170.