14th Annual Conference of the International Speech Communication Association

Lyon, France
August 25-29, 2013

Multi-Domain Neural Network Language Model

Tanel Alumäe

Tallinn University of Technology, Estonia

The paper describes a neural network language model that jointly models language in many related domains. In addition to the traditional layers of a neural network language model, the proposed model also learns, for each domain in the training data, a vector of factors that modulates the connections from the projection layer to the hidden layer. The model is found to outperform simple neural network language models as well as domain-adapted maximum entropy language models in both perplexity evaluation and speech recognition experiments.
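The core idea — per-domain factor vectors that modulate the projection-to-hidden connections — can be illustrated with a toy forward pass. The sketch below is an assumption-laden reading, not the paper's exact parameterisation: it models the domain factors as an element-wise gate on the projection-layer output before the shared hidden layer, with all dimensions and names (`E`, `W_h`, `D`, `predict`) invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; not taken from the paper).
vocab, emb, hidden, n_domains, context = 50, 16, 32, 3, 2

# Standard NNLM parameters: embedding table, hidden layer, output layer.
E = rng.normal(0, 0.1, (vocab, emb))              # projection (embedding) table
W_h = rng.normal(0, 0.1, (hidden, context * emb)) # projection -> hidden weights
b_h = np.zeros(hidden)
W_o = rng.normal(0, 0.1, (vocab, hidden))         # hidden -> output weights
b_o = np.zeros(vocab)

# One factor vector per domain, initialised near 1 so that untrained
# domains start close to the unadapted model. Modelling the factors as an
# element-wise gate on the projection output is an illustrative choice.
D = 1.0 + rng.normal(0, 0.01, (n_domains, context * emb))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def predict(context_words, domain):
    """Next-word distribution for a word context under a given domain."""
    p = E[context_words].ravel()   # projection layer: concatenated embeddings
    p = D[domain] * p              # domain factors modulate the connections
    h = np.tanh(W_h @ p + b_h)     # shared hidden layer
    return softmax(W_o @ h + b_o)  # output distribution over the vocabulary

probs = predict([3, 7], domain=1)
```

Because only `D` is domain-specific, all domains share the bulk of the parameters, which is what allows joint training across related domains while still adapting the model to each one.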

Full Paper

Bibliographic reference.  Alumäe, Tanel (2013): "Multi-domain neural network language model", In INTERSPEECH-2013, 2182-2186.