ISCA Archive Interspeech 2013

Multi-domain neural network language model

Tanel Alumäe

The paper describes a neural network language model that jointly models language in many related domains. In addition to the traditional layers of a neural network language model, the proposed model also trains, for each domain in the training data, a vector of factors that is used to modulate the connections from the projection layer to the hidden layer. The model is found to outperform simple neural network language models as well as domain-adapted maximum entropy language models in perplexity evaluation and speech recognition experiments.
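The abstract's core idea can be illustrated with a minimal sketch: a feed-forward NNLM in which a learned per-domain factor vector element-wise modulates the projection-layer output before it enters the hidden layer. All layer sizes, class and variable names, and the exact form of modulation below are assumptions for illustration, not the paper's precise formulation.

import torch
import torch.nn as nn

class MultiDomainNNLM(nn.Module):
    # Sketch only: a feed-forward NNLM where a per-domain factor vector
    # modulates the projection-to-hidden connections (here via element-wise
    # scaling of the projection output). Hyperparameters are illustrative.
    def __init__(self, vocab_size, num_domains, context_size=3,
                 embed_dim=100, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)              # projection layer
        self.domain_factors = nn.Embedding(num_domains, context_size * embed_dim)
        self.proj_to_hidden = nn.Linear(context_size * embed_dim, hidden_dim)
        self.hidden_to_out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, context_words, domain_id):
        # context_words: (batch, context_size) word indices
        # domain_id:     (batch,) domain indices
        p = self.embed(context_words).flatten(1)      # concatenated projections
        f = self.domain_factors(domain_id)            # per-domain factor vector
        h = torch.tanh(self.proj_to_hidden(p * f))    # domain-modulated hidden layer
        return self.hidden_to_out(h)                  # logits over the vocabulary

Usage, under the same assumptions: given a batch of word-index contexts and a domain id per example, the model returns next-word logits that can be trained with a standard cross-entropy loss, sharing all weights across domains except the small domain factor vectors.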


doi: 10.21437/Interspeech.2013-515

Cite as: Alumäe, T. (2013) Multi-domain neural network language model. Proc. Interspeech 2013, 2182-2186, doi: 10.21437/Interspeech.2013-515

@inproceedings{alumae13b_interspeech,
  author={Tanel Alumäe},
  title={{Multi-domain neural network language model}},
  year=2013,
  booktitle={Proc. Interspeech 2013},
  pages={2182--2186},
  doi={10.21437/Interspeech.2013-515}
}