ISCA Archive ICSLP 1998

Log-linear interpolation of language models

Dietrich Klakow

Combining different language models is an important task. Linear interpolation is the established method for doing this. We present a new method called log-linear interpolation (LLI), which combines the simplicity of linear interpolation with essential parts of maximum-entropy models by linearly interpolating the scores (log-probabilities) of the different models. The first series of experiments focuses on adaptation: unigram, bigram and trigram models trained on NAB are combined with unigram and bigram models trained on a small domain-specific corpus. LLI compares favorably with linear interpolation. The second series combines bigram and distance-bigram models. Here, the relative improvements are larger (about 20% in perplexity); this task seems to be the ideal application of LLI. To further scrutinize the method, frequent word pairs are first joined into phrases, and then bigram and distance-bigram models are combined by LLI. This experiment yields perplexities just 2.5% above the original trigram perplexity.
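The combination rule behind log-linear interpolation is a weighted product of the component models' probabilities, renormalized over the vocabulary for each history. The Python sketch below is a minimal illustration of that rule only; the function names, the models/lambdas interface, and the explicit sum over the vocabulary are assumptions for exposition, not the paper's implementation (in practice the weights would be optimized on held-out data, e.g. by minimizing perplexity).

import math

def log_linear_interpolate(models, lambdas, history, vocab):
    """Log-linear interpolation of conditional models:
    P(w | h) proportional to prod_i P_i(w | h) ** lambda_i,
    normalized over the vocabulary for each history h.

    `models` is a list of functions p_i(word, history) -> probability.
    All component models are assumed smoothed (strictly positive
    probabilities), so the logarithm is always defined.
    """
    # Unnormalized log-score of every word: weighted sum of log-probabilities.
    log_scores = {
        w: sum(lam * math.log(p(w, history)) for lam, p in zip(lambdas, models))
        for w in vocab
    }
    # Per-history normalization constant Z_lambda(h).
    log_z = math.log(sum(math.exp(s) for s in log_scores.values()))
    # Return the normalized distribution over the vocabulary.
    return {w: math.exp(s - log_z) for w, s in log_scores.items()}

# Toy usage (hypothetical numbers, history ignored by the toy models):
# p_uni = lambda w, h: {"a": 0.5, "b": 0.3, "c": 0.2}[w]
# p_bi  = lambda w, h: {"a": 0.6, "b": 0.2, "c": 0.2}[w]
# dist = log_linear_interpolate([p_uni, p_bi], [0.4, 0.6], ("a",), ["a", "b", "c"])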


doi: 10.21437/ICSLP.1998-623

Cite as: Klakow, D. (1998) Log-linear interpolation of language models. Proc. 5th International Conference on Spoken Language Processing (ICSLP 1998), paper 0522, doi: 10.21437/ICSLP.1998-623

@inproceedings{klakow98_icslp,
  author={Dietrich Klakow},
  title={{Log-linear interpolation of language models}},
  year=1998,
  booktitle={Proc. 5th International Conference on Spoken Language Processing (ICSLP 1998)},
  pages={paper 0522},
  doi={10.21437/ICSLP.1998-623}
}