4th International Conference on Spoken Language Processing
Philadelphia, PA, USA
Previous attempts to automatically determine multi-words as the basic unit for language modeling have succeeded in extending bigram models [10, 9, 2, 8], improving the perplexity of the language model and/or the word accuracy of the speech decoder. So far, however, none of these techniques has improved on the trigram model, except on the rather controlled ATIS task. We therefore propose an algorithm that directly minimizes the perplexity of a bigram model. The new algorithm reduces the trigram perplexity and also improves word accuracy on the Verbmobil task. It is the natural counterpart of the successful word classification algorithms for language modeling [4, 7], which minimize the leaving-one-out bigram perplexity. We also give some details on the use of class-finding techniques and m-gram models, which can be crucial to successful applications of this technique.
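The perplexity-driven phrase search can be illustrated with a minimal sketch. This is not the paper's implementation: it scores candidate multi-word merges by the gain in maximum-likelihood bigram log-likelihood on the training corpus, whereas the paper's criterion is a leaving-one-out perplexity, which guards against overfitting. All function names here are our own.

```python
from collections import Counter
import math

def bigram_log_likelihood(tokens):
    """Corpus log-likelihood under a maximum-likelihood bigram model
    (history counts approximated by unigram counts).

    To compare models before and after a merge as perplexities, this
    total must be normalized by the ORIGINAL word count, since a merged
    multi-word unit still accounts for both underlying words.
    """
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    return sum(c * math.log(c / unigrams[w1])
               for (w1, w2), c in bigrams.items())

def merge_pair(tokens, pair):
    """Rewrite the corpus, joining adjacent occurrences of `pair` into one unit."""
    out, i = [], 0
    while i < len(tokens):
        if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
            out.append(tokens[i] + "_" + tokens[i + 1])
            i += 2
        else:
            out.append(tokens[i])
            i += 1
    return out

def best_phrase(tokens):
    """Greedily pick the adjacent word pair whose merge most increases the
    bigram log-likelihood; iterating this builds a multi-word inventory.
    Without leaving-one-out smoothing this criterion over-merges on small
    corpora, which is exactly what the paper's algorithm avoids.
    """
    base = bigram_log_likelihood(tokens)
    best, best_gain = None, 0.0
    for pair in set(zip(tokens, tokens[1:])):
        gain = bigram_log_likelihood(merge_pair(tokens, pair)) - base
        if gain > best_gain:
            best, best_gain = pair, gain
    return best, best_gain
```

Iterating `best_phrase`, re-merging the corpus after each accepted pair and stopping once the (leaving-one-out or held-out) gain turns negative, would yield the phrase inventory that the class m-gram model is then trained on.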
Bibliographic reference. Ries, Klaus / Buo, Finn Dag / Waibel, Alex (1996): "Class phrase models for language modelling", In ICSLP-1996, 398-401.