September 22-25, 1997
Recent progress in variable n-gram language modeling provides an efficient representation of n-gram models and makes training of higher-order n-grams possible. In this paper, we apply the variable n-gram design algorithm to conversational speech, extending the algorithm to learn skips and classes in context so as to handle conversational speech characteristics such as repetitions and disfluency markers. We show that with the extended variable n-gram we can build a language model that uses fewer parameters to cover a longer context and improves both test-set perplexity and recognition accuracy.
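The core idea behind variable n-gram design is to keep a longer context only where it actually changes the predicted distribution, and to back off elsewhere. The sketch below is not the authors' implementation; it is a minimal illustration, under simplifying assumptions, of one common formulation: contexts are grown up to a maximum order and pruned with a frequency-weighted relative-entropy criterion against their backoff distribution (the `threshold` value and the unsmoothed maximum-likelihood estimates are illustrative choices, not from the paper).

```python
from collections import defaultdict
import math

def train_variable_ngram(tokens, max_order=3, threshold=0.01):
    """Count n-grams up to max_order, then keep a context only if its
    next-word distribution adds enough information over the shorter
    (backoff) context, measured by count-weighted relative entropy.
    This is a simplified stand-in for variable n-gram design."""
    counts = defaultdict(lambda: defaultdict(int))  # context tuple -> next-word counts
    for order in range(1, max_order + 1):
        for i in range(len(tokens) - order + 1):
            ctx = tuple(tokens[i:i + order - 1])
            counts[ctx][tokens[i + order - 1]] += 1

    def dist(ctx):
        total = sum(counts[ctx].values())
        return {w: c / total for w, c in counts[ctx].items()}

    kept = {}
    for ctx in sorted(counts, key=len):  # decide shorter contexts first
        if not ctx:
            kept[ctx] = dist(ctx)       # always keep the unigram root
            continue
        p = dist(ctx)
        suf = ctx[1:]                   # back off to the longest kept suffix
        while suf not in kept:
            suf = suf[1:]
        q = kept[suf]
        # relative entropy D(p || q), weighted by how often ctx occurs
        gain = sum(pw * math.log(pw / q.get(w, 1e-9)) for w, pw in p.items())
        weight = sum(counts[ctx].values()) / max(1, len(tokens))
        if weight * gain > threshold:
            kept[ctx] = p               # longer context is informative: keep it
    return kept

def prob(model, context, word):
    """Probability of `word`, backing off to the longest stored context."""
    ctx = tuple(context)
    while ctx not in model:
        ctx = ctx[1:]
    return model[ctx].get(word, 0.0)
```

Because uninformative contexts are never stored, the model spends its parameters only on histories that matter; raising the threshold trades model size against fidelity, which is how a variable n-gram can afford a longer maximum context than a fixed-order model with the same parameter budget.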
Bibliographic reference. Siu, Manhung / Ostendorf, Mari (1997): "Variable n-gram language modeling and extensions for conversational speech", In EUROSPEECH-1997, 2739-2742.