We present context-sensitive dynamic classes, a novel mechanism for integrating contextual information from spoken dialogue into a class n-gram language model. We exploit the dialogue system's information state to populate dynamic classes, thus percolating contextual constraints to the recognizer's language model in real time. We describe a technique for training a language model incorporating context-sensitive dynamic classes which considerably reduces word error rate under several conditions. Significantly, our technique does not partition the language model based on potentially artificial dialogue state distinctions; rather, it accommodates both strong and weak expectations via dynamic manipulation of a single model.
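The core idea can be illustrated with a toy sketch of a class bigram model in which one class is repopulated at runtime from dialogue context. All names, probabilities, and the uniform within-class distribution below are illustrative assumptions, not details taken from the paper:

```python
# Sketch: a class bigram LM where a dynamic class (e.g. CITY) has its
# membership updated from the dialogue information state at runtime,
# so P(word | class) tracks the current contextual constraints.
# Class names and probabilities are hypothetical.

class ClassBigramLM:
    def __init__(self, class_bigrams, word_class):
        # class_bigrams: maps (prev_class, class) -> P(class | prev_class)
        self.class_bigrams = class_bigrams
        # word_class: static word -> class assignment
        self.word_class = dict(word_class)
        self.dynamic_members = {}  # class -> current member list

    def set_dynamic_class(self, cls, members):
        """Populate a dynamic class from the dialogue information state."""
        self.dynamic_members[cls] = list(members)
        for w in members:
            self.word_class[w] = cls

    def prob(self, word, prev_word):
        """P(word | prev_word) under the class bigram factorization."""
        cls = self.word_class.get(word)
        prev_cls = self.word_class.get(prev_word)
        if cls is None or prev_cls is None:
            return 0.0
        p_class = self.class_bigrams.get((prev_cls, cls), 0.0)
        if cls in self.dynamic_members:
            members = self.dynamic_members[cls]
            # Assumed: uniform distribution over the current members.
            p_word = 1.0 / len(members) if word in members else 0.0
        else:
            p_word = 1.0  # simplification: static classes are singletons
        return p_class * p_word

lm = ClassBigramLM(
    class_bigrams={("TO", "CITY"): 0.4, ("CITY", "EOS"): 0.9},
    word_class={"to": "TO", "</s>": "EOS"},
)
# The dialogue state currently offers two candidate cities, so the
# CITY class is repopulated before the next recognition turn:
lm.set_dynamic_class("CITY", ["boston", "denver"])
```

The point of the sketch is that only class membership changes per turn; the class-level n-gram statistics stay fixed, so a single trained model serves all dialogue contexts.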
Cite as: Gruenstein, A., Wang, C., Seneff, S. (2005) Context-sensitive statistical language modeling. Proc. Interspeech 2005, 17-20, doi: 10.21437/Interspeech.2005-7
@inproceedings{gruenstein05_interspeech,
  author={Alexander Gruenstein and Chao Wang and Stephanie Seneff},
  title={{Context-sensitive statistical language modeling}},
  year={2005},
  booktitle={Proc. Interspeech 2005},
  pages={17--20},
  doi={10.21437/Interspeech.2005-7}
}