Sixth European Conference on Speech Communication and Technology

Budapest, Hungary
September 5-9, 1999

On the Use of Right Context in Sense-Disambiguating Language Models

Vincent Chow, Dekai Wu

Department of Computer Science, HKUST, Clear Water Bay, Hong Kong

We investigate the utility of right context (look-ahead information) in incremental left-to-right language models with word sense disambiguation, and discover, somewhat unexpectedly, that using right context in addition to left context (history) may actually reduce accuracy. We employ word sense disambiguation as one component of a language model designed to allow hypotheses to be evaluated incrementally. In our baseline system, disambiguation is performed by a naïve-Bayes classifier that uses lexical co-occurrence features from the history. We then augment the left-context-only model with three well-motivated methods that use the right context. Perhaps surprisingly, experimental results with the three look-ahead strategies showed decreases of between 0.19% and 10.04% in the accuracy of disambiguating the next word.
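The baseline described above, a naïve-Bayes classifier scoring senses from bag-of-words co-occurrence features in the left context, can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; the class name, Laplace smoothing parameter, and the toy "bank" training data are assumptions for demonstration only.

```python
import math
from collections import defaultdict


class NaiveBayesWSD:
    """Illustrative naive-Bayes word sense disambiguator.

    Scores each candidate sense by log P(sense) plus the sum of
    log P(word | sense) over the words in the left context (history),
    with Laplace smoothing. Not the paper's actual system.
    """

    def __init__(self, smoothing=1.0):
        self.smoothing = smoothing
        self.sense_counts = defaultdict(int)        # c(sense)
        self.cooc_counts = defaultdict(             # c(word, sense)
            lambda: defaultdict(int))
        self.vocab = set()

    def train(self, examples):
        """examples: iterable of (history_words, sense) pairs."""
        for history, sense in examples:
            self.sense_counts[sense] += 1
            for w in history:
                self.cooc_counts[sense][w] += 1
                self.vocab.add(w)

    def predict(self, history):
        """Return the sense maximizing the smoothed naive-Bayes score."""
        total = sum(self.sense_counts.values())
        v = len(self.vocab)
        best_sense, best_score = None, float("-inf")
        for sense, count in self.sense_counts.items():
            score = math.log(count / total)  # log prior
            denom = sum(self.cooc_counts[sense].values()) + self.smoothing * v
            for w in history:
                num = self.cooc_counts[sense].get(w, 0) + self.smoothing
                score += math.log(num / denom)  # smoothed log likelihood
            if score > best_score:
                best_sense, best_score = sense, score
        return best_sense


# Toy demonstration with two hypothetical senses of "bank".
clf = NaiveBayesWSD()
clf.train([
    (["money", "deposit", "loan"], "finance"),
    (["money", "account"], "finance"),
    (["river", "water", "shore"], "river"),
    (["river", "fishing"], "river"),
])
finance_guess = clf.predict(["money", "loan"])
river_guess = clf.predict(["river", "shore"])
```

Because the classifier conditions only on the history, it can score each next-word hypothesis incrementally as the decoder extends it left to right, which is the property the paper's look-ahead variants then relax.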


Bibliographic reference.  Chow, Vincent / Wu, Dekai (1999): "On the use of right context in sense-disambiguating language models", In EUROSPEECH'99, 1571-1574.