Sixth International Conference on Spoken Language Processing (ICSLP 2000)

Beijing, China
October 16-20, 2000

A Tagger-Aided Language Model with a Stack Decoder

Ruiqiang Zhang, Ezra Black, Andrew Finch, Yoshinori Sagisaka

ATR Spoken Language Translation Laboratories, Soraku-gun, Kyoto, Japan

The contribution of this paper is to investigate the utility of exploiting words and predicted detailed semantic tags in the long history to enhance a standard trigram language model. The paper builds on earlier work in the field that also used words and tags in the long history, but offers a cleaner and ultimately much more accurate system by integrating the application of these new features directly into the decoding algorithm. The features used in our models are derived from a set of complex questions about the tags and words in the history, written by a linguist. Maximum entropy modelling techniques are then used to combine these features with a standard trigram language model. We evaluate the technique in terms of word error rate on Wall Street Journal test data.
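The combination described above can be illustrated with a minimal log-linear (maximum entropy) sketch. The feature functions, weights, and vocabulary below are invented for illustration; the paper's actual features are complex linguist-written questions, and its weights are trained, not hand-set:

```python
import math

# Hypothetical "question" features over the long history and a candidate word.
# Each returns 1.0 when the question is answered affirmatively, else 0.0.
def f_money_context(word, hist_words, hist_tags):
    # Fires when a MONEY-like tag appeared recently and the candidate is "dollars".
    return 1.0 if word == "dollars" and "MONEY" in hist_tags[-5:] else 0.0

def f_stock_topic(word, hist_words, hist_tags):
    # Fires when "stock" occurred anywhere in the history and the candidate is "market".
    return 1.0 if word == "market" and "stock" in hist_words else 0.0

FEATURES = [f_money_context, f_stock_topic]

def maxent_prob(candidate, trigram_probs, weights, hist_words, hist_tags, vocab):
    """p(w|h) proportional to p_trigram(w|h) * exp(sum_i lambda_i * f_i(w, h)),
    renormalized over the vocabulary."""
    def score(w):
        active = sum(lam * f(w, hist_words, hist_tags)
                     for lam, f in zip(weights, FEATURES))
        return trigram_probs.get(w, 1e-9) * math.exp(active)
    z = sum(score(w) for w in vocab)
    return score(candidate) / z
```

With a uniform trigram baseline, a positive weight on `f_money_context` shifts probability toward "dollars" whenever a MONEY tag is in the recent history, which is the kind of long-history reranking the decoder integration exploits.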



Bibliographic reference. Zhang, Ruiqiang / Black, Ezra / Finch, Andrew / Sagisaka, Yoshinori (2000): "A tagger-aided language model with a stack decoder", in ICSLP-2000, vol. 1, 250-253.