The contribution of this paper is to investigate the utility of exploiting words and predicted detailed semantic tags in the long history to enhance a standard trigram language model. The paper builds on earlier work in the field that also used words and tags in the long history, but offers a cleaner and ultimately much more accurate system by integrating the application of these new features directly into the decoding algorithm. The features used in our models are derived from a set of complex questions about the tags and words in the history, written by a linguist. Maximum entropy modelling techniques are then used to combine these features with a standard trigram language model. We evaluate the technique in terms of word error rate on Wall Street Journal test data.
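To make the combination concrete, the sketch below illustrates the general form of a maximum entropy language model that merges a trigram feature with a long-history tag feature: P(w | h) is proportional to exp(sum_i lambda_i f_i(h, w)), normalized over the vocabulary. The feature templates, tag names, and weights here are illustrative assumptions, not the paper's actual linguist-written questions; real weights would be estimated from training data (e.g., by generalized iterative scaling).

    import math

    def features(history_words, history_tags, word):
        """Active binary features for (history, candidate word)."""
        # Standard trigram feature: last two history words plus the candidate.
        feats = [("trigram",) + tuple(history_words[-2:]) + (word,)]
        # Hypothetical long-history question: has a MONEY-tagged word
        # occurred anywhere in the history? (illustrative tag name)
        if "MONEY" in history_tags:
            feats.append(("money_in_history", word))
        return feats

    def maxent_prob(history_words, history_tags, word, weights, vocab):
        """P(word | history) under a log-linear (maximum entropy) model."""
        def unnorm(w):
            return math.exp(sum(weights.get(f, 0.0)
                                for f in features(history_words, history_tags, w)))
        z = sum(unnorm(w) for w in vocab)   # partition function Z(history)
        return unnorm(word) / z

    # Toy usage with made-up weights; a trained model would have one weight
    # per feature, fit to maximize training-data likelihood.
    vocab = ["rose", "fell", "said", "the"]
    weights = {
        ("trigram", "shares", "yesterday", "rose"): 1.2,
        ("money_in_history", "rose"): 0.8,
    }
    p = maxent_prob(["company", "shares", "yesterday"],
                    ["ORG", "MONEY", "TIME"], "rose", weights, vocab)
    print(f"P(rose | history) = {p:.3f}")

A design point worth noting: because the tag questions are just additional features in the log-linear model, their scores can be computed for each hypothesis as the decoder extends it, which is what allows the long-history information to be applied directly inside the decoding algorithm rather than in a rescoring pass.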