Sixth European Conference on Speech Communication and Technology
(EUROSPEECH'99)

Budapest, Hungary
September 5-9, 1999

Using Detailed Linguistic Structure in Language Modelling

Ruiqiang Zhang, Ezra Black, Andrew Finch

ATR Interpreting Telecommunications Laboratories, Seika-cho, Soraku-gun, Kyoto, Japan

Recently, considerable attention has been accorded to attempts to apply natural language processing techniques to language modelling for speech recognition. Another extension to the standard n-gram technique has been the use of trigger-pair predictors. In the present experiments, we incorporate into language models information derived from detailed syntactic and semantic parses and taggings. We use a human expert to define the interesting features of the history; these are formalized as triggers and integrated with a trigram language model using the maximum entropy framework. We select maximum entropy because it provides a convenient method of combining multiple information sources. We employ two different kinds of triggering events: those based on knowledge of the full parse of the previous sentences in the document, and those based on knowledge of the syntactic/semantic tags to the left of, and in the same sentence as, the word being predicted. We contrast results obtained using these events plus a baseline n-gram language model, both with the baseline model itself, and with the baseline model plus automatically chosen word triggers. Mutual information selects the best trigger pairs from all candidates generated by combining each of these triggering events with every word in the vocabulary. The grammar and tagset used to express linguistic information about English are unusually detailed: the tagset contains some 3,000 syntactic/semantic tags. Using a 200-million-word training set composed of Wall Street Journal and Associated Press newswire text, we reduced test-set perplexity by 11.3% relative to the baseline model. Further, our method, when combined with long-distance word triggers, reduced test-set perplexity by 21.7%.
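
As a rough illustration (not taken from the paper itself), the sketch below shows the average mutual information criterion typically used to rank candidate trigger pairs before adding them as maximum entropy features; the function name and count arguments are hypothetical.

    import math

    def mutual_information(count_tw, count_t, count_w, total):
        # Average mutual information between a triggering event t and a
        # predicted word w, summed over the four joint outcomes
        # (t or not-t) x (w or not-w).
        # count_tw = events where t fires and w occurs,
        # count_t  = events where t fires,
        # count_w  = events where w occurs,
        # total    = number of prediction events in the training set.
        mi = 0.0
        for t_present in (True, False):
            for w_present in (True, False):
                if t_present and w_present:
                    joint = count_tw
                elif t_present:
                    joint = count_t - count_tw
                elif w_present:
                    joint = count_w - count_tw
                else:
                    joint = total - count_t - count_w + count_tw
                p_joint = joint / total
                p_t = (count_t if t_present else total - count_t) / total
                p_w = (count_w if w_present else total - count_w) / total
                if p_joint > 0.0:
                    mi += p_joint * math.log(p_joint / (p_t * p_w))
        return mi

    # Candidate (triggering event, word) pairs would then be ranked by
    # this score, and the highest-scoring pairs kept as features.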



Bibliographic reference.  Zhang, Ruiqiang / Black, Ezra / Finch, Andrew (1999): "Using detailed linguistic structure in language modelling", In EUROSPEECH'99, 1815-1818.