The classical dynamic programming recursions for the forward-backward and Viterbi HMM algorithms require space linear in the number of time frames being processed. Adapting the method of [1] to the context of speech recognition, this paper uses a recursive divide-and-conquer algorithm to reduce the space requirement to logarithmic in the number of frames. With this procedure, it is possible to do exact computations for observation sequences of essentially arbitrary length. The procedure works by manipulating a stack of alpha vectors, and by using sparse vectors, the space savings can be combined with those of traditional pruning techniques. We apply this technique to MAP lattice construction, and present the first results in the literature for that technique. We find that it is an effective way of creating word lattices, and that doing the exact computations enabled by the log-space technique results in lower word error rates than saving space via traditional pruning.
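The divide-and-conquer recursion described above can be sketched as follows. This is a minimal NumPy illustration of the log-space idea from [1], not the authors' implementation: function and variable names are my own, dense scaled-probability vectors stand in for the paper's sparse alpha vectors, and the per-frame callback stands in for whatever quantity (e.g. MAP posteriors) is emitted at each frame. The key point is that only one alpha vector is held per recursion level, so memory is O(log T) at the cost of O(T log T) recomputation.

```python
import numpy as np

def forward_step(alpha, A, b):
    # One forward recursion step: alpha_t is proportional to
    # (alpha_{t-1} @ A) * b_t, rescaled to sum to 1 to avoid underflow.
    v = (alpha @ A) * b
    return v / v.sum()

def forward_logspace(pi, A, B, obs, emit):
    # Divide-and-conquer forward pass in the spirit of [1]: instead of
    # storing all T alpha vectors, keep one per recursion level (the
    # "stack of alpha vectors"), i.e. O(log T) vectors in total.
    # pi: initial state distribution; A: transition matrix;
    # B[:, k]: emission likelihoods for symbol k; obs: observation indices;
    # emit(t, alpha_t): callback invoked once per frame, in frame order.
    # (All names here are illustrative, not taken from the paper.)
    def recurse(alpha, lo, hi):
        # 'alpha' is the forward vector for frame lo; emit frames [lo, hi).
        if hi - lo == 1:
            emit(lo, alpha)
            return
        mid = (lo + hi) // 2
        a = alpha
        for t in range(lo + 1, mid + 1):  # recompute alpha at the midpoint
            a = forward_step(a, A, B[:, obs[t]])
        recurse(alpha, lo, mid)  # left half reuses the vector we were given
        recurse(a, mid, hi)      # right half starts from the midpoint vector

    alpha0 = pi * B[:, obs[0]]
    recurse(alpha0 / alpha0.sum(), 0, len(obs))

def forward_linear(pi, A, B, obs):
    # Reference O(T)-space forward pass for comparison.
    a = pi * B[:, obs[0]]
    a = a / a.sum()
    alphas = [a]
    for t in range(1, len(obs)):
        a = forward_step(a, A, B[:, obs[t]])
        alphas.append(a)
    return alphas

# Small 2-state HMM sanity check: both passes yield identical alpha vectors.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.5, 0.5], [0.1, 0.9]])
obs = [0, 1, 0, 1, 1, 0, 1]

got = {}
forward_logspace(pi, A, B, obs, lambda t, a: got.__setitem__(t, a))
ref = forward_linear(pi, A, B, obs)
assert all(np.allclose(got[t], ref[t]) for t in range(len(obs)))
```

The recursion computes the midpoint alpha once per level and then forgets the intermediate frames, recovering them later by recomputation; replacing the dense vectors with pruned sparse ones, as the abstract notes, compounds the savings.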
[1] J. Binder, K. Murphy, and S. Russell. Space-Efficient Inference in Dynamic Probabilistic Networks. Proc. 15th International Joint Conference on Artificial Intelligence (IJCAI), 1997.
Cite as: Zweig, G., Padmanabhan, M. (2000) Exact alpha-beta computation in logarithmic space with application to MAP word graph construction. Proc. 6th International Conference on Spoken Language Processing (ICSLP 2000), vol. 2, 855-858, doi: 10.21437/ICSLP.2000-404