EUROSPEECH 2001 Scandinavia
7th European Conference on Speech Communication and Technology
2nd INTERSPEECH Event

Aalborg, Denmark
September 3-7, 2001


Improved Maximum Mutual Information Estimation Training of Continuous Density HMMs

Jing Zheng, John Butzberger, Horacio Franco, Andreas Stolcke

SRI International, USA

In maximum mutual information estimation (MMIE) training, the update equations in widespread use derive from the Extended Baum-Welch (EBW) algorithm, which was originally designed for discrete hidden Markov models (HMMs) and was extended to continuous Gaussian-density HMMs through approximations. We derive a new set of update equations for MMIE based on a quasi-Newton algorithm, without relying on EBW. We find that adopting a generalized form of the MMIE criterion, the H-criterion, improves both convergence speed and recognition performance. The proposed approach has been applied to a spelled-word recognition task, yielding a 21.6% relative letter error rate reduction with respect to standard Maximum Likelihood Estimation (MLE) training, and showing advantages over the conventional MMIE approach in terms of both training speed and recognition accuracy.
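To make the objectives concrete, the following sketch contrasts the MLE criterion (sum of log joint likelihoods of the correct transcriptions) with the MMIE criterion (sum of log posteriors of the correct transcriptions against all competing hypotheses). The H-criterion is shown here as a simple interpolation of the two objectives with a weight h; this particular combination, the toy probabilities, and the function names are illustrative assumptions, not the exact formulation used in the paper.

```python
import math

def mle_objective(joint_correct):
    """MLE criterion: sum over utterances of log p(X_r, W_r)
    for the correct transcription W_r (toy joint probabilities)."""
    return sum(math.log(p) for p in joint_correct)

def mmie_objective(joint_correct, joint_all):
    """MMIE criterion: sum over utterances of the log posterior
    log p(X_r, W_r) - log sum_W p(X_r, W), where the sum runs over
    all competing hypotheses (here, a small explicit list)."""
    return sum(math.log(pc) - math.log(sum(pa))
               for pc, pa in zip(joint_correct, joint_all))

def h_criterion(joint_correct, joint_all, h):
    """Assumed H-criterion form: MMIE plus a weight h times MLE.
    At h = 0 this reduces to plain MMIE."""
    return (mmie_objective(joint_correct, joint_all)
            + h * mle_objective(joint_correct))

# Toy example: two utterances, each with the joint probability of the
# correct hypothesis and the joint probabilities of all hypotheses.
joint_correct = [0.02, 0.05]
joint_all = [[0.02, 0.01, 0.005], [0.05, 0.04]]

print("MLE :", mle_objective(joint_correct))
print("MMIE:", mmie_objective(joint_correct, joint_all))
print("H(h=0.5):", h_criterion(joint_correct, joint_all, 0.5))
```

Because the posterior of the correct hypothesis is bounded above by 1, maximizing the MMIE objective pushes probability mass away from competing hypotheses rather than simply inflating the likelihood of the correct one, which is the discriminative effect the paper exploits.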


Bibliographic reference.  Zheng, Jing / Butzberger, John / Franco, Horacio / Stolcke, Andreas (2001): "Improved maximum mutual information estimation training of continuous density HMMs", In EUROSPEECH-2001, 679-682.