ISCA Archive Interspeech 2008

Penalty function maximization for large margin HMM training

George Saon, Daniel Povey

We perform large margin training of HMM acoustic parameters by maximizing a penalty function that combines two terms. The first term is a scale factor that multiplies the Hamming distance between HMM state sequences to form a multi-label (or sequence) margin. The second term arises from constraints on the training data requiring that the joint log-likelihood of the acoustics and the correct word sequence exceed the joint log-likelihood of the acoustics and each incorrect word sequence by at least the multi-label margin between the corresponding Viterbi state sequences. Using the soft-max trick, we collapse these constraints into a boosted MMI-like term. The resulting objective function can be efficiently maximized using extended Baum-Welch updates. Experimental results on multiple LVCSR tasks show a good correlation between the objective function and the word error rate.
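The soft-max trick mentioned in the abstract replaces the hard minimum over per-hypothesis margin constraints with a smooth log-sum-exp surrogate, which is what turns the set of constraints into a single boosted-MMI-like term. A minimal sketch of that collapsing step, with purely illustrative log-likelihoods, Hamming distances, and margin scale (none of these numbers come from the paper):

```python
import math

def softmin(values):
    # Smooth lower bound on min via the soft-max trick:
    # min_i v_i is approximated by -log(sum_i exp(-v_i)).
    return -math.log(sum(math.exp(-v) for v in values))

# Hypothetical per-utterance quantities (illustrative only):
rho = 0.5                      # margin scale (the first term of the penalty)
ll_correct = -100.0            # joint log-likelihood of acoustics + correct words
competitors = [                # (joint log-likelihood, Hamming distance of its
    (-104.0, 6),               #  Viterbi state sequence to the correct one)
    (-103.0, 4),
    (-108.0, 10),
]

# Each constraint reads: ll_correct - ll_w >= rho * hamming_w,
# i.e. the slack  ll_correct - ll_w - rho * hamming_w  should be nonnegative.
slacks = [ll_correct - ll - rho * h for ll, h in competitors]

# Collapsing all constraints with log-sum-exp gives one differentiable term,
# equivalently -log(sum_w exp(ll_w + rho * hamming_w - ll_correct)):
penalty_term = softmin(slacks)
```

Because the sum inside the log dominates its largest summand, `penalty_term` is always a lower bound on the smallest slack, so maximizing it pushes every constraint toward satisfaction at once.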


doi: 10.21437/Interspeech.2008-108

Cite as: Saon, G., Povey, D. (2008) Penalty function maximization for large margin HMM training. Proc. Interspeech 2008, 920-923, doi: 10.21437/Interspeech.2008-108

@inproceedings{saon08_interspeech,
  author={George Saon and Daniel Povey},
  title={{Penalty function maximization for large margin HMM training}},
  year=2008,
  booktitle={Proc. Interspeech 2008},
  pages={920--923},
  doi={10.21437/Interspeech.2008-108}
}