ISCA Archive Interspeech 2006

Minimum divergence based discriminative training

Jun Du, Peng Liu, Frank K. Soong, Jian-Lai Zhou, Ren-Hua Wang

We propose to use Minimum Divergence (MD) as a new measure of errors in discriminative training. To focus on improving discrimination between any two given acoustic models, we refine the error definition in terms of the Kullback-Leibler Divergence (KLD) between them. The new measure can be regarded as a modified version of Minimum Phone Error (MPE), but with a higher resolution than a purely symbol-matching-based criterion. Experimental recognition results show that the new MD-based training yields relative word error rate reductions of 57.8% and 6.1% on the TIDigits and Switchboard databases, respectively, compared with the ML-trained baseline systems. The recognition performance of MD is also shown to be consistently better than that of MPE.
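For reference, the standard closed-form KLD between two d-dimensional Gaussians, the usual building block of HMM state output distributions, is shown below; how the paper aggregates such divergences into its MD error measure is not detailed in this abstract, so this is only an illustrative sketch of the quantity involved:

  D_{KL}(\mathcal{N}_1 \,\|\, \mathcal{N}_2)
    = \frac{1}{2}\left[ \operatorname{tr}\!\left(\Sigma_2^{-1}\Sigma_1\right)
      + (\mu_2-\mu_1)^{\top}\Sigma_2^{-1}(\mu_2-\mu_1)
      - d + \ln\frac{\det\Sigma_2}{\det\Sigma_1} \right]

Here \mu_i and \Sigma_i denote the mean and covariance of Gaussian \mathcal{N}_i. Unlike the 0/1 symbol-matching accuracy used in MPE, such a divergence varies continuously with the model parameters, which is what gives the MD criterion its higher resolution.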


doi: 10.21437/Interspeech.2006-604

Cite as: Du, J., Liu, P., Soong, F.K., Zhou, J.-L., Wang, R.-H. (2006) Minimum divergence based discriminative training. Proc. Interspeech 2006, paper 1703-Thu2A1O.2, doi: 10.21437/Interspeech.2006-604

@inproceedings{du06_interspeech,
  author={Jun Du and Peng Liu and Frank K. Soong and Jian-Lai Zhou and Ren-Hua Wang},
  title={{Minimum divergence based discriminative training}},
  year=2006,
  booktitle={Proc. Interspeech 2006},
  pages={paper 1703-Thu2A1O.2},
  doi={10.21437/Interspeech.2006-604}
}