ISCA Archive ICSLP 2000

Improved performance and generalization of minimum classification error training for continuous speech recognition

Darryl W. Purnell, Elizabeth C. Botha

Discriminative training of hidden Markov models (HMMs) using segmental minimum classification error (MCE) training has been shown to work extremely well for certain speech recognition applications. It is, however, somewhat prone to overspecialization. This study investigates various techniques that improve the performance and generalization of the MCE algorithm. Relative error-rate improvements of up to 7% on the test set are achieved.
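For context, MCE training minimizes a smoothed count of classification errors rather than a likelihood. Below is a minimal sketch of the standard MCE loss for a single training token (the Juang-Katagiri formulation; the paper's exact segmental variant and parameter choices may differ, and the function and parameter names here are illustrative):

```python
import math

def mce_loss(scores, correct, eta=2.0, gamma=1.0):
    """Smoothed MCE loss for one training token (illustrative sketch).

    scores:  discriminant score (e.g. log-likelihood) per class
    correct: index of the correct class
    eta:     exponent weighting the competing classes
    gamma:   sigmoid slope (how sharply errors are counted)
    """
    # Misclassification measure: correct-class score versus a soft
    # maximum over the competing classes.
    competitors = [s for i, s in enumerate(scores) if i != correct]
    soft_max = (1.0 / eta) * math.log(
        sum(math.exp(eta * s) for s in competitors) / len(competitors)
    )
    d = -scores[correct] + soft_max
    # A sigmoid turns d into a differentiable 0..1 error count,
    # which gradient descent can then minimize over the HMM parameters.
    return 1.0 / (1.0 + math.exp(-gamma * d))

# Correct class clearly wins: loss near 0.
print(mce_loss([5.0, 1.0, 0.5], correct=0))
# Correct class loses to a competitor: loss near 1.
print(mce_loss([0.0, 5.0, 0.5], correct=0))
```

Because the sigmoid saturates, tokens that are already classified very confidently contribute almost no gradient; training effort concentrates on tokens near the decision boundary, which is also why aggressive MCE training can overspecialize to the training set.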

Keywords: speech recognition, discriminative training, minimum classification error, overspecialization, overtraining


Cite as: Purnell, D.W., Botha, E.C. (2000) Improved performance and generalization of minimum classification error training for continuous speech recognition. Proc. 6th International Conference on Spoken Language Processing (ICSLP 2000), vol. 4, 165-168

@inproceedings{purnell00_icslp,
  author={Darryl W. Purnell and Elizabeth C. Botha},
  title={{Improved performance and generalization of minimum classification error training for continuous speech recognition}},
  year=2000,
  booktitle={Proc. 6th International Conference on Spoken Language Processing (ICSLP 2000)},
  volume={4},
  pages={165--168}
}