INTERSPEECH 2004 - ICSLP
8th International Conference on Spoken Language Processing

Jeju Island, Korea
October 4-8, 2004

A Study of Minimum Classification Error Training for Segmental Switching Linear Gaussian Hidden Markov Models

Jian Wu, Donglai Zhu, Qiang Huo

The University of Hong Kong, Hong Kong

In our previous work, a Switching Linear Gaussian Hidden Markov Model (SLGHMM) and its segmental derivative, the SSLGHMM, were proposed to cast the problem of modelling a noisy speech utterance as inference in a well-designed dynamic Bayesian network. We presented parameter learning procedures for both models under the maximum likelihood (ML) criterion, and the effectiveness of such models was confirmed by evaluation experiments on the Aurora2 database. In this paper, we present a study of minimum classification error (MCE) training for the SSLGHMM and discuss its relation to our earlier proposals based on stochastic vector mapping. An important implementation issue of the SSLGHMM, namely the specification of switching states for a given utterance, is also studied. New evaluation results on the Aurora3 database show that MCE-trained SSLGHMMs achieve a relative error reduction of 21% over a baseline system based on ML-trained continuous density HMMs (CDHMMs).
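For readers unfamiliar with the criterion, the sketch below illustrates the standard embedded MCE loss of Juang and Katagiri that such training builds on: a soft misclassification measure comparing the correct class's discriminant score against a soft-max over competing classes, passed through a sigmoid. This is a generic illustration under assumed smoothing constants, not the paper's actual SSLGHMM implementation.

```python
import math

def mce_loss(scores, correct, eta=1.0, gamma=1.0):
    """Smoothed classification-error loss for one training token.

    scores  -- per-class discriminant scores g_j(X), e.g. log-likelihoods
    correct -- index of the true class
    eta     -- smoothing constant of the misclassification measure
    gamma   -- slope of the sigmoid loss
    (eta and gamma values here are illustrative assumptions.)
    """
    competitors = [s for i, s in enumerate(scores) if i != correct]
    # Anti-discriminant: eta-smoothed soft-max over competing classes.
    anti = math.log(sum(math.exp(eta * s) for s in competitors)
                    / len(competitors)) / eta
    # Misclassification measure: positive when a competitor wins.
    d = -scores[correct] + anti
    # Sigmoid maps d to a differentiable 0-1 error count.
    return 1.0 / (1.0 + math.exp(-gamma * d))
```

Summing this loss over the training set gives a differentiable proxy for the empirical error rate, which can then be minimized by gradient-style updates of the model parameters.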


Bibliographic reference. Wu, Jian / Zhu, Donglai / Huo, Qiang (2004): "A study of minimum classification error training for segmental switching linear Gaussian hidden Markov models", in INTERSPEECH-2004, 2813-2816.