EUROSPEECH 2003 - INTERSPEECH 2003
8th European Conference on Speech Communication and Technology

Geneva, Switzerland
September 1-4, 2003


Pruning Transitions in a Hidden Markov Model with Optimal Brain Surgeon

Brian Mak, Kin-Wah Chan

Hong Kong University of Science & Technology, China

This paper concerns reducing the topology of a hidden Markov model (HMM) for a given task. The purpose is two-fold: (1) to select a good model topology with improved generalization capability; and/or (2) to reduce the model complexity so as to save memory and computation costs. The first goal falls into the active research area of model selection. In the model-theoretic research community, various measures such as the Bayesian information criterion, minimum description length, and minimum message length have been proposed and used with some success. In this paper, we consider another approach in which a well-performing, though perhaps oversized, HMM is optimally pruned so that the loss in the model training cost function is minimal. The method, known as Optimal Brain Surgeon (OBS), has been used in the neural network (NN) community. The application of OBS to NNs is a constrained optimization problem; its application to HMMs is more involved and becomes a quadratic programming problem with both equality and inequality constraints. We present the detailed formulation and show the algorithm to be effective in an example where HMM state transitions are pruned. The reduced model also achieves better generalization performance on unseen test data.
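As background, the following is the classical OBS formulation from the neural-network literature, written in standard notation; it is a sketch of the underlying idea, not the authors' exact HMM derivation. OBS prunes the parameter whose removal least increases the training cost, measured to second order by the Hessian $H$. Pruning weight $w_q$ solves

\[
\min_{\delta\mathbf{w}} \; \tfrac{1}{2}\,\delta\mathbf{w}^{\top} H\,\delta\mathbf{w}
\qquad \text{subject to} \qquad
\mathbf{e}_q^{\top}\delta\mathbf{w} + w_q = 0 ,
\]

whose solution yields the saliency of $w_q$,

\[
L_q = \frac{w_q^2}{2\,[H^{-1}]_{qq}} .
\]

For an HMM, pruning a transition probability $a_{ij}$ must additionally preserve stochasticity: the adjusted transitions of each state $i$ must satisfy $\sum_j (a_{ij} + \delta a_{ij}) = 1$ (equality constraints) and $a_{ij} + \delta a_{ij} \ge 0$ (inequality constraints), which is why the HMM case becomes a quadratic program rather than the equality-constrained problem above.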


Bibliographic reference. Mak, Brian / Chan, Kin-Wah (2003): "Pruning transitions in a hidden Markov model with optimal brain surgeon", in EUROSPEECH-2003, pp. 2521-2524.