INTERSPEECH 2008
9th Annual Conference of the International Speech Communication Association

Brisbane, Australia
September 22-26, 2008

Flexible Discriminative Training Based on Equal Error Group Scores Obtained from an Error-Indexed Forward-Backward Algorithm

Erik McDermott, Atsushi Nakamura

NTT Corporation, Japan

This article presents a new approach to discriminative training that uses equal error groups of word strings as the unit of weighted error modeling. The proposed approach, Minimum Group Error (MGE), is based on a novel error-indexed Forward-Backward algorithm that can be used to generate group scores efficiently over standard recognition lattices. The approach offers many possibilities for group occupancy scaling, enabling, for instance, the boosting of error groups with low occupancies. Preliminary experiments examined the new approach using both uniformly and non-uniformly scaled group scores. Results for the new approach, evaluated on the Corpus of Spontaneous Japanese (CSJ) lecture speech transcription task, were compared with results for standard Minimum Classification Error (MCE), Minimum Phone Error (MPE) and Maximum Mutual Information (MMI), in tandem with I-smoothing. It was found that non-uniform scaling of group scores outperformed MPE when no I-smoothing was used.
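The error-indexed variant introduced in the paper is not reproduced here, but as background, the following is a minimal sketch of the standard Forward-Backward recursion it builds on: computing posterior state occupancies over a linear chain in the log domain. All scores and the uniform initial prior are illustrative assumptions, not details from the paper.

```python
import numpy as np

def logsumexp(x, axis=None):
    """Numerically stable log-sum-exp reduction."""
    m = np.max(x, axis=axis, keepdims=True)
    out = m + np.log(np.sum(np.exp(x - m), axis=axis, keepdims=True))
    return np.squeeze(out, axis=axis) if axis is not None else out.item()

def forward_backward(log_trans, log_emit):
    """Standard forward-backward over a linear-chain model.

    log_trans: (S, S) log transition scores between states.
    log_emit:  (T, S) log emission scores per frame and state.
    Returns (T, S) posterior state occupancies (gamma).
    """
    T, S = log_emit.shape
    alpha = np.empty((T, S))
    beta = np.empty((T, S))
    alpha[0] = log_emit[0]  # uniform initial state prior (assumption)
    for t in range(1, T):
        # alpha[t, j] = logsum_i(alpha[t-1, i] + trans[i, j]) + emit[t, j]
        alpha[t] = logsumexp(alpha[t - 1][:, None] + log_trans, axis=0) + log_emit[t]
    beta[T - 1] = 0.0
    for t in range(T - 2, -1, -1):
        # beta[t, i] = logsum_j(trans[i, j] + emit[t+1, j] + beta[t+1, j])
        beta[t] = logsumexp(log_trans + (log_emit[t + 1] + beta[t + 1])[None, :], axis=1)
    log_z = logsumexp(alpha[T - 1])  # total log score of all paths
    return np.exp(alpha + beta - log_z)
```

The MGE approach described in the abstract indexes these quantities additionally by error count, so that occupancies accumulate per equal-error group rather than per state alone; the standard recursion above corresponds to collapsing all error groups into one.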


Bibliographic reference. McDermott, Erik / Nakamura, Atsushi (2008): "Flexible discriminative training based on equal error group scores obtained from an error-indexed forward-backward algorithm", In INTERSPEECH-2008, 2398-2401.