Third International Conference on Spoken Language Processing (ICSLP 94)
In this paper, we describe the use of recurrent Multilayer Perceptrons (MLPs) as state probability estimators for word models. The advantage over conventional word Hidden Markov Models (HMMs) is the ease of discriminative training of the models. We find that enforcing a minimal state duration is useful. The results on a telephone-quality, speaker-independent digit recognition task compare favorably with those of the approach we presented earlier this year (Le Cerf et al.).
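The abstract does not spell out how the minimal state duration is enforced. As an illustration only (the function name, the left-to-right topology, and the omission of explicit transition probabilities are our assumptions, not details from the paper), the standard trick is to expand each state into `min_dur` sub-states inside a Viterbi decode over the per-frame state probabilities produced by the recurrent MLP, so that a state cannot be left before it has been occupied for `min_dur` frames:

```python
import numpy as np

def viterbi_min_duration(frame_logp, min_dur):
    """Left-to-right Viterbi decode with a minimum state duration.

    frame_logp: (T, S) array of per-frame log state probabilities,
    e.g. the scaled outputs of a recurrent MLP used as a state
    probability estimator.  Each state must be occupied for at
    least min_dur consecutive frames; this is enforced by
    expanding each state into min_dur sub-states.  Transition
    probabilities are assumed uniform and omitted.
    """
    T, S = frame_logp.shape
    E = S * min_dur                       # expanded state space
    delta = np.full(E, -np.inf)
    delta[0] = frame_logp[0, 0]           # decode starts in state 0
    back = np.zeros((T, E), dtype=int)

    for t in range(1, T):
        new = np.full(E, -np.inf)
        for e in range(E):
            s, d = divmod(e, min_dur)
            preds = []
            if d == min_dur - 1:
                preds.append(e)           # self-loop once min_dur is met
            if d > 0:
                preds.append(e - 1)       # advance the duration counter
            elif s > 0:
                preds.append(s * min_dur - 1)  # enter from previous state
            if preds:
                best = max(preds, key=lambda p: delta[p])
                if delta[best] > -np.inf:
                    new[e] = delta[best] + frame_logp[t, s]
                    back[t, e] = best
        delta = new

    # backtrace from the final sub-state of the last state
    e = E - 1
    path = [e // min_dur]
    for t in range(T - 1, 0, -1):
        e = back[t, e]
        path.append(e // min_dur)
    return path[::-1]
```

With `min_dur=1` this reduces to an ordinary left-to-right Viterbi pass; larger values rule out implausibly short state occupancies, which is the effect the paper reports as useful.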
Bibliographic reference. Le Cerf, Philippe / Van Compernolle, Dirk (1994): "Recurrent neural network word models for small vocabulary speech recognition", in Proc. ICSLP-1994, 1547-1550.