INTERSPEECH 2013
14th Annual Conference of the International Speech Communication Association

Lyon, France
August 25-29, 2013

Multilingual Multilayer Perceptron for Rapid Language Adaptation Between and Across Language Families

Ngoc Thang Vu, Tanja Schultz

KIT, Germany

In this paper, we present our latest investigations of multilingual Multilayer Perceptrons (MLPs) for rapid language adaptation between and across language families. We explore the impact of the number of languages and the amount of data used to train the multilingual MLP. We show that overall system performance on the target language improves significantly when the target MLP is initialized with a multilingual MLP. Our experiments indicate that the more languages we use to train the multilingual MLP, the better the initialization for target-language MLP training. As a result, ASR performance improves even when the target language and the source languages do not belong to the same language family. Our best results show an error rate improvement of up to 22.9% relative for different target languages (Czech, Hausa, and Vietnamese) using a multilingual MLP trained on many different languages from the GlobalPhone corpus. When only very little training or adaptation data is available, we observe an improvement of up to 24% relative in terms of error rate.
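The cross-lingual warm-start described in the abstract can be sketched as follows. This is a minimal illustration only: the layer sizes, feature dimension, phone-set sizes, and the plain numpy MLP are assumptions for the sketch, not the paper's actual topology or training procedure. The idea is to keep the input-to-hidden weights learned on pooled multilingual data and replace only the output layer, sized for the target language's phone set.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(n_in, n_hidden, n_out, rng):
    """Randomly initialize a one-hidden-layer MLP (weights and biases)."""
    return {
        "W1": rng.normal(0.0, 0.1, (n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(0.0, 0.1, (n_hidden, n_out)),
        "b2": np.zeros(n_out),
    }

def forward(mlp, x):
    """Sigmoid hidden layer, softmax output (phone posteriors)."""
    h = 1.0 / (1.0 + np.exp(-(x @ mlp["W1"] + mlp["b1"])))
    z = h @ mlp["W2"] + mlp["b2"]
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical multilingual MLP: in the paper this would be trained on
# pooled source-language data; the training loop is omitted here.
ml_mlp = init_mlp(39, 100, 140, rng)   # e.g. 140 = pooled multilingual phone set

# Rapid adaptation: initialize the target-language MLP with the
# multilingual hidden layer; only the output layer (45 target phones,
# an illustrative number) starts from scratch.
target_mlp = init_mlp(39, 100, 45, rng)
target_mlp["W1"] = ml_mlp["W1"].copy()
target_mlp["b1"] = ml_mlp["b1"].copy()
# ...then fine-tune target_mlp on the (possibly small) target-language data.
```

With this initialization, the hidden layer already encodes language-independent feature detectors, which is what makes training effective even with very little target-language data.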


Bibliographic reference. Vu, Ngoc Thang / Schultz, Tanja (2013): "Multilingual multilayer perceptron for rapid language adaptation between and across language families", in INTERSPEECH-2013, 515-519.