International Workshop on Spoken Language Translation (IWSLT) 2010
Neural Network Language Models (NNLMs) have been applied
to Statistical Machine Translation (SMT), improving
translation quality. N-best list rescoring is the most
popular approach to cope with the computational cost
of using large NNLMs. However, the question of
how much improvement could be achieved in a tightly coupled
system remains unanswered. This open question motivated
our previous work on speeding up the evaluation
of NNLMs. The present work integrates the NNLM evaluation
into the core of the SMT decoder. NNLMs are used in combination
with standard statistical N-gram language models
under the maximum entropy framework in an N-gram-based
SMT system. The decoder builds a reordering graph
that is traversed during Viterbi decoding.
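The maximum entropy framework mentioned above amounts to a log-linear combination of model scores. The following is a minimal sketch of that idea, assuming illustrative feature names and weights (none of the numbers or identifiers come from the paper):

```python
import math

def loglinear_score(features, weights):
    """Log-linear (maximum entropy) score of a translation hypothesis:
    a weighted sum of the log-scores of each model (feature function).
    `features` maps a feature name to its log-probability;
    `weights` maps the same names to their tuned scaling factors."""
    return sum(weights[name] * logprob for name, logprob in features.items())

# Hypothetical hypothesis scored by three models: a standard n-gram LM,
# an NNLM, and a translation model. All values are made up for illustration.
features = {
    "ngram_lm": math.log(0.02),  # standard N-gram language model
    "nnlm": math.log(0.05),      # neural network language model
    "tm": math.log(0.10),        # translation model
}
weights = {"ngram_lm": 0.5, "nnlm": 0.3, "tm": 1.0}

score = loglinear_score(features, weights)
```

In decoding, such a score would be computed for each hypothesis and the highest-scoring one kept; integrating the NNLM into the decoder means its log-probability is available as a feature during search rather than only in a later N-best rescoring pass.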
This N-gram-based SMT system enhanced with NNLMs for the French-English BTEC task of the IWSLT'10 evaluation campaign is described in detail. The official primary system improved on the baseline system by between 1.8 and 2.4 BLEU points, and was ranked second in the automatic evaluation of the official IWSLT'10 results.
Bibliographic reference. Zamora-Martínez, Francisco / Castro-Bleda, María José / Schwenk, Holger (2010): "N-gram-based machine translation enhanced with neural networks for the French-English BTEC-IWSLT'10 task", In IWSLT-2010, 45-52.