Virtual Adversarial Training Applied to Neural Higher-Order Factors for Phone Classification

Martin Ratajczak, Sebastian Tschiatschek, Franz Pernkopf


We explore virtual adversarial training (VAT) applied to neural higher-order conditional random fields for sequence labeling. VAT is a recently introduced regularization method promoting local distributional smoothness: it counteracts the problem that the predictions of many state-of-the-art classifiers are unstable under adversarial perturbations. Unlike random noise, adversarial perturbations are minimal, bounded perturbations that flip the predicted label. We utilize VAT to regularize neural higher-order factors in conditional random fields. Such factors are important, for example, for phone classification, where phone representations depend strongly on the context phones. Without VAT regularization, however, the use of such factors was limited, as they were prone to overfitting. In extensive experiments, we successfully apply VAT to improve performance on the TIMIT phone classification task. In particular, we achieve a phone error rate of 13.0%, surpassing the previous state of the art by a wide margin.
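The VAT idea sketched in the abstract — find the small perturbation that most changes the model's predictive distribution, then penalize that change — can be illustrated with a minimal, self-contained sketch. This is not the paper's implementation: it uses a toy linear-softmax model, plain NumPy, and finite differences in place of backpropagation, and all names and constants (`eps`, `xi`, `W`) are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def kl(p, q):
    # KL divergence between two discrete distributions (small floor for stability)
    return float(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))))

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))  # toy model: 3 classes, 4 input features

def predict(x):
    return softmax(W @ x)

def vat_loss(x, eps=1.0, xi=1e-6, n_power=1):
    """Virtual adversarial loss: KL between predictions at x and at x + r_adv,
    where r_adv (||r_adv|| = eps) approximately maximizes that KL.
    The adversarial direction is found by power iteration; gradients are taken
    by finite differences here purely for the sake of a dependency-free sketch."""
    p = predict(x)
    d = rng.normal(size=x.shape)
    d /= np.linalg.norm(d)
    h = 1e-5
    for _ in range(n_power):
        base = kl(p, predict(x + xi * d))
        grad = np.zeros_like(d)
        for i in range(len(d)):
            d2 = d.copy()
            d2[i] += h
            grad[i] = (kl(p, predict(x + xi * d2)) - base) / h
        d = grad / (np.linalg.norm(grad) + 1e-12)
    r_adv = eps * d
    return kl(p, predict(x + r_adv))

x = rng.normal(size=4)
print("VAT loss at x:", vat_loss(x))
```

In training, this KL term would be added to the supervised loss; because it needs no labels, it can also regularize unlabeled inputs, which is what makes the smoothness penalty "virtual".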


DOI: 10.21437/Interspeech.2016-832

Cite as

Ratajczak, M., Tschiatschek, S., Pernkopf, F. (2016) Virtual Adversarial Training Applied to Neural Higher-Order Factors for Phone Classification. Proc. Interspeech 2016, 2756-2760.

BibTeX
@inproceedings{Ratajczak+2016,
  author={Martin Ratajczak and Sebastian Tschiatschek and Franz Pernkopf},
  title={Virtual Adversarial Training Applied to Neural Higher-Order Factors for Phone Classification},
  year={2016},
  booktitle={Interspeech 2016},
  doi={10.21437/Interspeech.2016-832},
  url={http://dx.doi.org/10.21437/Interspeech.2016-832},
  pages={2756--2760}
}