Self Attention in Variational Sequential Learning for Summarization

Jen-Tzung Chien, Chun-Wei Wang


The attention mechanism plays a crucial role in sequential learning for many speech and language applications. However, it is challenging to develop stochastic attention in a sequence-to-sequence model consisting of two recurrent neural networks (RNNs) as the encoder and decoder. Posterior collapse occurs in variational inference when the estimated latent variables stay close to the standard Gaussian prior, so that information from the input sequence is disregarded during learning. This paper presents a new recurrent autoencoder for sentence representation in which a self attention scheme is incorporated to activate the interaction between inference and generation during training. In particular, a stochastic RNN decoder is implemented to provide additional latent variables that fulfill self attention for sentence reconstruction. Posterior collapse is thereby alleviated, and the latent information is sufficiently attended in variational sequential learning. During the test phase, the estimated prior distribution of the decoder is sampled for stochastic attention and generation. Experiments on Penn Treebank and Yelp 2013 show desirable generation performance in terms of perplexity, and visualization of the attention weights illustrates the usefulness of self attention. Evaluation on DUC 2007 demonstrates the merit of the variational recurrent autoencoder for document summarization.
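The posterior-collapse problem mentioned above can be made concrete with the standard VAE ingredients: a diagonal Gaussian posterior sampled via the reparameterization trick, and a closed-form KL term against the standard Gaussian prior. The following is a minimal NumPy sketch of those two pieces, not the paper's actual model; the function names and toy values are illustrative assumptions. When the posterior degenerates to the prior (mean zero, unit variance), the KL term vanishes and the decoder can ignore the latent variable entirely, which is precisely the collapse the paper's self attention scheme is designed to counteract.

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    # Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, I),
    # so gradients can flow through the sampling step.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    # Closed-form KL( N(mu, diag(sigma^2)) || N(0, I) ), summed over dimensions.
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)

rng = np.random.default_rng(0)

# A healthy posterior: nonzero mean, non-unit variance -> positive KL.
mu = np.array([0.5, -0.3])
log_var = np.array([-0.1, 0.2])
z = reparameterize(mu, log_var, rng)
kl = kl_to_standard_normal(mu, log_var)

# Posterior collapse: mu -> 0 and log_var -> 0 make the posterior equal
# to the prior, the KL term drops to zero, and z carries no input information.
kl_collapsed = kl_to_standard_normal(np.zeros(2), np.zeros(2))
```

In a full variational sequence model these quantities would appear per time step inside the stochastic RNN decoder; the sketch only isolates why a vanishing KL term signals that the latent variables are being disregarded.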


 DOI: 10.21437/Interspeech.2019-1548

Cite as: Chien, J., Wang, C. (2019) Self Attention in Variational Sequential Learning for Summarization. Proc. Interspeech 2019, 1318-1322, DOI: 10.21437/Interspeech.2019-1548.


@inproceedings{Chien2019,
  author={Jen-Tzung Chien and Chun-Wei Wang},
  title={{Self Attention in Variational Sequential Learning for Summarization}},
  year=2019,
  booktitle={Proc. Interspeech 2019},
  pages={1318--1322},
  doi={10.21437/Interspeech.2019-1548},
  url={http://dx.doi.org/10.21437/Interspeech.2019-1548}
}