ISCA Archive Interspeech 2021

Enhancing Semantic Understanding with Self-Supervised Methods for Abstractive Dialogue Summarization

Hyunjae Lee, Jaewoong Yun, Hyunjin Choi, Seongho Joe, Youngjune L. Gwon

Contextualized word embeddings can lead to state-of-the-art performance in natural language understanding. Recently, pre-trained deep contextualized text encoders such as BERT have shown their potential for improving natural language tasks, including abstractive summarization. Existing approaches to dialogue summarization focus on incorporating a large language model into the summarization task, trained on large-scale corpora consisting of news articles rather than dialogues between multiple speakers. In this paper, we introduce self-supervised methods that compensate for these shortcomings when training a dialogue summarization model. Our principle is to detect incoherent information flows via pretext tasks on dialogue text, enhancing BERT’s ability to contextualize dialogue representations. We build and fine-tune an abstractive dialogue summarization model on a shared encoder-decoder architecture using the enhanced BERT. We empirically evaluate our abstractive dialogue summarizer on the SAMSum corpus, a recently introduced dataset with abstractive dialogue summaries. All of our methods contribute improvements to abstractive summarization as measured by ROUGE scores. Through an extensive ablation study, we also present a sensitivity analysis for the critical model hyperparameters: the probabilities of switching utterances and masking interlocutors.
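To make the two pretext tasks named above concrete, the Python sketch below corrupts a dialogue by swapping adjacent utterances with probability p_switch (providing targets for detecting incoherent information flow) and replacing interlocutor names with a mask token with probability p_mask. The function name, probability values, and labeling scheme are illustrative assumptions only, not the authors' implementation.

import random

def corrupt_dialogue(utterances, speakers, p_switch=0.15, p_mask=0.15,
                     mask_token="[MASK]", seed=None):
    # Hypothetical sketch of the two pretext corruptions described in the abstract:
    # (1) mask interlocutor names with probability p_mask,
    # (2) swap adjacent utterances with probability p_switch and record labels
    #     so a model can be trained to detect the resulting incoherent flow.
    rng = random.Random(seed)
    utts = list(utterances)
    spks = [s if rng.random() >= p_mask else mask_token for s in speakers]
    switched = [0] * len(utts)
    for i in range(len(utts) - 1):
        if rng.random() < p_switch:
            utts[i], utts[i + 1] = utts[i + 1], utts[i]
            switched[i] = switched[i + 1] = 1  # positions involved in a swap
    return spks, utts, switched

# Example usage on a toy SAMSum-style dialogue
speakers = ["Amanda", "Jerry", "Amanda"]
utterances = ["Hey, are you coming tonight?", "Yes, around 8.", "Great, see you!"]
print(corrupt_dialogue(utterances, speakers, seed=0))

In this reading, p_switch and p_mask correspond to the hyperparameters examined in the paper's sensitivity analysis.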


doi: 10.21437/Interspeech.2021-1270

Cite as: Lee, H., Yun, J., Choi, H., Joe, S., Gwon, Y.L. (2021) Enhancing Semantic Understanding with Self-Supervised Methods for Abstractive Dialogue Summarization. Proc. Interspeech 2021, 796-800, doi: 10.21437/Interspeech.2021-1270

@inproceedings{lee21_interspeech,
  author={Hyunjae Lee and Jaewoong Yun and Hyunjin Choi and Seongho Joe and Youngjune L. Gwon},
  title={{Enhancing Semantic Understanding with Self-Supervised Methods for Abstractive Dialogue Summarization}},
  year=2021,
  booktitle={Proc. Interspeech 2021},
  pages={796--800},
  doi={10.21437/Interspeech.2021-1270}
}