ISCA Archive IberSPEECH 2022

An Attentional Extractive Summarization Framework

José Ángel González, Encarna Segarra, Fernando García-Granada, Emilio Sanchis, Lluis-F Hurtado

Although current work on text summarization generally follows abstractive approaches, extractive methods can be especially suitable for some applications, and they can also support other tasks such as Question Answering or Information Extraction. In this paper, we propose a general framework for extractive summarization, the Attentional Extractive Summarization framework. The proposed approach is based on interpreting the attention mechanisms of hierarchical neural networks that compute document-level representations of documents and summaries from sentence-level representations, which are in turn computed from word-level representations. The models proposed under this framework automatically learn relationships between document and summary sentences, without requiring oracle systems to compute reference labels for each sentence before training. We evaluate two systems, formalized under the proposed framework, on the CNN/DailyMail and NewsRoom corpora, which are among the reference corpora in the most relevant work on text summarization. The evaluation results support the adequacy of our proposal and suggest that there is still room to improve our attentional framework.
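To illustrate the hierarchical attentional reading described in the abstract, the sketch below builds sentence vectors from word vectors with word-level attention, a document vector from sentence vectors with sentence-level attention, and reads the sentence-level attention weights as extractive relevance scores. This is a minimal PyTorch sketch under assumed design choices (bidirectional GRU encoders, additive attention, the layer sizes, and the top-k selection step are all hypothetical); it is not the authors' exact architecture or training objective.

# Minimal sketch: hierarchical attentional encoder whose sentence-level
# attention weights are interpreted as extraction scores.
# Architecture details here are illustrative assumptions, not the paper's model.
import torch
import torch.nn as nn


class AdditiveAttention(nn.Module):
    """Scores a sequence of vectors and returns their weighted average."""

    def __init__(self, dim: int):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.context = nn.Linear(dim, 1, bias=False)

    def forward(self, x: torch.Tensor):
        # x: (batch, seq_len, dim)
        scores = self.context(torch.tanh(self.proj(x)))      # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)                # attention distribution
        pooled = (weights * x).sum(dim=1)                     # (batch, dim)
        return pooled, weights.squeeze(-1)


class HierarchicalAttentionEncoder(nn.Module):
    """Word-level GRU + attention -> sentence vectors;
    sentence-level GRU + attention -> document vector.
    Sentence attention weights double as extractive scores."""

    def __init__(self, vocab_size: int, emb_dim: int = 128, hid_dim: int = 128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.word_gru = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.word_attn = AdditiveAttention(2 * hid_dim)
        self.sent_gru = nn.GRU(2 * hid_dim, hid_dim, batch_first=True, bidirectional=True)
        self.sent_attn = AdditiveAttention(2 * hid_dim)

    def forward(self, docs: torch.Tensor):
        # docs: (batch, n_sents, n_words) of token ids
        batch, n_sents, n_words = docs.shape
        words = self.embedding(docs.view(batch * n_sents, n_words))
        word_states, _ = self.word_gru(words)
        sent_vecs, _ = self.word_attn(word_states)            # (batch*n_sents, 2*hid)
        sent_vecs = sent_vecs.view(batch, n_sents, -1)
        sent_states, _ = self.sent_gru(sent_vecs)
        doc_vec, sent_weights = self.sent_attn(sent_states)   # weights: (batch, n_sents)
        return doc_vec, sent_weights


# Toy usage: rank sentences by attention weight and keep the top 3 as the summary.
model = HierarchicalAttentionEncoder(vocab_size=10_000)
fake_doc = torch.randint(1, 10_000, (1, 8, 20))               # 1 document, 8 sentences, 20 tokens each
_, scores = model(fake_doc)
summary_idx = scores[0].topk(3).indices.sort().values
print("Selected sentence indices:", summary_idx.tolist())

In this reading, no per-sentence oracle labels are needed at training time: the attention distribution over sentences is learned jointly with the document-level representation and later inspected to select sentences.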


doi: 10.21437/IberSPEECH.2022-22

Cite as: González, J.Á., Segarra, E., García-Granada, F., Sanchis, E., Hurtado, L.-F. (2022) An Attentional Extractive Summarization Framework. Proc. IberSPEECH 2022, 106-110, doi: 10.21437/IberSPEECH.2022-22

@inproceedings{gonzalez22_iberspeech,
  author={José Ángel González and Encarna Segarra and Fernando García-Granada and Emilio Sanchis and Lluis-F Hurtado},
  title={{An Attentional Extractive Summarization Framework}},
  year=2022,
  booktitle={Proc. IberSPEECH 2022},
  pages={106--110},
  doi={10.21437/IberSPEECH.2022-22}
}