We adopt an unsupervised concept-based global optimization framework for extractive meeting summarization, in which a subset of sentences is selected to cover as many important concepts as possible. We propose leveraging sentence importance weights within this model and introduce three ways to combine them with the concept-based optimization framework: using sentence weights to select sentences for concept extraction, to prune unlikely candidate summary sentences, and to jointly optimize sentence and concept weights. Experimental results on the ICSI meeting corpus show that the proposed methods significantly improve performance on both human transcripts and ASR output over the concept-based baseline, and that this unsupervised approach achieves results comparable to the supervised learning approaches reported in previous work.
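To make the setting concrete, the following is a minimal sketch (not the authors' code) of concept-based extractive summarization in which both concept weights and sentence weights contribute to the objective. The paper solves a global optimization problem (typically an ILP); here a simple greedy approximation under a word budget is shown for clarity. All data, the `lam` trade-off parameter, and the function name are hypothetical.

```python
def summarize(sentences, concept_weight, sent_weight, budget, lam=0.5):
    """Greedy sketch of concept-based summarization with sentence weights.

    sentences: list of (word_count, set_of_concepts) pairs.
    concept_weight: dict mapping each concept to its importance weight.
    sent_weight: per-sentence importance scores (hypothetical values).
    budget: maximum total word count of the summary.
    lam: trade-off between concept coverage and sentence importance
         (an illustrative joint objective, not the paper's exact model).
    """
    selected, covered, length = [], set(), 0
    remaining = set(range(len(sentences)))
    while remaining:
        best, best_gain = None, 0.0
        for i in remaining:
            words, concepts = sentences[i]
            if length + words > budget:
                continue  # sentence would exceed the length budget
            # Marginal gain: weight of newly covered concepts,
            # plus the lam-weighted importance of the sentence itself.
            gain = sum(concept_weight[c] for c in concepts - covered)
            gain += lam * sent_weight[i]
            if gain > best_gain:
                best, best_gain = i, gain
        if best is None:
            break  # no sentence fits or adds value
        selected.append(best)
        covered |= sentences[best][1]
        length += sentences[best][0]
        remaining.discard(best)
    return selected
```

With hypothetical inputs, `summarize([(5, {"a", "b"}), (4, {"b", "c"}), (6, {"a", "c", "d"})], {"a": 3, "b": 2, "c": 2, "d": 1}, [0.9, 0.4, 0.7], budget=10)` first picks sentence 2 (highest combined gain) and then sentence 1 (which adds concept "b" and fits the budget), returning `[2, 1]`. An exact ILP over the same objective would replace the greedy loop with a solver, as in the concept-based baseline the paper builds on.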
Bibliographic reference. Xie, Shasha / Favre, Benoit / Hakkani-Tür, Dilek / Liu, Yang (2009): "Leveraging sentence weights in a concept-based optimization framework for extractive meeting summarization", In INTERSPEECH-2009, 1503-1506.