ISCA Archive Interspeech 2017

Jointly Trained Sequential Labeling and Classification by Sparse Attention Neural Networks

Mingbo Ma, Kai Zhao, Liang Huang, Bing Xiang, Bowen Zhou

Sentence-level classification and sequential labeling are two fundamental tasks in language understanding. While the two tasks are usually modeled separately, in reality they are often correlated, for example in intent classification and slot filling, or in topic classification and named-entity recognition. To exploit these correlations, we propose a jointly trained model that learns the two tasks simultaneously via Long Short-Term Memory (LSTM) networks. The model predicts the sentence-level category and the word-level label sequence from the stepwise output hidden representations of the LSTM. We also introduce a novel "sparse attention" mechanism that weighs words differently based on their semantic relevance to sentence-level classification. The proposed method outperforms baseline models on the ATIS and TREC datasets.
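
For readers who want a concrete picture of the joint architecture the abstract describes, below is a minimal PyTorch sketch (not the authors' released code): a bidirectional LSTM whose per-step hidden states feed both a word-level tagger and an attention-pooled sentence classifier, with the two losses summed so both tasks share the same LSTM. The layer sizes, the bidirectional choice, and the plain softmax attention are illustrative assumptions; in particular, the softmax pooling here only stands in for the paper's sparse attention mechanism.

import torch
import torch.nn as nn

class JointTaggerClassifier(nn.Module):
    """Sketch of a jointly trained tagger + sentence classifier (hypothetical)."""
    def __init__(self, vocab_size, emb_dim, hidden_dim, num_tags, num_classes):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        self.tagger = nn.Linear(2 * hidden_dim, num_tags)       # word-level labels
        self.attn = nn.Linear(2 * hidden_dim, 1)                # per-word attention score
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, tokens):
        h, _ = self.lstm(self.embed(tokens))          # (batch, seq, 2*hidden)
        tag_logits = self.tagger(h)                   # label sequence from stepwise states
        # Plain softmax attention as a stand-in for the paper's sparse attention:
        weights = torch.softmax(self.attn(h), dim=1)  # (batch, seq, 1)
        sent_repr = (weights * h).sum(dim=1)          # attention-weighted pooling
        class_logits = self.classifier(sent_repr)     # sentence-level category
        return tag_logits, class_logits

# Joint training: sum the two cross-entropy losses over a toy batch.
model = JointTaggerClassifier(vocab_size=1000, emb_dim=64,
                              hidden_dim=128, num_tags=10, num_classes=5)
tokens = torch.randint(0, 1000, (2, 7))               # batch of 2 sentences, length 7
tag_logits, class_logits = model(tokens)
loss = (nn.functional.cross_entropy(tag_logits.transpose(1, 2),
                                    torch.randint(0, 10, (2, 7)))
        + nn.functional.cross_entropy(class_logits,
                                      torch.randint(0, 5, (2,))))
loss.backward()

Because both heads backpropagate through the shared LSTM, supervision from one task can improve the representations used by the other, which is the motivation the abstract gives for joint training.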


doi: 10.21437/Interspeech.2017-1321

Cite as: Ma, M., Zhao, K., Huang, L., Xiang, B., Zhou, B. (2017) Jointly Trained Sequential Labeling and Classification by Sparse Attention Neural Networks. Proc. Interspeech 2017, 3334-3338, doi: 10.21437/Interspeech.2017-1321

@inproceedings{ma17e_interspeech,
  author={Mingbo Ma and Kai Zhao and Liang Huang and Bing Xiang and Bowen Zhou},
  title={{Jointly Trained Sequential Labeling and Classification by Sparse Attention Neural Networks}},
  year={2017},
  booktitle={Proc. Interspeech 2017},
  pages={3334--3338},
  doi={10.21437/Interspeech.2017-1321}
}