Latent Topic Attention for Domain Classification

Peisong Huang, Peijie Huang, Wencheng Ai, Jiande Ding, Jinchuan Zhang


Attention-based bidirectional long short-term memory (BiLSTM) models have recently shown promising results in text classification tasks. However, when the amount of training data is limited, or the distribution of the test data differs substantially from that of the training data, some potentially informative words may be hard to capture during training. In this work, we propose a new method for learning an attention mechanism for domain classification. Unlike past attention mechanisms, which are guided only by the domain tags of the training data, we explore using the latent topics in the data set to learn a topic attention and employ it in a BiLSTM. Experiments on the SMP-ECDT benchmark corpus show that the proposed latent topic attention mechanism outperforms state-of-the-art soft and hard attention mechanisms in domain classification. Moreover, experimental results show that the proposed method can be trained with additional unlabeled data, further improving domain classification performance.
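
The abstract does not spell out the attention formulation, so the following is only a minimal sketch in PyTorch of one plausible reading: latent topic proportions (e.g., inferred by an LDA model fit on labeled plus unlabeled utterances) are projected into the BiLSTM state space and used as the attention query. The class name TopicAttentionBiLSTM, the projection design, and all layer sizes are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TopicAttentionBiLSTM(nn.Module):
    """Sketch: BiLSTM whose attention query comes from latent topic proportions."""
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_topics, num_domains):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              bidirectional=True, batch_first=True)
        # Assumed design: map the topic distribution into the BiLSTM state
        # space so it can score each time step as an attention query.
        self.topic_proj = nn.Linear(num_topics, 2 * hidden_dim)
        self.classifier = nn.Linear(2 * hidden_dim, num_domains)

    def forward(self, tokens, topic_dist):
        # tokens: (batch, seq_len) word ids
        # topic_dist: (batch, num_topics) topic proportions, e.g., from LDA
        h, _ = self.bilstm(self.embed(tokens))               # (batch, seq_len, 2*hidden)
        query = self.topic_proj(topic_dist)                  # (batch, 2*hidden)
        scores = torch.bmm(h, query.unsqueeze(2)).squeeze(2) # (batch, seq_len)
        alpha = F.softmax(scores, dim=1)                     # topic attention weights
        context = torch.bmm(alpha.unsqueeze(1), h).squeeze(1)  # weighted sum of states
        return self.classifier(context)                      # domain logits

Because the topic model is unsupervised, topic_dist can be estimated for unlabeled utterances as well, which is consistent with the abstract's claim that additional unlabeled data can further improve performance.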


 DOI: 10.21437/Interspeech.2019-2228

Cite as: Huang, P., Huang, P., Ai, W., Ding, J., Zhang, J. (2019) Latent Topic Attention for Domain Classification. Proc. Interspeech 2019, 1338-1342, DOI: 10.21437/Interspeech.2019-2228.


@inproceedings{Huang2019,
  author={Peisong Huang and Peijie Huang and Wencheng Ai and Jiande Ding and Jinchuan Zhang},
  title={{Latent Topic Attention for Domain Classification}},
  year=2019,
  booktitle={Proc. Interspeech 2019},
  pages={1338--1342},
  doi={10.21437/Interspeech.2019-2228},
  url={http://dx.doi.org/10.21437/Interspeech.2019-2228}
}