ISCA Archive Interspeech 2021

A Context-Aware Hierarchical BERT Fusion Network for Multi-Turn Dialog Act Detection

Ting-Wei Wu, Ruolin Su, Biing-Hwang Juang

The success of interactive dialog systems is usually associated with the quality of the spoken language understanding (SLU) task, which mainly identifies the corresponding dialog acts and slot values in each turn. By treating utterances in isolation, most SLU systems often overlook the semantic context in which a dialog act is expected. The act dependency between turns is nontrivial and yet critical to the identification of the correct semantic representations. Previous works with limited context awareness have exposed the inadequacy of dealing with the complexity of multi-pronged user intents, which are subject to spontaneous change during turn transitions. In this work, we propose to enhance SLU in multi-turn dialogs, employing a context-aware hierarchical BERT fusion network (CaBERT-SLU) to not only discern context information within a dialog but also jointly identify multiple dialog acts and slots in each utterance. Experimental results show that our approach reaches new state-of-the-art (SOTA) performance on two complicated multi-turn dialog datasets, with considerable improvements over previous methods, which only consider single utterances for multiple intent detection and slot filling.
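The hierarchical idea described in the abstract, encoding each turn on its own, then passing turn representations through a dialog-level context encoder before joint act and slot prediction, can be illustrated with a minimal NumPy sketch. This is not the authors' CaBERT-SLU implementation: the mean-pooling stands in for a BERT turn encoder, the simple recurrence stands in for the dialog-level encoder, and all parameters and dimensions here are hypothetical and untrained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions; the actual model uses BERT-sized vectors (e.g. 768-dim).
d_tok, d_ctx, n_acts, n_slots = 8, 8, 4, 5

def encode_turn(token_embs):
    """Turn-level encoding: mean-pool token embeddings.
    (A stand-in for a BERT utterance representation.)"""
    return token_embs.mean(axis=0)

def encode_dialog(turn_vecs, W_ctx):
    """Dialog-level context: a simple recurrence over turn vectors,
    a stand-in for the paper's dialog-level context encoder."""
    h = np.zeros(d_ctx)
    ctx = []
    for v in turn_vecs:
        h = np.tanh(W_ctx @ np.concatenate([v, h]))
        ctx.append(h)
    return np.stack(ctx)

# Hypothetical parameters (random, untrained).
W_ctx = rng.normal(size=(d_ctx, d_tok + d_ctx))
W_act = rng.normal(size=(n_acts, d_ctx))
W_slot = rng.normal(size=(n_slots, d_tok + d_ctx))

# A toy 3-turn dialog with 5 tokens per turn (random embeddings).
dialog = [rng.normal(size=(5, d_tok)) for _ in range(3)]

turn_vecs = [encode_turn(t) for t in dialog]
ctx = encode_dialog(turn_vecs, W_ctx)           # (3 turns, d_ctx)

# Joint prediction: multi-label act logits per turn, and slot logits
# per token, where each token is fused with its turn's context vector.
act_logits = ctx @ W_act.T                      # (3, n_acts)
slot_logits = [
    np.concatenate([toks, np.tile(c, (len(toks), 1))], axis=1) @ W_slot.T
    for toks, c in zip(dialog, ctx)
]                                               # each (5, n_slots)

print(act_logits.shape)      # (3, 4)
print(slot_logits[0].shape)  # (5, 5)
```

The point of the sketch is the information flow: slot prediction for a token sees both the token itself and the dialog-level context of its turn, so act dependencies across turns can influence per-token decisions.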


doi: 10.21437/Interspeech.2021-95

Cite as: Wu, T.-W., Su, R., Juang, B.-H. (2021) A Context-Aware Hierarchical BERT Fusion Network for Multi-Turn Dialog Act Detection. Proc. Interspeech 2021, 1239-1243, doi: 10.21437/Interspeech.2021-95

@inproceedings{wu21d_interspeech,
  author={Ting-Wei Wu and Ruolin Su and Biing-Hwang Juang},
  title={{A Context-Aware Hierarchical BERT Fusion Network for Multi-Turn Dialog Act Detection}},
  year={2021},
  booktitle={Proc. Interspeech 2021},
  pages={1239--1243},
  doi={10.21437/Interspeech.2021-95}
}