Personalized Natural Language Understanding

Xiaohu Liu, Ruhi Sarikaya, Liang Zhao, Yong Ni, Yi-Cheng Pan


Natural language understanding (NLU) is a critical component of dialog systems. Its aim is to extract semantic meaning from typed text input or from the output of the speech recognizer. Traditionally, NLU systems are built in a user-independent fashion, where the system behavior does not adapt to the user. However, personal information can be very useful for language understanding tasks if it is made available to the system. With personal digital assistant (PDA) systems, many forms of personal data are readily available for NLU systems, allowing the models and the overall system to be personalized. In this paper, we propose a method to personalize language understanding models by making use of personal data while respecting and protecting user privacy. We report experiments on two domains for intent classification and slot tagging, where we achieve significant accuracy improvements over baseline models trained in a user-independent manner.
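To make the idea concrete, the sketch below shows one plausible way personal data could feed an NLU feature extractor; it is an illustrative assumption, not the authors' actual method, and the feature names and helper function are hypothetical.

```python
# Hedged sketch: augmenting standard n-gram features for intent
# classification / slot tagging with signals derived from a user's
# personal data (e.g., contact list, installed app names). This is an
# illustrative assumption about how such personalization could work,
# not the method described in the paper.

def extract_features(query, contacts=(), apps=()):
    """Return a feature dict for a query, with personal signals added."""
    tokens = query.lower().split()
    # Standard user-independent features: unigrams of the query.
    feats = {f"unigram={t}": 1 for t in tokens}
    # Personal features: fire when a token matches the user's own data.
    contact_set = {c.lower() for c in contacts}
    app_set = {a.lower() for a in apps}
    for t in tokens:
        if t in contact_set:
            feats["matches_contact"] = 1
        if t in app_set:
            feats["matches_app"] = 1
    return feats

# Example: "alice" only fires the contact feature for a user whose
# contact list actually contains Alice.
feats = extract_features("call alice now", contacts=["Alice", "Bob"])
```

A downstream model (e.g., a log-linear intent classifier or a CRF slot tagger) would consume such features; the personal features let the same query be interpreted differently per user without exposing the raw personal data outside the feature extractor.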


DOI: 10.21437/Interspeech.2016-1172

Cite as

Liu, X., Sarikaya, R., Zhao, L., Ni, Y., Pan, Y. (2016) Personalized Natural Language Understanding. Proc. Interspeech 2016, 1146-1150.

Bibtex
@inproceedings{Liu+2016,
  author={Xiaohu Liu and Ruhi Sarikaya and Liang Zhao and Yong Ni and Yi-Cheng Pan},
  title={Personalized Natural Language Understanding},
  year={2016},
  booktitle={Interspeech 2016},
  doi={10.21437/Interspeech.2016-1172},
  url={http://dx.doi.org/10.21437/Interspeech.2016-1172},
  pages={1146--1150}
}