Neural Transition Systems for Modeling Hierarchical Semantic Representations

Riyaz Bhat, John Chen, Rashmi Prasad, Srinivas Bangalore


While virtual agents are becoming ubiquitous in our daily lives, their functionality is limited to simple commands involving a single intent and an unstructured set of entities. Typically, in such systems, the natural language understanding (NLU) component uses a sequence tagging model to extract a flat meaning representation. However, to support complex user requests that combine multiple intents with their associated entities, such as those in a product ordering domain, a structured semantic representation is necessary. In this paper, we present hierarchical semantic representations for product ordering in the food services domain, along with two NLU models that produce such representations efficiently using deep neural networks. The models are based on transition-based algorithms, which have proven effective and scalable for multiple NLP tasks such as syntactic parsing and slot filling. The first model uses a multitasking architecture containing multiple transition systems with tree constraints to model the hierarchical annotations, while the second model treats the task as a constituency parsing problem by mapping the target domain annotations to a constituency tree. We show that both the multi-task and the constituency-based transition systems achieve competitive results, and in some cases improvements, over sequential models, demonstrating their effectiveness in modeling hierarchical structure.
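To make the constituency-style approach concrete, the sketch below shows a minimal shift-reduce transition system that derives a hierarchical intent/entity tree from a flat token sequence and an action sequence. It is only an illustration of the general technique the abstract refers to, not the paper's actual system; the action names (OPEN-X, SHIFT, CLOSE) and the labels ORDER_ITEM and MODIFIER are hypothetical placeholders for a food-ordering domain.

```python
# Illustrative shift-reduce transition system for hierarchical NLU output.
# NOTE: a minimal sketch, not the authors' implementation; the action
# inventory and domain labels (ORDER_ITEM, MODIFIER) are assumptions.

class Node:
    """A labeled tree node whose children are tokens or sub-nodes."""
    def __init__(self, label):
        self.label = label
        self.children = []

    def __repr__(self):
        inner = " ".join(repr(c) if isinstance(c, Node) else c
                         for c in self.children)
        return f"({self.label} {inner})"

def parse(tokens, actions):
    """Apply a transition sequence to a token buffer.

    OPEN-X  pushes a new node labeled X onto the stack,
    SHIFT   moves the next buffer token under the stack top,
    CLOSE   pops the stack top and attaches it to its parent.
    """
    buf = list(tokens)
    stack = [Node("ROOT")]
    for act in actions:
        if act == "SHIFT":
            stack[-1].children.append(buf.pop(0))
        elif act == "CLOSE":
            node = stack.pop()
            stack[-1].children.append(node)
        elif act.startswith("OPEN-"):
            stack.append(Node(act[len("OPEN-"):]))
        else:
            raise ValueError(f"unknown action: {act}")
    assert not buf and len(stack) == 1, "actions must consume all tokens"
    return stack[0]

tokens = "a burger with no onions".split()
actions = ["OPEN-ORDER_ITEM", "SHIFT", "SHIFT",
           "OPEN-MODIFIER", "SHIFT", "SHIFT", "SHIFT", "CLOSE",
           "CLOSE"]
tree = parse(tokens, actions)
print(tree)  # (ROOT (ORDER_ITEM a burger (MODIFIER with no onions)))
```

In a neural model, the action at each step would be predicted by a classifier over the current stack/buffer state rather than given as input; tree constraints amount to masking actions that would leave the stack or buffer in an invalid state (e.g. CLOSE on an empty stack).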


DOI: 10.21437/Interspeech.2019-3075

Cite as: Bhat, R., Chen, J., Prasad, R., Bangalore, S. (2019) Neural Transition Systems for Modeling Hierarchical Semantic Representations. Proc. Interspeech 2019, 1173-1177, DOI: 10.21437/Interspeech.2019-3075.


@inproceedings{Bhat2019,
  author={Riyaz Bhat and John Chen and Rashmi Prasad and Srinivas Bangalore},
  title={{Neural Transition Systems for Modeling Hierarchical Semantic Representations}},
  year={2019},
  booktitle={Proc. Interspeech 2019},
  pages={1173--1177},
  doi={10.21437/Interspeech.2019-3075},
  url={http://dx.doi.org/10.21437/Interspeech.2019-3075}
}