Slot Filling with Delexicalized Sentence Generation

Youhyun Shin, Kang Min Yoo, Sang-goo Lee


We introduce a novel approach that jointly learns slot filling and delexicalized sentence generation. Recent work has treated slot filling as a sequence labeling problem using an attention-based encoder-decoder framework. We further improve this framework by training the model to generate delexicalized sentences, in which words corresponding to slot values are replaced with their slot labels. Slot filling with delexicalization outperforms models trained with the single objective of filling slots. The proposed method achieves state-of-the-art slot filling performance on the ATIS dataset. We experiment with different variants of our model and find that delexicalization encourages generalization by sharing weights among words with the same labels and helps the model further leverage certain linguistic features.
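To make the notion of delexicalization concrete, here is a minimal sketch of how a sentence with slot annotations becomes a delexicalized generation target. The BIO-style labels and the example utterance are illustrative assumptions (BIO tagging is the common convention on ATIS), not taken from the paper itself.

```python
def delexicalize(tokens, labels):
    """Replace slot-value tokens with their slot labels; keep 'O' tokens.

    Assumes BIO-style labels ("B-slot", "I-slot", "O"); each multi-token
    slot value collapses to a single slot-label token.
    """
    out = []
    for tok, lab in zip(tokens, labels):
        if lab == "O":
            out.append(tok)               # ordinary word: keep as-is
        elif lab.startswith("B-"):
            out.append(lab[2:])           # emit the slot label once
        # "I-" tokens continue a slot already emitted, so they are skipped
    return out

# Hypothetical ATIS-like utterance and slot annotation
tokens = ["flights", "from", "new", "york", "to", "los", "angeles"]
labels = ["O", "O", "B-fromloc", "I-fromloc", "O", "B-toloc", "I-toloc"]

print(delexicalize(tokens, labels))
# ['flights', 'from', 'fromloc', 'to', 'toloc']
```

Because every city name maps to the same `fromloc` or `toloc` token, the decoder's target vocabulary shrinks and weights are shared across all words carrying the same label, which is the generalization effect the abstract describes.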


 DOI: 10.21437/Interspeech.2018-1808

Cite as: Shin, Y., Yoo, K.M., Lee, S. (2018) Slot Filling with Delexicalized Sentence Generation. Proc. Interspeech 2018, 2082-2086, DOI: 10.21437/Interspeech.2018-1808.


@inproceedings{Shin2018,
  author={Youhyun Shin and Kang Min Yoo and Sang-goo Lee},
  title={Slot Filling with Delexicalized Sentence Generation},
  year={2018},
  booktitle={Proc. Interspeech 2018},
  pages={2082--2086},
  doi={10.21437/Interspeech.2018-1808},
  url={http://dx.doi.org/10.21437/Interspeech.2018-1808}
}