ISCA Archive Interspeech 2013

Hierarchical Pitman-Yor and Dirichlet process for language model

Jen-Tzung Chien, Ying-Lan Chang

This paper presents a nonparametric interpretation of modern language models based on the hierarchical Pitman-Yor and Dirichlet (HPYD) process. We propose the HPYD language model (HPYD-LM), which flexibly conducts backoff smoothing and topic clustering through Bayesian nonparametric learning. The nonparametric priors of backoff n-grams and latent topics are tightly coupled in a compound process. A hybrid probability measure is drawn to build the smoothed topic-based LM. The model structure is automatically determined from training data. A new Chinese restaurant scenario is proposed to implement the HPYD-LM via Gibbs sampling. This process reflects the power-law property and extracts semantic topics from natural language. The superiority of the HPYD-LM over related LMs is demonstrated by experiments on different corpora in terms of perplexity and word error rate.
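
To make the power-law property mentioned above concrete, the following is a minimal Python sketch of a plain (non-hierarchical) Pitman-Yor Chinese restaurant process. It is illustrative only: the function name and parameters are hypothetical, and it is not the authors' HPYD sampler, which couples hierarchical Pitman-Yor backoff with Dirichlet-process topic clustering. With a positive discount, the number of occupied tables (distinct word types) grows roughly as a power of the number of customers (word tokens), which is the behavior the HPYD-LM exploits for smoothing.

import random

def pitman_yor_crp(n_customers, discount=0.5, concentration=1.0, seed=0):
    """Simulate seating under a Pitman-Yor Chinese restaurant process.

    Customer n+1 joins existing table k with probability proportional to
    (count_k - discount) and opens a new table with probability
    proportional to (concentration + discount * num_tables).
    """
    rng = random.Random(seed)
    table_counts = []            # number of customers at each occupied table
    assignments = []             # table index chosen by each customer
    for n in range(n_customers):
        k_tables = len(table_counts)
        # Unnormalized weights: one per existing table, plus one for a new table.
        weights = [c - discount for c in table_counts]
        weights.append(concentration + discount * k_tables)
        total = n + concentration        # weights sum to n + concentration
        r = rng.uniform(0.0, total)
        cum = 0.0
        for k, w in enumerate(weights):
            cum += w
            if r <= cum:
                break
        if k == k_tables:
            table_counts.append(1)       # open a new table
        else:
            table_counts[k] += 1         # join existing table k
        assignments.append(k)
    return table_counts, assignments

if __name__ == "__main__":
    counts, _ = pitman_yor_crp(10000, discount=0.8, concentration=1.0)
    print("customers:", sum(counts), "tables:", len(counts))
    # Table sizes are heavy-tailed: a few large tables, many singletons.
    print("largest tables:", sorted(counts, reverse=True)[:5])

Running the sketch with a larger discount yields more occupied tables and a heavier tail of singleton tables, mirroring the type-token statistics of natural language text.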


doi: 10.21437/Interspeech.2013-521

Cite as: Chien, J.-T., Chang, Y.-L. (2013) Hierarchical Pitman-Yor and Dirichlet process for language model. Proc. Interspeech 2013, 2212-2216, doi: 10.21437/Interspeech.2013-521

@inproceedings{chien13_interspeech,
  author={Jen-Tzung Chien and Ying-Lan Chang},
  title={{Hierarchical Pitman-Yor and Dirichlet process for language model}},
  year=2013,
  booktitle={Proc. Interspeech 2013},
  pages={2212--2216},
  doi={10.21437/Interspeech.2013-521}
}