Interspeech'2005 - Eurospeech
The Hidden Vector State (HVS) model extends the basic Hidden Markov Model (HMM) by encoding each state as a vector of stack states with restricted stack operations. The model uses a right-branching stack automaton to assign valid stochastic parses to a word sequence, from which the language model probability can be estimated. The model is completely data-driven and is able to learn classes from the data that reflect the hierarchical structures found in natural language. This paper describes the design and implementation of the HVS language model, focusing on the practical issues of initialisation and training using Baum-Welch re-estimation whilst accommodating a large and dynamic state space. Results of experiments conducted using the ATIS corpus show that the HVS language model reduces test set perplexity compared to standard class-based language models.
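As a rough illustration of the restricted stack operations the abstract mentions, the sketch below models an HVS state as a fixed-depth vector of stack labels, where each transition pops some number of items and pushes at most one new label. This is a hypothetical sketch, not the authors' implementation; the `successor` function, the label names, and the depth limit are all illustrative assumptions.

```python
from typing import Optional, Tuple

# Hypothetical HVS state: a vector of stack labels, top of stack first,
# e.g. ("CITY", "FROM", "FLIGHT", "SS") for a sentence-start stack "SS".
State = Tuple[str, ...]

def successor(state: State, pop_n: int,
              push_label: Optional[str], max_depth: int = 4) -> State:
    """Apply a restricted HVS-style stack update: pop `pop_n` items,
    then optionally push one new label, truncated to `max_depth`.
    (Illustrative only; the paper's exact operations may differ.)"""
    if pop_n > len(state):
        raise ValueError("cannot pop more items than the stack holds")
    new_state = state[pop_n:]
    if push_label is not None:
        new_state = (push_label,) + new_state
    return new_state[:max_depth]

# Example walk: leave a CITY/FROM sub-structure and open a TO concept.
s = ("CITY", "FROM", "FLIGHT", "SS")
s = successor(s, pop_n=2, push_label="TO")    # ("TO", "FLIGHT", "SS")
s = successor(s, pop_n=0, push_label="CITY")  # ("CITY", "TO", "FLIGHT", "SS")
```

Because each transition is limited to a bounded pop count and a single push, the set of reachable state vectors stays tractable, which is what makes Baum-Welch re-estimation over the (large but dynamic) state space feasible.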
Bibliographic reference: Seneviratne, Vidura / Young, Steve (2005): "The hidden vector state language model", in INTERSPEECH-2005, 9-12.