FACTS: A Hierarchical Task-based Control Model of Speech Incorporating Sensory Feedback

Benjamin Parrell, Vikram Ramanarayanan, Srikantan Nagarajan, John Houde


We present a computational model of speech motor control that integrates vocal tract state prediction with sensory feedback. This hierarchical model, called FACTS, incorporates both a high-level and a low-level controller. The high-level controller orchestrates linguistically relevant speech tasks, which are represented as desired constrictions along the vocal tract (e.g., closure of the lips). Its output is passed to a low-level controller that issues motor commands at the level of the speech articulators to accomplish the desired constrictions. To generate these articulatory motor commands, the low-level controller relies on an estimate of the current state of the vocal tract. This estimate combines internal predictions about the consequences of issued motor commands with auditory and somatosensory feedback from the vocal tract, using an Unscented Kalman Filter-based state estimation method. FACTS replicates important aspects of human speech behavior, in that it reproduces: (i) stable speech behavior in the presence of noisy motor and sensory systems, (ii) partial acoustic compensation for auditory feedback perturbations, (iii) complete compensation for mechanical perturbations only when they interfere with current production goals, and (iv) the observed relationship between sensory acuity and the response to sensory perturbations.
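The predict-and-correct loop at the heart of the state estimator can be illustrated with a deliberately simplified sketch. The paper uses an Unscented Kalman Filter over the full vocal tract state; the scalar linear Kalman filter below, with made-up dynamics and noise parameters, is only meant to show how an internal forward-model prediction is blended with noisy sensory feedback according to their relative reliability. None of the names or values here come from FACTS itself.

```python
import random

def kalman_step(x_est, p_est, u, z, a=1.0, b=1.0, q=0.01, r=0.1):
    """One predict-correct cycle of a scalar linear Kalman filter.

    x_est, p_est : previous state estimate and its variance
    u            : motor command issued this step
    z            : noisy sensory observation of the resulting state
    a, b, q, r   : illustrative plant gain, command gain,
                   process noise, and observation noise (assumptions)
    """
    # Predict: the forward model anticipates the command's consequence.
    x_pred = a * x_est + b * u
    p_pred = a * a * p_est + q
    # Correct: the Kalman gain weights prediction against feedback.
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z - x_pred)   # blend in the sensory evidence
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

# Toy closed loop: drive the state toward a constriction target of 1.0
# using a proportional controller acting on the *estimated* state.
random.seed(0)
target = 1.0
x_true, x_est, p_est = 0.0, 0.0, 1.0
for _ in range(100):
    u = 0.2 * (target - x_est)             # command from the estimate
    x_true = x_true + u                    # stand-in for the real plant
    z = x_true + random.gauss(0.0, 0.1)    # noisy sensory feedback
    x_est, p_est = kalman_step(x_est, p_est, u, z)

print(x_est, p_est)
```

After enough steps the estimate settles near the target and its variance shrinks well below its initial value, which is the qualitative behavior the abstract describes: stable control despite noisy motor and sensory channels.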


DOI: 10.21437/Interspeech.2018-2087

Cite as: Parrell, B., Ramanarayanan, V., Nagarajan, S., Houde, J. (2018) FACTS: A Hierarchical Task-based Control Model of Speech Incorporating Sensory Feedback. Proc. Interspeech 2018, 1497-1501, DOI: 10.21437/Interspeech.2018-2087.


@inproceedings{Parrell2018,
  author={Benjamin Parrell and Vikram Ramanarayanan and Srikantan Nagarajan and John Houde},
  title={FACTS: A Hierarchical Task-based Control Model of Speech Incorporating Sensory Feedback},
  year=2018,
  booktitle={Proc. Interspeech 2018},
  pages={1497--1501},
  doi={10.21437/Interspeech.2018-2087},
  url={http://dx.doi.org/10.21437/Interspeech.2018-2087}
}