This paper presents an animated 2D articulation model of the tongue and the lips for biofeedback applications. The model is controlled by real-time optopalatographic measurements of the positions of the upper lip and the tongue in the anterior oral cavity. The measurement system improves on a previous prototype with increased spatial resolution and enhanced close-range behavior. The posterior part of the tongue was added to the model by linear prediction; the prediction coefficients were determined and evaluated using a corpus of vocal tract traces of 25 sustained phonemes. The model renders tongue motion and lip opening in a physiologically plausible way during real-time articulation.
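The abstract's linear prediction of the posterior tongue can be illustrated by a minimal least-squares sketch. This is a hypothetical reconstruction, not the authors' implementation: the channel counts, contour dimensions, and synthetic data below are assumptions for illustration only; the paper's actual corpus consists of vocal tract traces of 25 sustained phonemes.

```python
import numpy as np

# Hypothetical sketch of linear prediction of posterior tongue contour
# points from anterior optopalatographic measurements. All dimensions
# and data are illustrative assumptions, not taken from the paper.

rng = np.random.default_rng(0)

n_phonemes = 25     # corpus size: 25 sustained phonemes (from the abstract)
n_anterior = 5      # anterior measurement channels (assumed)
n_posterior = 8     # posterior contour points to predict (assumed)

# Synthetic training corpus: anterior measurements X, posterior targets Y
# generated from a known affine map so the fit can be checked.
X = rng.normal(size=(n_phonemes, n_anterior))
true_W = rng.normal(size=(n_anterior + 1, n_posterior))
X_aug = np.hstack([X, np.ones((n_phonemes, 1))])   # append bias term
Y = X_aug @ true_W

# Determine the prediction coefficients by ordinary least squares.
W, *_ = np.linalg.lstsq(X_aug, Y, rcond=None)

# Real-time use: map a new anterior measurement vector to the
# predicted posterior contour points.
x_new = rng.normal(size=n_anterior)
y_pred = np.concatenate([x_new, [1.0]]) @ W
print(y_pred.shape)   # (8,)
```

Because the synthetic targets are noise-free and the design matrix has full column rank, the least-squares solution recovers the generating coefficients exactly; with real articulatory data the fit would of course be approximate.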
Bibliographic reference. Preuß, Simon / Neuschaefer-Rube, Christiane / Birkholz, Peter (2013): "Real-time control of a 2d animation model of the vocal tract using optopalatography", In INTERSPEECH-2013, 997-1001.