ISCA Archive Interspeech 2009

Are real tongue movements easier to speech read than synthesized?

Olov Engwall, Preben Wik

Speech perception studies with augmented reality displays in talking heads have shown that tongue reading abilities are weak initially, but that subjects become able to extract some information from intra-oral visualizations after a short training session. In this study, we investigate how the nature of the tongue movements influences the results by comparing synthetic, rule-based movements with actual, measured ones. The subjects were significantly better at perceiving sentences accompanied by real movements, indicating that the current coarticulation model developed for facial movements is not optimal for the tongue.


doi: 10.21437/Interspeech.2009-63

Cite as: Engwall, O., Wik, P. (2009) Are real tongue movements easier to speech read than synthesized? Proc. Interspeech 2009, 824-827, doi: 10.21437/Interspeech.2009-63

@inproceedings{engwall09_interspeech,
  author={Olov Engwall and Preben Wik},
  title={{Are real tongue movements easier to speech read than synthesized?}},
  year=2009,
  booktitle={Proc. Interspeech 2009},
  pages={824--827},
  doi={10.21437/Interspeech.2009-63}
}