ISCA Archive Interspeech 2008

Can visualization of internal articulators support speech perception?

Preben Wik, Olov Engwall

This paper describes the contribution to speech perception given by animations of intra-oral articulations. 18 subjects were asked to identify the words in acoustically degraded sentences in three different presentation modes: acoustic signal only, audiovisual with a front view of a synthetic face, and audiovisual with both a front face view and a side view, in which tongue movements were made visible by rendering parts of the cheek transparent. The augmented-reality side view did not help subjects perform better overall than the front view alone, but it appears to have been beneficial for the perception of palatal plosives, liquids and rhotics, especially in clusters. The results indicate that intra-oral animations cannot be expected to support speech perception in general, but that information on some articulatory features can be extracted. Animations of tongue movements therefore have more potential for use in computer-assisted pronunciation and perception training than as a communication aid for the hearing-impaired.


doi: 10.21437/Interspeech.2008-651

Cite as: Wik, P., Engwall, O. (2008) Can visualization of internal articulators support speech perception? Proc. Interspeech 2008, 2627-2630, doi: 10.21437/Interspeech.2008-651

@inproceedings{wik08_interspeech,
  author={Preben Wik and Olov Engwall},
  title={{Can visualization of internal articulators support speech perception?}},
  year=2008,
  booktitle={Proc. Interspeech 2008},
  pages={2627--2630},
  doi={10.21437/Interspeech.2008-651}
}