Auditory-Visual Speech Processing (AVSP) 2013

Annecy, France
August 29 - September 1, 2013

The Touch of Your Lips: Haptic Information Speeds Up Auditory Speech Processing

Avril Treille, Camille Cordeboeuf, Coriandre Vilain, Marc Sato

GIPSA-lab, Department of Speech & Cognition, CNRS & Grenoble University, Grenoble, France

The human ability to follow speech gestures through the visual modality is a core component of speech perception. Remarkably, speech can be perceived not only by the ear and by the eye but also by the hand, with speech gestures felt from manual tactile contact with the speaker’s face. In the present study, early cross-modal interactions were investigated by comparing early auditory evoked potentials during auditory, audio-visual and audio-haptic speech perception in natural dyadic interactions between a listener and a speaker. Although participants had no prior experience with audio-haptic speech perception, shortened latencies of auditory evoked potentials were observed in both the audio-visual and audio-haptic modalities compared to the auditory-only modality. These results demonstrate early cross-modal interactions during face-to-face and hand-to-face speech perception and highlight a predictive role of visual and haptic information in auditory speech processing during dyadic interactions.

Index Terms: audio-visual speech perception, audio-haptic speech perception, EEG.

Full Paper

Bibliographic reference. Treille, Avril / Cordeboeuf, Camille / Vilain, Coriandre / Sato, Marc (2013): "The touch of your lips: haptic information speeds up auditory speech processing", In AVSP-2013, 141-146.