Event-Related Potentials Associated with Somatosensory Effect in Audio-Visual Speech Perception

Takayuki Ito, Hiroki Ohashi, Eva Montas, Vincent L. Gracco


Speech perception often involves multisensory processing. Although previous studies have demonstrated visual [1, 2] and somatosensory [3, 4] interactions with auditory processing, it is not clear whether somatosensory information can contribute to audio-visual speech perception. This study explored the neural consequences of somatosensory interactions in audio-visual speech processing. We assessed whether somatosensory orofacial stimulation influenced event-related potentials (ERPs) in response to an audio-visual speech illusion (the McGurk effect [1]). ERPs were recorded from 64 scalp sites in response to audio-visual speech stimulation and somatosensory stimulation. In the audio-visual condition, an auditory stimulus /ba/ was synchronized with a video of congruent facial motion (the production of /ba/) or incongruent facial motion (the production of /da/: the McGurk condition). These two audio-visual stimuli were presented in random order, with and without somatosensory stimulation associated with facial skin deformation. We found ERP differences associated with the McGurk effect in the presence of somatosensory stimulation: ERPs for the McGurk condition reliably diverged around 280 ms after auditory onset. The results demonstrate that somatosensory inputs alter the cortical potentials of audio-visual processing and suggest that somatosensory information encoding facial motion also influences speech processing.


DOI: 10.21437/Interspeech.2017-139

Cite as: Ito, T., Ohashi, H., Montas, E., Gracco, V.L. (2017) Event-Related Potentials Associated with Somatosensory Effect in Audio-Visual Speech Perception. Proc. Interspeech 2017, 669-673, DOI: 10.21437/Interspeech.2017-139.


@inproceedings{Ito2017,
  author={Takayuki Ito and Hiroki Ohashi and Eva Montas and Vincent L. Gracco},
  title={Event-Related Potentials Associated with Somatosensory Effect in Audio-Visual Speech Perception},
  year={2017},
  booktitle={Proc. Interspeech 2017},
  pages={669--673},
  doi={10.21437/Interspeech.2017-139},
  url={http://dx.doi.org/10.21437/Interspeech.2017-139}
}