Visual speech information, such as a speaker’s mouth and eyebrow movements, enhances speech perception. Evidence for this perceptual benefit has come mainly from behavioural studies or from neurophysiological studies using event-related potentials (ERPs). ERP studies, however, are limited to repetitive, short stimuli that are not representative of natural speech. An approach that examines cortical tracking of the speech envelope instead allows the use of continuous speech stimuli, and it has recently been employed to demonstrate that adults’ cortical tracking of the speech envelope is augmented when synchronous visual speech information is provided [1]. To date, no study has investigated whether children, like adults, show stronger envelope tracking when congruent visual speech information is available. This study addresses that question by measuring four-year-olds’ cortical tracking of continuous auditory-visual speech using electroencephalography (EEG). Cortical tracking was quantified by means of ridge regression models that estimate the linear mapping from the speech signal to the EEG signal and vice versa. Stimulus reconstruction was found to be stronger for auditory-only and auditory-visual speech than for visual-only speech.
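The stimulus-reconstruction (backward) model described above can be sketched as time-lagged ridge regression: the speech envelope at each time point is predicted from a window of multichannel EEG, and tracking strength is the correlation between the actual and reconstructed envelope. The following is a minimal illustrative sketch on synthetic data; the sampling rate, lag range, regularisation parameter, and all variable names are assumptions for illustration, not values from the paper.

```python
import numpy as np

# Synthetic data standing in for a real recording. The "EEG" channels
# are noisy shifted copies of the envelope, so reconstruction is possible.
rng = np.random.default_rng(0)
fs = 64                                   # assumed post-downsampling rate (Hz)
n_samples, n_channels = fs * 60, 32       # 60 s of data, 32 channels
envelope = rng.standard_normal(n_samples)             # speech envelope
eeg = np.stack([np.roll(envelope, c) for c in range(n_channels)], axis=1)
eeg += 0.5 * rng.standard_normal(eeg.shape)           # add sensor noise

# Backward model: predict envelope[t] from EEG at lags 0-250 ms after t.
lags = range(int(0.25 * fs))
X = np.column_stack([np.roll(eeg, -lag, axis=0) for lag in lags])

# Ridge regression: w = (X'X + lambda*I)^{-1} X'y, lambda chosen ad hoc here
# (in practice it is tuned by cross-validation).
lam = 1e2
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ envelope)
reconstruction = X @ w

# Reconstruction accuracy: Pearson correlation between actual and
# reconstructed envelopes, the usual cortical-tracking measure.
r = np.corrcoef(envelope, reconstruction)[0, 1]
```

In a real analysis the decoder would be trained and evaluated on separate folds of the recording, and the forward (encoding) direction mentioned in the abstract simply swaps the roles of stimulus and EEG in the same equation.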
Cite as: Tan, S.H.J., Crosse, M.J., Di Liberto, G.M., Burnham, D. (2019) Four-Year-Olds’ Cortical Tracking to Continuous Auditory-Visual Speech. Proc. The 15th International Conference on Auditory-Visual Speech Processing, 53-56, doi: 10.21437/AVSP.2019-11
@inproceedings{tan19b_avsp,
  author={S. H. Jessica Tan and Michael J. Crosse and Giovanni M. {Di Liberto} and Denis Burnham},
  title={{Four-Year-Olds’ Cortical Tracking to Continuous Auditory-Visual Speech}},
  year=2019,
  booktitle={Proc. The 15th International Conference on Auditory-Visual Speech Processing},
  pages={53--56},
  doi={10.21437/AVSP.2019-11}
}