Auditory-Visual Speech Processing (AVSP) 2013

Annecy, France
August 29 - September 1, 2013

Audiovisual Speech Perception in Children with Autism Spectrum Disorders and Typical Controls

Julia R. Irwin (1,2), Lawrence Brancazio (1,2)

(1) Haskins Laboratories; (2) Southern Connecticut State University;
New Haven, CT, USA

This paper presents data comparing children with autism spectrum disorders (ASD) to children with typical development (TD) on auditory, visual, and audiovisual speech perception. Using eye-tracking methodology, we assessed group differences in visual influence on heard speech and in patterns of gaze to speaking faces. There were no group differences in perception of the auditory syllables /ma/ and /na/ in clear listening conditions or in the presence of noise, nor in perception of a non-speech, non-face control. However, children with ASD were significantly less visually influenced than TD controls in the mismatched audiovisual (AV) and speech-reading conditions, and showed less visual gain (AV speech in the presence of auditory noise). To examine whether differential patterns of gaze might underlie these findings, we analyzed participant gaze to the speaking faces. The children with ASD looked significantly less at the face of the speaker overall. When children with ASD did look at a speaker's face, they looked less at the mouth and more at non-focal areas of the face during the speech-reading and AV-speech-in-noise conditions. No group differences were observed in patterns of gaze to the non-speech, non-face controls.

Index Terms: audiovisual speech perception, autism spectrum disorders, eye tracking.


Bibliographic reference. Irwin, Julia R. / Brancazio, Lawrence (2013): "Audiovisual speech perception in children with autism spectrum disorders and typical controls", in AVSP-2013, 65-70.