FAAVSP - The 1st Joint Conference on
Facial Analysis, Animation, and Auditory-Visual Speech Processing
The impact of language impairment on audio-visual integration of speech in noise is examined here by testing the influence of degrading the auditory and the visual speech cues. Fourteen children with specific language impairment (SLI) and 14 age-matched children with typical language development (TLD) identified /aCa/ syllables presented in auditory-only (AO), visual-only (VO) and audiovisual (AV) congruent and incongruent (McGurk stimuli) conditions, embedded in either stationary noise (ST) or amplitude-modulated noise (AM), in a masking release paradigm. Visual cues were either reduced (VR) or clear (VCL). In the AO modality, children with SLI performed more poorly than TLD children in AM noise but not in ST noise, resulting in a weaker masking release effect. In the VO modality, children with SLI performed more poorly in both the VCL and VR conditions. Analyses revealed reduced AV gains in children with SLI compared with control children. In the McGurk trials, children with SLI showed a weaker influence of visual cues on AV perception than the TLD group. Data analysis in the framework of the Fuzzy Logical Model of Perception suggested that children with SLI had preserved integration abilities; the differences from TLD children were instead due to differences in the unisensory modalities. An increased weight of audition in the VR condition compared with the VCL condition was observed in both groups, suggesting that participants gave more weight to audition when the visual input was degraded.
Index Terms: multisensory speech perception, spe
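The Fuzzy Logical Model of Perception mentioned above combines independent unisensory evidence multiplicatively and normalizes over response alternatives. A minimal sketch of the standard two-alternative form is given below; the support values used here are illustrative placeholders, not the fitted parameters from this study.

```python
def flmp_response(a: float, v: float) -> float:
    """Fuzzy Logical Model of Perception (FLMP), two-alternative form.

    a: auditory support for a given response alternative (0..1)
    v: visual support for the same alternative (0..1)
    Returns the predicted probability of choosing that alternative
    on an audiovisual trial, by multiplying the unisensory supports
    and normalizing against the complementary alternative.
    """
    return (a * v) / (a * v + (1.0 - a) * (1.0 - v))


# Illustrative values only: weak auditory evidence combined with
# strong visual evidence yields a response dominated by vision.
p_av = flmp_response(a=0.4, v=0.9)
```

Under this model, group differences in AV identification can arise purely from differences in the unisensory support values (the inputs `a` and `v`) while the integration rule itself remains intact, which is the interpretation the abstract reports for the SLI group.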
Bibliographic reference. Huyse, Aurélie / Berthommier, Frédéric / Leybaert, Jacqueline (2015): "I do not see what you are saying: reduced visual influence on multimodal speech integration in children with SLI", In FAAVSP-2015, 22-27.