Auditory-Visual Speech Processing (AVSP'99)
August 7-10, 1999
Previous studies of audiovisual speech perception have concentrated mainly on situations in which the auditory information is degraded or presented in noise, or situations in which the visual and auditory information conflict, as in the McGurk effect. Relatively little work has addressed how the availability of visual information might contribute to levels of processing beyond the phonetic. We report here initial results from a series of studies targeting audiovisual processing of lexical and sentential structures. We also present pilot data indicating that visual information may speed reaction times in phoneme monitoring for targets presented in neutral sentence contexts.
Bibliographic reference. Cox, Ethan A. / Norrix, Linda W. / Green, Kerry P. (1999): "The contribution of visual information to on-line sentence processing: Evidence from phoneme monitoring", In AVSP-1999, paper #5.