AVSP 2003 - International Conference on Audio-Visual Speech Processing

September 4-7, 2003
St. Jorioz, France

Early Processing of Visual Speech Information Modulates the Subsequent Processing of Auditory Speech Input at a Pre-Attentive Level: Evidence from Event-Related Brain Potential Data

Riadh Lebib, David Papo, Abdel Douiri, Stella de Bode, Pierre-Marie Baudonniere

(1) Departamento de Psicología Cognitiva, Universidad de La Laguna, Tenerife, Spain
(2) Neurosciences Cognitives et Imagerie Cérébrale, LENA CNRS UPR 640, Paris, France

In the present study, we examined the degree of attenuation or augmentation of the sensory response to congruent and non-congruent audiovisual speech stimuli, as reflected by P50 amplitudes, in normal volunteers. We also analyzed the early response to mouth-movement onset, and dipole modeling was conducted to identify the specific brain structures activated during early visual speech processing. The specific aims of the study were a) to show that speech processing starts as soon as lip movements occur, b) to provide evidence that the brain can detect changes in incoming bimodal speech stimuli at a pre-attentive or very early attentive stage of information processing, as reflected in the P50 component, and c) to confirm that this early detection depends on the congruency status and the discriminability level of the audiovisual speech input. Finally, the goal of this project was to generalize the concept of sensory gating to "real-life" stimuli, that is, stimuli that are semantically relevant and multimodal.


Full Paper

Bibliographic reference.  Lebib, Riadh / Papo, David / Douiri, Abdel / Bode, Stella de / Baudonniere, Pierre-Marie (2003): "Early processing of visual speech information modulates the subsequent processing of auditory speech input at a pre-attentive level: Evidence from event-related brain potential data", In AVSP 2003, 3-8.