Auditory-Visual Speech Processing (AVSP'98)
December 4-6, 1998
The interactions observed in the ventriloquism situation suggest that visual and auditory signals are not processed as independently as the notion of separate senses would imply. A review of the conditions for pairing and of the hypothetical mechanism underlying these interactions argues for cognitive impenetrability and computational autonomy, the pairing rules being the Gestalt principles of common fate and proximity. There is much evidence in support of the view that auditory-visual integration is present early in life. Data from studies of the perinatal period, such as those on neonatal synesthesia, sensory deprivation, and sensory overstimulation, as well as neuroanatomical evidence for transitory intersensory connections in the brain, support the view that sensory modalities are bound together at birth and differentiate later, consistent with experience-expectant development. The discovery, in the superior colliculus of different species, of bimodal neurons governed by spatial and temporal rules similar to those underlying ventriloquism suggests a possible neural substrate. Differences between ventriloquism and speechreading are discussed.
Bibliographic reference. Radeau, Monique (1998): "Auditory-visual interactions in spatial scene analysis: development and neural bases", In AVSP-1998, 97-102.