International Conference on Auditory-Visual Speech Processing 2008

Tangalooma Wild Dolphin Resort, Moreton Island, Queensland, Australia
September 26-29, 2008

Hearing a Talking Face: An Auditory Influence on a Visual Detection Task

Jeesun Kim, Christian Kroos, Chris Davis

MARCS Auditory Laboratories, University of Western Sydney, Australia

Parsing of information from the world into objects and events occurs in both the visual and auditory modalities. It has been suggested that visual and auditory scene perception involve similar principles of perceptual organization. This study investigated cross-modal scene perception by determining whether an auditory stimulus could facilitate visual object segregation. Specifically, we examined whether the presentation of matched auditory speech would facilitate the detection of a point-light talking face amid point-light distractors. An adaptive staircase procedure (3-up-1-down rule) was used to estimate the 79% correct threshold in a two-alternative forced-choice (2AFC) procedure. To determine whether different degrees of speech motion would produce differently sized auditory influences, three speech modes were tested (speech in quiet, whispered speech, and Lombard speech). A facilitatory auditory effect on talking-face detection was found; the size of this effect did not differ across the speech modes.
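The staircase rule named in the abstract can be sketched as follows. This is a generic transformed up/down staircase of the kind described by Levitt (1971), not the authors' actual experimental code: three consecutive correct responses make the stimulus harder, one error makes it easier, so the track converges near the ~79% correct point used in the study. All function and variable names here are illustrative assumptions.

```python
import random


def staircase_threshold(respond, start=1.0, step=0.05, n_reversals=8):
    """Transformed up/down staircase for a 2AFC detection task.

    `respond(level)` must return True for a correct response at the
    given stimulus level.  Three correct responses in a row step the
    level down (harder); a single error steps it up (easier).  The
    threshold is estimated as the mean level at the final reversals.
    """
    level = start
    correct_run = 0
    last_dir = 0          # +1 = last step was up, -1 = down, 0 = none yet
    reversals = []
    while len(reversals) < n_reversals:
        if respond(level):
            correct_run += 1
            if correct_run == 3:          # 3 correct -> step down
                correct_run = 0
                if last_dir == +1:        # direction changed: a reversal
                    reversals.append(level)
                last_dir = -1
                level = max(level - step, 0.0)
        else:
            correct_run = 0               # 1 error -> step up
            if last_dir == -1:
                reversals.append(level)
            last_dir = +1
            level += step
    # Average the last few reversal levels as the threshold estimate.
    tail = reversals[-6:]
    return sum(tail) / len(tail)


# Hypothetical simulated observer: accuracy rises from chance (0.5 in
# 2AFC) toward ceiling as the stimulus level increases.
random.seed(0)

def observer(level):
    p = 0.5 + 0.5 * min(level / 0.5, 1.0)
    return random.random() < p

thr = staircase_threshold(observer, start=1.0, step=0.05)
```

In the experiment itself the "level" would correspond to the visual detectability of the point-light talking face among distractors, with the staircase tracking the 79% correct detection threshold for each speech mode.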


Bibliographic reference.  Kim, Jeesun / Kroos, Christian / Davis, Chris (2008): "Hearing a talking face: an auditory influence on a visual detection task", In AVSP-2008, 107-110.