An experiment is reported that extends the work of Grant and Seitz [1]. Grant and Seitz employed an elegant method for measuring AV effects using a detection paradigm; however, the force of their results was diminished because (i) they used only a small number of test sentences (three), and (ii) a key feature of their results, the relationship between the magnitude of masking release and the degree of correlation between acoustic (intensity) and visual (lip movement) properties of speech, was identified post hoc. In this paper we used eight stimuli that were explicitly selected to contrast the degree of correlation between speech intensity and lip movement. In order to minimize expectancy effects, the speech materials were in a language unknown to the participants, and a method of constant stimuli was adopted. The results supported those of Grant and Seitz: seeing the face of the speaker facilitated detection, and this facilitation was greatest for stimuli in which the correlation between F3 and inter-lip distance was high.
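As an illustrative aside (not from the paper itself), the audio-visual correlation measure referred to above can be sketched as a Pearson correlation between frame-aligned acoustic and visual tracks. The function name audio_visual_correlation and the synthetic F3 and lip-distance data below are hypothetical placeholders, assuming both signals are sampled at the same frame rate.

# Minimal sketch of a per-stimulus audio-visual correlation measure:
# Pearson r between an acoustic track (e.g. frame-by-frame intensity or
# an F3 estimate) and a visual track (inter-lip distance), both sampled
# at the same frames. Names and data here are illustrative only.
import numpy as np
from scipy.stats import pearsonr

def audio_visual_correlation(acoustic_track, inter_lip_distance):
    """Return Pearson r between two equal-length frame sequences."""
    acoustic_track = np.asarray(acoustic_track, dtype=float)
    inter_lip_distance = np.asarray(inter_lip_distance, dtype=float)
    if acoustic_track.shape != inter_lip_distance.shape:
        raise ValueError("tracks must be sampled at the same frames")
    r, _ = pearsonr(acoustic_track, inter_lip_distance)
    return r

# Synthetic example: a 'high-correlation' stimulus would yield r near 1.
t = np.linspace(0.0, 2.0, 100)
lips = 0.5 + 0.4 * np.sin(2 * np.pi * 3 * t)  # mouth opening over time (arbitrary units)
f3 = 2500 + 300 * np.sin(2 * np.pi * 3 * t) + np.random.default_rng(0).normal(0, 20, t.size)  # noisy F3 track (Hz)
print(f"r = {audio_visual_correlation(f3, lips):.2f}")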
Cite as: Kim, J., Davis, C. (2001) Visible speech cues and auditory detection of spoken sentences: an effect of degree of correlation between acoustic and visual properties. Proc. Auditory-Visual Speech Processing, 127-131
@inproceedings{kim01b_avsp,
  author={Jeesun Kim and Chris Davis},
  title={{Visible speech cues and auditory detection of spoken sentences: an effect of degree of correlation between acoustic and visual properties}},
  year=2001,
  booktitle={Proc. Auditory-Visual Speech Processing},
  pages={127--131}
}