Auditory-Visual Speech Processing (AVSP'98)
December 4-6, 1998
This study examined the manner of audiovisual speech perception, using the 'McGurk effect', when the speakers were foreigners. The McGurk effect demonstrates that visual (lip movement) information is used during speech perception even when it is discrepant with auditory information. Subjects, 17 Chinese and 23 Japanese, reported what they heard while looking at and listening to the speakers' faces on the monitor. There were 4 speakers, 2 Chinese and 2 Japanese. All stimuli were monosyllabic utterances. Half of the stimuli were audio-visually compatible, and the other half were audio-visually incompatible. The results indicate that the Japanese subjects relied more on visual information in speech perception when the speakers were foreigners. The Chinese subjects, however, did not show such a language asymmetry.
Bibliographic reference. Hayashi, Yasuko / Sekiyama, Kaoru (1998): "Native-foreign language effect in the McGurk effect: a test with Chinese and Japanese", In AVSP-1998, 61-66.