Acoustic cue variability affects eye movement behaviour during non-native speech perception

Jessie S. Nixon, Catherine T. Best


A fundamental question in speech research is how listeners use continuous (non-discrete) acoustic cues to discriminate between discrete alternative messages. An important factor is the statistical distribution of acoustic cues in speech. Previous research has shown that when native speakers listen to speech with high within-category variability in the discriminative cue dimension, perceptual uncertainty increases, resulting in increased looks to competitor objects. The present study investigated effects of within-category acoustic variability on eye movements during acquisition of a non-native acoustic dimension, namely English speakers' acquisition of lexical tone. All participants heard a bimodal distribution of stimuli, with distribution peaks at the prototypical pitch values for Cantonese high and mid level tones; however, presentation frequency differed between conditions: high-variance vs. low-variance. Based on previous research, we expected lower uncertainty and better learning in the low-variance condition. Generalised additive mixed models (GAMMs) showed that towards the end of the experiment, fixations were closer to the target object in the low-variance condition than in the high-variance condition. This suggests that within-category acoustic variability not only increases uncertainty for native listeners, but may also initially hinder learning of acoustic cues during non-native language acquisition.
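The bimodal stimulus design described above can be sketched in code. This is a minimal illustration only: the peak F0 values and the per-condition standard deviations below are assumed for the sake of the example and are not the values used in the study.

```python
import numpy as np

# Assumed prototypical F0 peaks for the two tone categories (Hz).
# These specific numbers are hypothetical, chosen only to illustrate
# the high- vs low-variance bimodal distribution manipulation.
HIGH_TONE_PEAK_HZ = 220.0
MID_TONE_PEAK_HZ = 190.0

def sample_bimodal(n_per_category, sd_hz, rng):
    """Draw n stimuli per tone category from a Gaussian centred on each peak.

    Both conditions share the same two distribution peaks; only the
    within-category spread (sd_hz) differs between conditions.
    """
    high = rng.normal(HIGH_TONE_PEAK_HZ, sd_hz, n_per_category)
    mid = rng.normal(MID_TONE_PEAK_HZ, sd_hz, n_per_category)
    return np.concatenate([high, mid])

rng = np.random.default_rng(42)
low_variance = sample_bimodal(100, sd_hz=2.0, rng=rng)   # tight clusters
high_variance = sample_bimodal(100, sd_hz=8.0, rng=rng)  # broader, more overlap

# Within-category spread differs, but the category peaks are identical,
# mirroring the high- vs low-variance conditions described in the abstract.
print(low_variance.std(), high_variance.std())
```

With a smaller within-category standard deviation, the two pitch clusters are well separated; with a larger one, the categories overlap more, which is the kind of within-category variability hypothesised to increase perceptual uncertainty.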


DOI: 10.21437/SpeechProsody.2018-100

Cite as: Nixon, J.S., Best, C.T. (2018) Acoustic cue variability affects eye movement behaviour during non-native speech perception. Proc. 9th International Conference on Speech Prosody 2018, 493-497, DOI: 10.21437/SpeechProsody.2018-100.


@inproceedings{Nixon2018,
  author={Jessie S. Nixon and Catherine T. Best},
  title={Acoustic cue variability affects eye movement behaviour during non-native speech perception},
  year={2018},
  booktitle={Proc. 9th International Conference on Speech Prosody 2018},
  pages={493--497},
  doi={10.21437/SpeechProsody.2018-100},
  url={http://dx.doi.org/10.21437/SpeechProsody.2018-100}
}