INTERSPEECH 2011
12th Annual Conference of the International Speech Communication Association

Florence, Italy
August 27-31, 2011

Using Crowdsourcing to Provide Prosodic Annotations for Non-Native Speech

Keelan Evanini, Klaus Zechner

Educational Testing Service, USA

We present the results of an experiment in which 2 expert and 11 naive annotators provided prosodic annotations for stress and boundary tones on a corpus of spontaneous speech produced by non-native speakers of English. The results show that agreement rates were higher for boundary tones than for stress. In addition, a crowdsourcing approach was implemented to combine the naive annotations to increase accuracy. The crowdsourcing approach was able to match expert agreement for stress (62.1%) with 3 naive annotators, and come within 7.2% of expert agreement for boundary tones (82.4%) with 11 naive annotators. This experiment also demonstrates that noticeable improvements in naive annotations can be obtained with a small amount of additional training.
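The abstract does not specify how the naive annotations were combined, but a common crowdsourcing baseline for this kind of label aggregation is simple majority voting, followed by measuring agreement with the expert labels. The sketch below illustrates that idea under those assumptions; the label inventory ("S"/"U" for stressed/unstressed) and all data values are hypothetical, not from the paper.

```python
from collections import Counter

def majority_vote(labels):
    """Combine one token's labels from several naive annotators
    by simple majority vote (ties broken by first-seen label)."""
    return Counter(labels).most_common(1)[0][0]

def agreement(pred, gold):
    """Fraction of tokens where the combined labels match the expert labels."""
    return sum(p == g for p, g in zip(pred, gold)) / len(gold)

# Hypothetical stress labels ("S" = stressed, "U" = unstressed) from
# three naive annotators over five tokens, plus expert gold labels.
naive = [
    ["S", "U", "S", "U", "S"],
    ["S", "S", "S", "U", "U"],
    ["U", "U", "S", "U", "S"],
]
gold = ["S", "U", "S", "U", "S"]

# Aggregate per token across annotators, then score against the expert.
combined = [majority_vote(token_labels) for token_labels in zip(*naive)]
print(agreement(combined, gold))  # → 1.0 on this toy data
```

More annotators make ties rarer and the vote more reliable, which is consistent with the abstract's finding that accuracy improves as naive annotators are added.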


Bibliographic reference: Evanini, Keelan / Zechner, Klaus (2011): "Using crowdsourcing to provide prosodic annotations for non-native speech", in INTERSPEECH-2011, 3069-3072.