ISCA Archive Interspeech 2008

Patterns, prototypes, performance: classifying emotional user states

Dino Seppi, Anton Batliner, Björn Schuller, Stefan Steidl, Thurid Vogt, Johannes Wagner, Laurence Devillers, Laurence Vidrascu, Noam Amir, Vered Aharonson

In this paper, we report classification results for emotional user states (4 classes, German database of children interacting with a pet robot). Starting with 5 emotion labels per word, we obtained chunks with different degrees of prototypicality. Six sites computed acoustic and linguistic features independently of one another. A total of 4232 features were pooled and grouped into 10 low-level descriptor types. For each of these groups separately, and for all taken together, we report classification results using Support Vector Machines on the 150 features with the highest individual Information Gain Ratio, across a scale of prototypicality. With both acoustic and linguistic features, we obtained a relative improvement of up to 27.6% when going from low to higher prototypicality.
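The pipeline the abstract describes (rank a large pooled feature set, keep the top 150 by Information Gain Ratio, classify with an SVM) can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' code: scikit-learn offers no Information Gain Ratio scorer, so `mutual_info_classif` stands in for it, and the toy dimensions (50 features, top 10 kept) are placeholders for the paper's 4232 and 150.

```python
# Hedged sketch of top-k feature selection followed by SVM classification.
# mutual_info_classif is a stand-in for Information Gain Ratio, which
# scikit-learn does not provide out of the box.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))    # toy data: 200 chunks, 50 features
y = rng.integers(0, 4, size=200)  # 4 emotion classes, as in the paper
X[:, 0] += y                      # make one feature class-informative

clf = make_pipeline(
    SelectKBest(mutual_info_classif, k=10),  # keep top-10 features (paper: 150)
    SVC(kernel="linear"),
)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

In practice the selection would be run per descriptor group as well as on the pooled set, mirroring the group-wise results reported in the paper.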

doi: 10.21437/Interspeech.2008-193

Cite as: Seppi, D., Batliner, A., Schuller, B., Steidl, S., Vogt, T., Wagner, J., Devillers, L., Vidrascu, L., Amir, N., Aharonson, V. (2008) Patterns, prototypes, performance: classifying emotional user states. Proc. Interspeech 2008, 601-604, doi: 10.21437/Interspeech.2008-193

@inproceedings{seppi08_interspeech,
  author={Dino Seppi and Anton Batliner and Björn Schuller and Stefan Steidl and Thurid Vogt and Johannes Wagner and Laurence Devillers and Laurence Vidrascu and Noam Amir and Vered Aharonson},
  title={{Patterns, prototypes, performance: classifying emotional user states}},
  year=2008,
  booktitle={Proc. Interspeech 2008},
  pages={601--604},
  doi={10.21437/Interspeech.2008-193}
}