Preference-Learning with Qualitative Agreement for Sentence Level Emotional Annotations

Srinivas Parthasarathy, Carlos Busso


The perceptual evaluation of emotional attributes is noisy due to inconsistencies between annotators. The low inter-evaluator agreement arises from the complex nature of emotions. Conventional approaches average the scores provided by multiple annotators. While this approach reduces the influence of dissident annotations, previous studies have shown the value of considering individual evaluations to better capture the underlying ground truth. One of these approaches is the qualitative agreement (QA) method, which provides an alternative framework that captures the inherent trends among the annotators. While previous studies have focused on using the QA method for time-continuous annotations from a fixed number of annotators, most emotional databases are annotated with attributes at the sentence level (e.g., one global score per sentence). This study proposes a novel formulation based on the QA framework to estimate reliable sentence-level annotations for preference learning. The proposed relative labels between pairs of sentences capture consistent trends across evaluators. The experimental evaluation shows that preference-learning methods to rank-order emotional attributes trained with the proposed QA-based labels achieve significantly better performance than the same algorithms trained with relative scores obtained by averaging absolute scores across annotators. These results show the benefits of QA-based labels for preference learning using sentence-level annotations.
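The core idea of deriving relative labels from individual evaluations might be sketched as follows. This is a hypothetical illustration, not the paper's exact formulation: the function name, the agreement margin, and the unanimity rule are all assumptions made for the example.

```python
# Hypothetical sketch: derive a QA-style relative label for a pair of
# sentences from the per-annotator scores, instead of comparing the
# averaged absolute scores.
def qa_pairwise_label(scores_a, scores_b, margin=0.5):
    """Return 1 if every annotator rates sentence A above sentence B by
    more than `margin`, -1 for the reverse, and None when the individual
    trends are inconsistent (the pair is then discarded)."""
    prefers_a = sum(1 for a, b in zip(scores_a, scores_b) if a - b > margin)
    prefers_b = sum(1 for a, b in zip(scores_a, scores_b) if b - a > margin)
    n = len(scores_a)
    if prefers_a == n:   # consistent trend: A preferred by all annotators
        return 1
    if prefers_b == n:   # consistent trend: B preferred by all annotators
        return -1
    return None          # annotators disagree: no reliable relative label
```

Under this sketch, a pair such as `([5, 4, 5], [2, 1, 3])` yields a consistent preference, whereas `([5, 1], [1, 5])` is discarded, which is how inconsistent evaluations are kept out of the preference-learning training set.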


 DOI: 10.21437/Interspeech.2018-2478

Cite as: Parthasarathy, S., Busso, C. (2018) Preference-Learning with Qualitative Agreement for Sentence Level Emotional Annotations. Proc. Interspeech 2018, 252-256, DOI: 10.21437/Interspeech.2018-2478.


@inproceedings{Parthasarathy2018,
  author={Srinivas Parthasarathy and Carlos Busso},
  title={Preference-Learning with Qualitative Agreement for Sentence Level Emotional Annotations},
  year=2018,
  booktitle={Proc. Interspeech 2018},
  pages={252--256},
  doi={10.21437/Interspeech.2018-2478},
  url={http://dx.doi.org/10.21437/Interspeech.2018-2478}
}