In dyadic human interactions, mutual influence, i.e., a person's influence on the interacting partner's behaviors, is known to be important and can be incorporated into the modeling framework for characterizing and automatically recognizing the participants' states. We propose a Dynamic Bayesian Network (DBN) to explicitly model the conditional dependency between two interacting partners' emotion states in a dialog, using data from the IEMOCAP corpus of expressive dyadic spoken interactions. We also focus on automatically computing the Valence-Activation emotion attributes to obtain a continuous characterization of the participants' emotion flow. Our proposed DBN models the temporal dynamics of the emotion states as well as the mutual influence between speakers in a dialog. With speech-based features, the proposed network improves classification accuracy by 3.67% absolute (7.12% relative) over a Gaussian Mixture Model (GMM) baseline that performs isolated turn-by-turn emotion classification.
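To make the mutual-influence idea concrete, the sketch below is a minimal illustration (not the authors' implementation) of a DBN-style decoder in which each speaker's emotion state at turn t is conditioned on both speakers' states at turn t-1, and decoding is done jointly over the two interlocutors. The 4-way discretization of the Valence-Activation plane, the feature dimensionality, the unit-variance Gaussian emissions standing in for the trained GMMs, and all parameter values are synthetic assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical discretization of the Valence-Activation plane into 4 states
# per speaker (low/high valence x low/high activation); the paper's actual
# attribute quantization and learned parameters are not reproduced here.
STATES = ["V-A-", "V-A+", "V+A-", "V+A+"]
S = len(STATES)          # states per speaker
D = 6                    # toy acoustic feature dimensionality (assumption)

def random_cpd(shape):
    """Random conditional probability table, normalized over the last axis."""
    t = rng.random(shape)
    return t / t.sum(axis=-1, keepdims=True)

# Mutual-influence transitions: each speaker's next state depends on BOTH
# speakers' previous states, e.g. P(s_A[t] | s_A[t-1], s_B[t-1]).
trans_A = random_cpd((S, S, S))   # indexed [prev_A, prev_B, next_A]
trans_B = random_cpd((S, S, S))   # indexed [prev_B, prev_A, next_B]
prior_A = np.full(S, 1.0 / S)
prior_B = np.full(S, 1.0 / S)

# Per-state unit-variance Gaussian emissions stand in for the speech-feature GMMs.
means_A, means_B = rng.normal(size=(S, D)), rng.normal(size=(S, D))

def log_emission(x, means):
    """Log-likelihood of feature vector x under each state's emission model."""
    return -0.5 * ((x[None, :] - means) ** 2).sum(axis=1)

def viterbi_joint(obs_A, obs_B):
    """Decode the most likely joint (speaker A, speaker B) emotion-state sequence."""
    T = len(obs_A)
    log_delta = np.full((T, S, S), -np.inf)        # joint product-state trellis
    back = np.zeros((T, S, S, 2), dtype=int)
    log_delta[0] = (np.log(prior_A)[:, None] + np.log(prior_B)[None, :]
                    + log_emission(obs_A[0], means_A)[:, None]
                    + log_emission(obs_B[0], means_B)[None, :])
    for t in range(1, T):
        e = (log_emission(obs_A[t], means_A)[:, None]
             + log_emission(obs_B[t], means_B)[None, :])
        for a in range(S):
            for b in range(S):
                # Score of moving from every previous joint state (pa, pb) to (a, b).
                score = (log_delta[t - 1]
                         + np.log(trans_A[:, :, a])        # P(a | pa, pb)
                         + np.log(trans_B[:, :, b]).T)     # P(b | pb, pa)
                pa, pb = np.unravel_index(np.argmax(score), (S, S))
                log_delta[t, a, b] = score[pa, pb] + e[a, b]
                back[t, a, b] = (pa, pb)
    # Backtrace the best joint path.
    a, b = np.unravel_index(np.argmax(log_delta[-1]), (S, S))
    path = [(a, b)]
    for t in range(T - 1, 0, -1):
        a, b = back[t, a, b]
        path.append((a, b))
    path.reverse()
    return [(STATES[a], STATES[b]) for a, b in path]

# Toy dialog: 5 turns of synthetic speech features for each interlocutor.
obs_A = rng.normal(size=(5, D))
obs_B = rng.normal(size=(5, D))
print(viterbi_joint(obs_A, obs_B))
```

Removing the dependence on the partner's previous state from the transition tables reduces this to two independent per-speaker chains, which is one way to see what the mutual-influence links add over isolated turn-by-turn classification.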
Cite as: Lee, C.-C., Busso, C., Lee, S., Narayanan, S.S. (2009) Modeling mutual influence of interlocutor emotion states in dyadic spoken interactions. Proc. Interspeech 2009, 1983-1986, doi: 10.21437/Interspeech.2009-480
@inproceedings{lee09e_interspeech,
  author={Chi-Chun Lee and Carlos Busso and Sungbok Lee and Shrikanth S. Narayanan},
  title={{Modeling mutual influence of interlocutor emotion states in dyadic spoken interactions}},
  year=2009,
  booktitle={Proc. Interspeech 2009},
  pages={1983--1986},
  doi={10.21437/Interspeech.2009-480}
}