An Affect Prediction Approach Through Depression Severity Parameter Incorporation in Neural Networks

Rahul Gupta, Saurabh Sahu, Carol Espy-Wilson, Shrikanth S. Narayanan


Humans use emotional expressions to communicate their internal affective states. These behavioral expressions are often multi-modal (e.g. facial expressions, voice, and gestures), and researchers have proposed several schemes to predict the latent affective states from these expressions. The relationship between the latent affective states and their expression is hypothesized to be affected by several factors, depression disorder being one of them. Despite wide interest in affect prediction, and several studies documenting the effect of depression on affective expressions, only a limited number of affect prediction models account for depression severity. In this work, we present a novel scheme that incorporates depression severity as a parameter in Deep Neural Networks (DNNs). To predict affective dimensions for the individual at hand, our scheme alters the DNN activation function based on the subject's depression severity. We perform experiments on affect prediction in two different sessions of the Audio-Visual Depressive language Corpus, which involves patients with varying degrees of depression. Our results show improvements in arousal and valence prediction in both sessions using the proposed DNN modeling. We also present an analysis of the impact of such an alteration in DNNs during training and testing.
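The abstract does not specify the exact functional form of the severity-dependent activation, but the core idea of conditioning a DNN's nonlinearity on a per-subject scalar can be sketched as follows. This is a minimal illustration, not the authors' implementation: the slope-scaling sigmoid, the `alpha` hyperparameter, and the single-hidden-layer regression head are all assumptions made here for concreteness.

```python
import numpy as np

def severity_sigmoid(x, d, alpha=1.0):
    """Hypothetical severity-modulated sigmoid.

    The slope of the standard sigmoid is scaled by the subject's
    normalized depression severity d (e.g. a rescaled BDI-II score),
    so subjects with different severities pass through different
    effective activation functions. alpha controls how strongly
    severity modulates the slope.
    """
    return 1.0 / (1.0 + np.exp(-(1.0 + alpha * d) * x))

def forward(x, weights, biases, d):
    """Forward pass of a one-hidden-layer network whose hidden
    activation is conditioned on the depression severity d; the
    linear output layer regresses affective dimensions such as
    arousal and valence."""
    h = severity_sigmoid(weights[0] @ x + biases[0], d)
    return weights[1] @ h + biases[1]
```

At test time, the same trained weights are used for every subject, while `d` is set from that subject's depression severity, so the network's input-output mapping adapts per individual without per-subject retraining.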


DOI: 10.21437/Interspeech.2017-120

Cite as: Gupta, R., Sahu, S., Espy-Wilson, C., Narayanan, S.S. (2017) An Affect Prediction Approach Through Depression Severity Parameter Incorporation in Neural Networks. Proc. Interspeech 2017, 3122-3126, DOI: 10.21437/Interspeech.2017-120.


@inproceedings{Gupta2017,
  author={Rahul Gupta and Saurabh Sahu and Carol Espy-Wilson and Shrikanth S. Narayanan},
  title={An Affect Prediction Approach Through Depression Severity Parameter Incorporation in Neural Networks},
  year={2017},
  booktitle={Proc. Interspeech 2017},
  pages={3122--3126},
  doi={10.21437/Interspeech.2017-120},
  url={http://dx.doi.org/10.21437/Interspeech.2017-120}
}