Interspeech'2005 - Eurospeech
Few studies have examined emotion encoding in speech in the articulatory domain. In this report, we collect and analyze movement data of the tongue tip, the jaw, and the lower lip, along with speech, and investigate differences in speech articulation among four emotion types: neutral, anger, sadness, and happiness. The effectiveness of the articulatory parameters for emotion classification is also investigated. We observed that tongue tip, jaw, and lip positioning become more advanced when speech is emotionally charged; this tendency was especially prominent for the tongue tip and jaw movements associated with sad speech. Angry speech was characterized by greater ranges of displacement and velocity, while the opposite held for sad speech. Happy speech was comparable to neutral speech in articulation, but showed the widest range of pitch variation; it remains to be seen whether there is a trade-off between articulatory activity and voicing activity in emotional speech production. Multiple discriminant analysis showed that emotion is classified better in the articulatory domain. One probable reason is that the independence in the manipulation of each articulator may provide more degrees of freedom and less overlap in the articulatory parameter space. The vowel /IY/ was less responsive to emotional changes than the other peripheral vowels, illustrating that the articulatory configuration associated with a vowel determines how emotion affects that vowel in the acoustic domain. The effects of emotion on each articulatory parameter were fairly systematic across vowels; it is not yet clear whether this is a general tendency in emotional speech production or a speaker-dependent characteristic.
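To make the classification idea concrete, the sketch below classifies a synthetic articulatory feature vector by distance to per-emotion class centroids. The feature values and two-sample training sets are invented for illustration and are not the paper's data; a nearest-centroid rule is used here as a simplified stand-in for the multiple discriminant analysis applied in the study.

```python
# Simplified sketch: emotion classification from articulatory features.
# All numbers are synthetic illustrations, NOT data from the study; a
# nearest-centroid rule stands in for full multiple discriminant analysis.
import math

# hypothetical features per token: (tongue-tip advancement,
# jaw displacement range, peak velocity), arbitrary normalized units
training_data = {
    "neutral":   [(0.40, 0.50, 0.50), (0.42, 0.48, 0.52)],
    "anger":     [(0.55, 0.80, 0.85), (0.57, 0.78, 0.88)],
    "sadness":   [(0.70, 0.30, 0.25), (0.68, 0.32, 0.28)],
    "happiness": [(0.45, 0.52, 0.55), (0.47, 0.50, 0.53)],
}

def centroid(vectors):
    """Mean feature vector of a list of equal-length vectors."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n
                 for i in range(len(vectors[0])))

def classify(sample, data):
    """Assign the sample to the emotion with the nearest class centroid."""
    best, best_dist = None, math.inf
    for emotion, vectors in data.items():
        d = math.dist(sample, centroid(vectors))
        if d < best_dist:
            best, best_dist = emotion, d
    return best

print(classify((0.69, 0.31, 0.26), training_data))  # prints "sadness"
```

In this toy setup, the more advanced tongue-tip position and reduced displacement and velocity ranges place the test token nearest the sad-speech centroid, mirroring the qualitative pattern the abstract reports.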
Bibliographic reference: Lee, Sungbok / Yildirim, Serdar / Kazemzadeh, Abe / Narayanan, Shrikanth (2005): "An articulatory study of emotional speech production", in INTERSPEECH-2005, pp. 497-500.