11th Annual Conference of the International Speech Communication Association

Makuhari, Chiba, Japan
September 26-30, 2010

Analysis and Detection of Cognitive Load and Frustration in Drivers' Speech

Hynek Bořil (1), Seyed Omid Sadjadi (1), Tristan Kleinschmidt (2), John H. L. Hansen (1)

(1) University of Texas at Dallas, USA
(2) Queensland University of Technology, Australia

Non-driving-related cognitive load and variations in emotional state may impair a driver's ability to control a vehicle and introduce driving errors. Reliable detection of cognitive load and emotion in drivers would benefit the design of active safety systems and other intelligent in-vehicle interfaces. In this study, speech produced by 68 subjects while driving in urban areas is analyzed. A particular focus is on speech production differences between two secondary cognitive tasks, interactions with a co-driver and calls to automated spoken dialog systems (SDS), and between two emotional states during the SDS interactions: neutral and negative. A number of speech parameters are found to vary across the cognitive/emotion classes. The suitability of selected spectral- and production-based features for automatic cognitive task/emotion classification is investigated. A fusion of GMM/SVM classifiers yields an accuracy of 89% in cognitive task classification and 76% in emotion classification.
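The abstract reports a fused GMM/SVM back-end but does not specify the fusion rule. A common approach is weighted score-level fusion, where each classifier's decision score is linearly combined before thresholding. The sketch below illustrates that scheme only; the weight, threshold, and function names are illustrative assumptions, not the paper's actual configuration.

```python
# Hypothetical sketch of score-level classifier fusion (NOT the paper's
# exact method): linearly combine a GMM log-likelihood-ratio score and
# an SVM decision score, then threshold the fused score.

def fuse_scores(gmm_score: float, svm_score: float, alpha: float = 0.6) -> float:
    """Weighted sum of the two classifier scores; alpha (assumed value)
    controls the GMM contribution."""
    return alpha * gmm_score + (1.0 - alpha) * svm_score


def classify(gmm_score: float, svm_score: float,
             threshold: float = 0.0, alpha: float = 0.6) -> int:
    """Return 1 for the positive class (e.g., SDS task or negative
    emotion) when the fused score exceeds the threshold, else 0."""
    return 1 if fuse_scores(gmm_score, svm_score, alpha) > threshold else 0
```

In practice the scores from each classifier would first be normalized to a comparable range, and the weight and threshold tuned on a held-out development set.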

Full Paper

Bibliographic reference.  Bořil, Hynek / Sadjadi, Seyed Omid / Kleinschmidt, Tristan / Hansen, John H. L. (2010): "Analysis and detection of cognitive load and frustration in drivers' speech", In INTERSPEECH-2010, 502-505.