Non-driving-related cognitive load and variations in emotional state may impair a driver's capability to control a vehicle and introduce driving errors. The availability of reliable cognitive load and emotion detection in drivers would benefit the design of active safety systems and other intelligent in-vehicle interfaces. In this study, speech produced by 68 subjects while driving in urban areas is analyzed. A particular focus is on speech production differences across two secondary cognitive tasks, interactions with a co-driver and calls to automated spoken dialog systems (SDS), and across two emotional states during the SDS interactions (neutral and negative). A number of speech parameters are found to vary across the cognitive/emotion classes. The suitability of selected spectral- and production-based features for automatic cognitive task/emotion classification is investigated. A fusion of GMM/SVM classifiers yields an accuracy of 89% in cognitive task classification and 76% in emotion classification.
Bibliographic reference. Bořil, Hynek / Sadjadi, Seyed Omid / Kleinschmidt, Tristan / Hansen, John H. L. (2010): "Analysis and detection of cognitive load and frustration in drivers' speech", In INTERSPEECH-2010, 502-505.
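The abstract reports results from a fusion of GMM and SVM classifiers but does not detail the fusion scheme. A common approach in speaker/emotion classification is score-level fusion: each branch produces a per-sample score (a GMM log-likelihood ratio between class models, and an SVM decision-function distance), and the normalized scores are combined by a weighted sum. The sketch below illustrates this general technique on synthetic two-class data; the feature dimensions, GMM component counts, and equal fusion weights are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for per-utterance speech features (two classes,
# e.g. neutral vs. negative); real systems would use spectral features.
X0 = rng.normal(0.0, 1.0, size=(100, 8))
X1 = rng.normal(1.5, 1.0, size=(100, 8))
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

# GMM branch: fit one GMM per class; the score is the
# log-likelihood ratio of class 1 vs. class 0.
gmm0 = GaussianMixture(n_components=2, random_state=0).fit(X0)
gmm1 = GaussianMixture(n_components=2, random_state=0).fit(X1)
gmm_score = gmm1.score_samples(X) - gmm0.score_samples(X)

# SVM branch: the score is the signed distance to the decision boundary.
svm = SVC(kernel="rbf", gamma="scale").fit(X, y)
svm_score = svm.decision_function(X)

def znorm(s):
    """Zero-mean, unit-variance normalization so scores are comparable."""
    return (s - s.mean()) / s.std()

# Score-level fusion: weighted sum of normalized branch scores.
fused = 0.5 * znorm(gmm_score) + 0.5 * znorm(svm_score)
pred = (fused > 0).astype(int)
acc = (pred == y).mean()
```

In practice the fusion weights would be tuned on a held-out development set, and evaluation would use data disjoint from training; here both branches are scored on their training data purely to keep the sketch short.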