This project develops a driver simulator that takes into account information about the user's state of mind (attention level, fatigue state, stress state). The analysis of the user's state of mind is based on video data and physiological signals. Facial movements such as eye blinking, yawning and head rotations are detected in the video data and used to evaluate the driver's fatigue and attention levels. The user's electrocardiogram and galvanic skin response are recorded and analyzed to evaluate the driver's stress level. Driver simulator software is modified so that it can react appropriately to these critical situations of fatigue and stress: visual messages are sent to the driver, wheel vibrations are generated, and the driver is expected to react to the alertness messages. A flexible and efficient multi-threaded server architecture is proposed to support multiple messages sent by different modalities. Strategies for data fusion and fission are also provided. Some of these components are integrated within the first prototype of OpenInterface (the Multimodal Similar platform).
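The multi-threaded server described above can be illustrated with a minimal sketch. This is a hypothetical illustration, not the authors' OpenInterface implementation: producer threads stand in for the video and physiological analysis modalities, a queue carries their state events, and a server thread maps critical states to driver alerts (visual message, wheel vibration).

```python
import queue
import threading

# Hypothetical alert mapping (assumption, for illustration only):
# critical user states -> feedback actions described in the abstract.
ALERTS = {
    "fatigue": "visual message + wheel vibration",
    "stress": "visual message",
}

def serve(events: "queue.Queue", alerts: list, n_expected: int) -> None:
    """Server thread: consume n_expected modality events, emit alerts."""
    for _ in range(n_expected):
        modality, state = events.get()
        if state in ALERTS:
            alerts.append((modality, ALERTS[state]))
        events.task_done()

def run_demo() -> list:
    events: "queue.Queue" = queue.Queue()
    alerts: list = []
    server = threading.Thread(target=serve, args=(events, alerts, 3))
    server.start()
    # Producers stand in for the video-analysis and physiological-signal
    # modalities posting detected user states.
    events.put(("video", "fatigue"))
    events.put(("physio", "stress"))
    events.put(("video", "normal"))  # non-critical state: no alert
    server.join()
    return alerts
```

Running `run_demo()` yields one alert per critical state, in arrival order; a real system would keep the server loop running and fuse concurrent modality streams before deciding on an alert.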
Cite as: Benoit, A., Bonnaud, L., Caplier, A., Ngo, P., Lawson, L., Trevisan, D.G., Levacic, V., Mancas, C., Chanel, G. (2005) Multimodal focus attention detection in an augmented driver simulator. Proc. Summer Workshop on Multimodal Interfaces (eINTERFACE 2005), 34-43
@inproceedings{benoit05_einterface,
  author    = {Alexandre Benoit and Laurent Bonnaud and Alice Caplier and Phillipe Ngo and Lionel Lawson and Daniela G. Trevisan and Vjekoslav Levacic and Céline Mancas and Guillaume Chanel},
  title     = {{Multimodal focus attention detection in an augmented driver simulator}},
  year      = {2005},
  booktitle = {Proc. Summer Workshop on Multimodal Interfaces (eINTERFACE 2005)},
  pages     = {34--43}
}