Recent studies have demonstrated that lip and jaw movements during speech may provide important information for the diagnosis of amyotrophic lateral sclerosis (ALS) and for understanding its progression. A thorough investigation of these movements is essential for the development of intelligent video- or optically-based facial tracking systems that could assist with early diagnosis and progression monitoring. In this paper, we investigated the potential of a novel, expanded set of kinematic features obtained from the lips and jaw to classify articulatory data into three stages of bulbar disease progression (i.e., pre-symptomatic, early symptomatic, and late symptomatic). Feature selection methods (Relief-F and mRMR) and a classification algorithm (support vector machine, SVM) were used for this purpose. Results showed that even with a limited number of kinematic features it was possible to obtain good classification accuracy (nearly 80%). Given the recent development of video-based markerless methods for tracking speech movements, these results provide a strong rationale for the development of portable, inexpensive systems for monitoring orofacial function in ALS.
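The pipeline described above (rank kinematic features, keep a small subset, classify into three bulbar stages with an SVM) can be sketched as follows. This is a hedged illustration only: the paper's actual Relief-F/mRMR selectors and kinematic data are not reproduced here; mutual information serves as a stand-in ranking criterion, and the data, sample sizes, and `k=5` subset size are synthetic assumptions.

```python
# Illustrative sketch of a feature-selection + SVM pipeline, NOT the
# authors' implementation. Relief-F/mRMR are replaced by a mutual-
# information ranking (available in scikit-learn); data are synthetic.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_samples, n_features = 120, 30          # hypothetical: recordings x kinematic features
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, 3, size=n_samples)   # 0=pre-symptomatic, 1=early, 2=late symptomatic
X[:, 0] += y                             # make one feature weakly informative

clf = make_pipeline(
    StandardScaler(),                        # scale features before the SVM
    SelectKBest(mutual_info_classif, k=5),   # keep a limited feature subset
    SVC(kernel="rbf"),                       # three-class SVM (one-vs-one internally)
)
scores = cross_val_score(clf, X, y, cv=5)    # 5-fold cross-validated accuracy
print(round(scores.mean(), 2))
```

Cross-validated accuracy is reported rather than a single train/test split, since small clinical datasets make single splits unreliable.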
Cite as: Bandini, A., Green, J.R., Zinman, L., Yunusova, Y. (2017) Classification of Bulbar ALS from Kinematic Features of the Jaw and Lips: Towards Computer-Mediated Assessment. Proc. Interspeech 2017, 1819-1823, doi: 10.21437/Interspeech.2017-478
@inproceedings{bandini17b_interspeech,
  author    = {Andrea Bandini and Jordan R. Green and Lorne Zinman and Yana Yunusova},
  title     = {{Classification of Bulbar ALS from Kinematic Features of the Jaw and Lips: Towards Computer-Mediated Assessment}},
  year      = {2017},
  booktitle = {Proc. Interspeech 2017},
  pages     = {1819--1823},
  doi       = {10.21437/Interspeech.2017-478}
}