Automatic Hierarchical Attention Neural Network for Detecting AD

Yilin Pan, Bahman Mirheidari, Markus Reuber, Annalena Venneri, Daniel Blackburn, Heidi Christensen


Picture description tasks are used for the detection of cognitive decline associated with Alzheimer’s disease (AD). Recent years have seen work on automatic AD detection in picture descriptions based on acoustic and word-based analysis of the speech. These methods have shown some success but lack the ability to capture higher-level effects of cognitive decline on the patient’s language. In this paper, we propose a novel model that encompasses both the hierarchical and sequential structure of the description and detects its informative units via an attention mechanism. Automatic speech recognition (ASR) and punctuation restoration are used to transcribe and segment the data. Using the DementiaBank database of people with AD as well as healthy controls (HC), we obtain F-scores of 84.43% and 74.37% when using manual and automatic transcripts, respectively. We further explore the effect of adding additional data (a total of 33 descriptions collected using a ‘digital doctor’) during model training, and increase the F-score when using ASR transcripts to 76.09%. This outperforms baseline models, including a bidirectional LSTM and a bidirectional hierarchical neural network without an attention mechanism, and demonstrates that the use of hierarchical models with an attention mechanism improves AD/HC discrimination performance.
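The core idea of the hierarchical attention model can be sketched in a few lines: words are pooled into sentence vectors by a word-level attention, and sentence vectors are pooled into a single description vector by a sentence-level attention, which then feeds a binary AD/HC classifier. The sketch below is a minimal NumPy illustration of that pooling structure only; it uses random toy vectors in place of the paper's BiLSTM encoders and learned parameters, and all variable names (`w_word`, `w_sent`, `w_clf`) are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, w):
    # H: (T, d) encoder states; w: (d,) learned attention context vector.
    # Returns the attention-weighted sum of H and the attention weights.
    scores = softmax(np.tanh(H) @ w)          # (T,)
    return scores @ H, scores                 # (d,), (T,)

rng = np.random.default_rng(0)
d = 8
# Toy "description": 3 sentences of 4, 5 and 3 word vectors each
# (stand-ins for BiLSTM word-level hidden states).
sentences = [rng.standard_normal((t, d)) for t in (4, 5, 3)]
w_word = rng.standard_normal(d)               # word-level attention vector
w_sent = rng.standard_normal(d)               # sentence-level attention vector

# Word-level attention: one vector per sentence.
sent_vecs = np.stack([attention_pool(S, w_word)[0] for S in sentences])
# Sentence-level attention: one vector for the whole description.
doc_vec, sent_weights = attention_pool(sent_vecs, w_sent)

# Linear AD/HC classifier (logistic) on the description vector.
w_clf = rng.standard_normal(d)
p_ad = 1.0 / (1.0 + np.exp(-(doc_vec @ w_clf)))
```

The attention weights (`sent_weights` here) are what make the model's "informative units" inspectable: sentences that contribute most to the decision receive the largest weights.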


DOI: 10.21437/Interspeech.2019-1799

Cite as: Pan, Y., Mirheidari, B., Reuber, M., Venneri, A., Blackburn, D., Christensen, H. (2019) Automatic Hierarchical Attention Neural Network for Detecting AD. Proc. Interspeech 2019, 4105-4109, DOI: 10.21437/Interspeech.2019-1799.


@inproceedings{Pan2019,
  author={Yilin Pan and Bahman Mirheidari and Markus Reuber and Annalena Venneri and Daniel Blackburn and Heidi Christensen},
  title={{Automatic Hierarchical Attention Neural Network for Detecting AD}},
  year=2019,
  booktitle={Proc. Interspeech 2019},
  pages={4105--4109},
  doi={10.21437/Interspeech.2019-1799},
  url={http://dx.doi.org/10.21437/Interspeech.2019-1799}
}