ISCA Archive Interspeech 2021

End-to-End Neural Diarization: From Transformer to Conformer

Yi Chieh Liu, Eunjung Han, Chul Lee, Andreas Stolcke

We propose a new end-to-end neural diarization (EEND) system that is based on Conformer, a recently proposed neural architecture that combines convolutional mappings and Transformer to model both local and global dependencies in speech. We first show that data augmentation and convolutional subsampling layers enhance the original self-attentive EEND in the Transformer-based EEND, and then Conformer gives an additional gain over the Transformer-based EEND. However, we notice that the Conformer-based EEND does not generalize as well from simulated to real conversation data as the Transformer-based model. This leads us to quantify the mismatch between simulated data and real speaker behavior in terms of temporal statistics reflecting turn-taking between speakers, and investigate its correlation with diarization error. By mixing simulated and real data in EEND training, we mitigate the mismatch further, with Conformer-based EEND achieving 24% error reduction over the baseline SA-EEND system, and 10% improvement over the best augmented Transformer-based system, on two-speaker CALLHOME data.
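The EEND family of systems mentioned above is trained with a permutation-free objective: the network emits per-frame speech-activity probabilities for each speaker slot, and the frame-wise binary cross-entropy is minimized over all assignments of output slots to reference speakers. As a rough illustration (not code from this paper; a minimal sketch of the standard permutation-invariant EEND loss, with hypothetical helper names `bce` and `pit_loss`):

```python
# Sketch of the permutation-invariant training (PIT) loss used by EEND-style
# diarization systems: score every speaker-slot permutation, keep the best.
import itertools
import math


def bce(p, y, eps=1e-7):
    """Mean binary cross-entropy between one predicted activity track p
    (probabilities in [0, 1]) and its 0/1 reference track y."""
    return -sum(
        yi * math.log(pi + eps) + (1 - yi) * math.log(1 - pi + eps)
        for pi, yi in zip(p, y)
    ) / len(p)


def pit_loss(probs, labels):
    """probs, labels: lists of per-speaker activity sequences.
    Returns the minimum average BCE over all speaker permutations,
    so the loss is invariant to the order of the output slots."""
    n = len(labels)
    best = float("inf")
    for perm in itertools.permutations(range(n)):
        loss = sum(bce(probs[s], labels[perm[s]]) for s in range(n)) / n
        best = min(best, loss)
    return best
```

For two speakers this evaluates just two assignments; the loss stays low even when the model's speaker slots come out in the opposite order from the reference labels.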


doi: 10.21437/Interspeech.2021-1909

Cite as: Liu, Y.C., Han, E., Lee, C., Stolcke, A. (2021) End-to-End Neural Diarization: From Transformer to Conformer. Proc. Interspeech 2021, 3081-3085, doi: 10.21437/Interspeech.2021-1909

@inproceedings{liu21j_interspeech,
  author={Yi Chieh Liu and Eunjung Han and Chul Lee and Andreas Stolcke},
  title={{End-to-End Neural Diarization: From Transformer to Conformer}},
  year={2021},
  booktitle={Proc. Interspeech 2021},
  pages={3081--3085},
  doi={10.21437/Interspeech.2021-1909}
}