Two-Dimensional Convolutional Recurrent Neural Networks for Speech Activity Detection

Anastasios Vafeiadis, Eleftherios Fanioudakis, Ilyas Potamitis, Konstantinos Votis, Dimitrios Giakoumis, Dimitrios Tzovaras, Liming Chen, Raouf Hamzaoui


Speech Activity Detection (SAD) plays an important role in mobile communications and automatic speech recognition (ASR). Developing efficient SAD systems for real-world applications is challenging due to the presence of noise. We propose a new approach to SAD that treats it as a two-dimensional multilabel image classification problem. We compute the short-time Fourier transform (STFT) spectrograms of the audio segments and classify them with a Convolutional Recurrent Neural Network (CRNN), an architecture traditionally used in image recognition. Our CRNN uses a sigmoid activation function, max-pooling in the frequency domain, and a convolutional operation as a moving average filter to remove misclassified spikes. On the development set of Task 1 of the 2019 Fearless Steps Challenge, our system achieved a decision cost function (DCF) of 2.89%, a 66.4% improvement over the baseline. Moreover, it achieved a DCF of 3.318% on the evaluation dataset of the challenge, ranking first among all submissions.
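As a rough illustration of the pipeline the abstract describes (and not the authors' released code), the three grounded ingredients are an STFT spectrogram front end, a CRNN with frequency-only max-pooling and a per-frame sigmoid output, and a moving-average convolution over the predictions. The sketch below, in Python with librosa and Keras, fills in everything else with illustrative assumptions: the 8 kHz sampling rate, STFT window and hop sizes, layer widths, pooling factors, GRU size, and smoothing width are all placeholders, not values from the paper.

import numpy as np
import librosa
from tensorflow.keras import layers, models

def stft_spectrogram(wav_path, sr=8000, n_fft=512, hop_length=128):
    # Log-magnitude STFT spectrogram, shape (freq_bins, frames).
    # sr, n_fft, and hop_length are illustrative assumptions.
    y, _ = librosa.load(wav_path, sr=sr)
    S = np.abs(librosa.stft(y, n_fft=n_fft, hop_length=hop_length))
    return librosa.amplitude_to_db(S, ref=np.max)

def build_crnn(freq_bins=257, frames=500):
    # Convolutional blocks pool only along the frequency axis, so the
    # time resolution of the frame-level predictions is preserved.
    inp = layers.Input(shape=(freq_bins, frames, 1))
    x, f = inp, freq_bins
    for n_filters in (32, 64):  # filter counts are assumptions
        x = layers.Conv2D(n_filters, (3, 3), padding="same",
                          activation="relu")(x)
        x = layers.MaxPooling2D(pool_size=(4, 1))(x)  # frequency-only pooling
        f //= 4
    # Fold the remaining frequency bins into the feature axis: (time, features).
    x = layers.Permute((2, 1, 3))(x)
    x = layers.Reshape((frames, f * 64))(x)
    # Recurrent layer models temporal context across STFT frames.
    x = layers.Bidirectional(layers.GRU(64, return_sequences=True))(x)
    # Per-frame sigmoid gives a speech/non-speech probability per frame.
    out = layers.TimeDistributed(layers.Dense(1, activation="sigmoid"))(x)
    return models.Model(inp, out)

def smooth_predictions(probs, width=11):
    # Moving-average filter implemented as a 1-D convolution with a
    # uniform kernel; suppresses isolated misclassified spikes in the
    # sigmoid outputs before they are thresholded. width is an assumption.
    kernel = np.ones(width) / width
    return np.convolve(probs, kernel, mode="same")

A model along these lines would be trained with binary cross-entropy against frame-level speech/non-speech labels; at inference, smooth_predictions would be applied to the flattened per-frame sigmoid outputs before thresholding, mirroring the moving-average post-processing step mentioned in the abstract.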


DOI: 10.21437/Interspeech.2019-1354

Cite as: Vafeiadis, A., Fanioudakis, E., Potamitis, I., Votis, K., Giakoumis, D., Tzovaras, D., Chen, L., Hamzaoui, R. (2019) Two-Dimensional Convolutional Recurrent Neural Networks for Speech Activity Detection. Proc. Interspeech 2019, 2045-2049, DOI: 10.21437/Interspeech.2019-1354.


@inproceedings{Vafeiadis2019,
  author={Anastasios Vafeiadis and Eleftherios Fanioudakis and Ilyas Potamitis and Konstantinos Votis and Dimitrios Giakoumis and Dimitrios Tzovaras and Liming Chen and Raouf Hamzaoui},
  title={{Two-Dimensional Convolutional Recurrent Neural Networks for Speech Activity Detection}},
  year=2019,
  booktitle={Proc. Interspeech 2019},
  pages={2045--2049},
  doi={10.21437/Interspeech.2019-1354},
  url={http://dx.doi.org/10.21437/Interspeech.2019-1354}
}