Machine Listening in Multisource Environments (CHiME) 2011
We consider Gaussian mixture model (GMM)-based classification from noisy features, where the uncertainty over each feature is represented by a Gaussian distribution. To that end, we first propose a new GMM training and decoding criterion called log-likelihood integration which, unlike the conventional likelihood integration criterion, does not rely on any assumption regarding the distribution of the data. Secondly, we introduce two new Expectation Maximization (EM) algorithms, one for each criterion, which make it possible to learn GMMs directly from noisy features. We then evaluate and compare the behaviors of the two proposed algorithms on a categorization task over artificial data and over speech data corrupted by artificial additive noise, assuming the uncertainty parameters are known.
Experiments demonstrate the superiority of the likelihood integration criterion combined with the newly proposed EM learning in all tested configurations, thus giving rise to a new family of learning approaches that are insensitive to the heterogeneity of the noise characteristics between testing and training data.
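To make the likelihood integration criterion concrete, the sketch below uses the standard closed-form result for diagonal-covariance GMMs: when a noisy feature is modeled as a Gaussian with mean x and uncertainty variance s, integrating the GMM likelihood over that Gaussian simply inflates each component's variance by the uncertainty. This is a minimal illustration of the conventional criterion the abstract contrasts against, not the paper's own implementation; all function and variable names are illustrative.

```python
import numpy as np

def gaussian_pdf(x, mean, var):
    """Per-dimension Gaussian density N(x; mean, var)."""
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

def likelihood_integration(x, s, weights, means, variances):
    """Likelihood integration for a diagonal-covariance GMM:
        p(x) = sum_k w_k * N(x; mu_k, Sigma_k + S)

    x, s             : (D,) noisy feature mean and its uncertainty variance
    weights          : (K,) mixture weights
    means, variances : (K, D) component parameters
    """
    inflated = variances + s                    # uncertainty inflates each component's variance
    per_dim = gaussian_pdf(x, means, inflated)  # (K, D) per-dimension densities
    return float(np.sum(weights * np.prod(per_dim, axis=1)))
```

With zero uncertainty (s = 0) this reduces to the ordinary GMM likelihood; as the uncertainty grows, the effective densities flatten, which is why mismatched uncertainty between training and testing data matters.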
Index Terms. Uncertainty-based classification, Gaussian mixture model, expectation maximization algorithm
Bibliographic reference. Ozerov, Alexey / Lagrange, Mathieu / Vincent, Emmanuel (2011): "GMM-based classification from noisy features", In CHiME-2011, 30-35.