In this paper, we extend our previous analysis of Gaussian Mixture Model (GMM) subspace compensation techniques using Gaussian modeling in the supervector space combined with additive channel and observation noise. We show that, under the modeling assumptions of a total-variability i-vector system, full Gaussian supervector scoring can also be performed cheaply in the total subspace, and that i-vector scoring can be viewed as an approximation to this. Next, we show that covariance matrix estimation in the i-vector space can be used to generate PCA estimates of the supervector covariance matrices needed for Joint Factor Analysis (JFA). Finally, we derive a new technique for reduced-dimension i-vector extraction, which we call Supervector LDA (SV-LDA), and demonstrate a 100-dimensional i-vector language recognition system with performance equivalent to a 600-dimensional version at much lower complexity.
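The reduced-dimension extraction above rests on linear discriminant analysis applied to i-vectors with language labels as classes. The following is a minimal illustrative sketch of classical LDA dimension reduction on toy "i-vectors" (not the paper's SV-LDA derivation, which operates in the supervector space); the dimensions, class counts, and synthetic data are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for labeled i-vectors: 3 "languages", 20-dim vectors.
# (The paper reduces 600-dim i-vectors to 100; sizes here are illustrative.)
n_classes, n_per_class, dim = 3, 50, 20
class_means = rng.normal(scale=2.0, size=(n_classes, dim))
X = np.vstack([m + rng.normal(size=(n_per_class, dim)) for m in class_means])
y = np.repeat(np.arange(n_classes), n_per_class)

def lda_projection(X, y, out_dim):
    """Classical LDA: maximize between-class over within-class scatter."""
    mu = X.mean(axis=0)
    Sw = np.zeros((X.shape[1], X.shape[1]))  # within-class scatter
    Sb = np.zeros_like(Sw)                   # between-class scatter
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        d = (mc - mu)[:, None]
        Sb += len(Xc) * (d @ d.T)
    # Solve Sw^{-1} Sb v = lambda v and keep the leading eigenvectors.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(evals.real)[::-1]
    return evecs.real[:, order[:out_dim]]

# At most (n_classes - 1) discriminant directions are meaningful.
V = lda_projection(X, y, out_dim=n_classes - 1)
Z = X @ V  # reduced-dimension vectors used for scoring
```

Scoring then proceeds in the reduced space, which is where the complexity saving of a low-dimensional system comes from.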
Index Terms: language recognition, Gaussian mixture model, Wiener filter, factor analysis, i-vector, LDA
Bibliographic reference: McCree, Alan / Borgström, Bengt (2012): "Supervector LDA: a new approach to reduced-complexity i-vector language recognition", in Proc. INTERSPEECH 2012, pp. 46-49.