7th International Conference on Spoken Language Processing

September 16-20, 2002
Denver, Colorado, USA

Accumulated Kullback Divergence for Analysis of ASR Performance in the Presence of Noise

Febe de Wet, Johan de Veth, Bert Cranen, Lou Boves

University of Nijmegen, The Netherlands

In this paper, the accumulated Kullback divergence (AKD) is used to analyze ASR performance deterioration due to the presence of background noise. The AKD represents a distance between the feature value distribution observed during training and the distribution of the observations in the noisy test condition, computed for each individual feature vector component. In our experiments, the AKD summed over all feature vector components shows a high correlation with word error rate, and the AKD computed per component can be used to pinpoint those feature vector components that contribute substantially to recognition errors. It is argued that this distance measure could be a useful evaluation tool for analyzing the strengths and weaknesses of existing noise robustness approaches, and that it might help to suggest research strategies focusing on those elements of the acoustic feature vector that are most severely affected by the noise.
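The abstract leaves open how the feature value distributions are estimated in practice. The following is a minimal sketch, not the authors' implementation: it assumes each feature vector component can be summarized by a univariate Gaussian in both the training and the noisy test condition, in which case the per-component Kullback divergence has a closed form and the AKD is its sum over all components. The function names (kl_divergence_gaussian, accumulated_kl) and the Gaussian assumption are illustrative choices, not taken from the paper.

    import numpy as np

    def kl_divergence_gaussian(mu_p, var_p, mu_q, var_q):
        # Closed-form KL divergence D(p || q) between univariate Gaussians
        # p = N(mu_p, var_p) and q = N(mu_q, var_q).
        return 0.5 * (np.log(var_q / var_p)
                      + (var_p + (mu_p - mu_q) ** 2) / var_q
                      - 1.0)

    def accumulated_kl(train_feats, test_feats):
        # train_feats, test_feats: (num_frames, num_components) arrays of
        # acoustic feature vectors from the training and the noisy test data.
        mu_tr, var_tr = train_feats.mean(axis=0), train_feats.var(axis=0)
        mu_te, var_te = test_feats.mean(axis=0), test_feats.var(axis=0)
        # Divergence per feature vector component (to pinpoint the most
        # noise-affected components) and its sum over all components.
        per_component = kl_divergence_gaussian(mu_tr, var_tr, mu_te, var_te)
        return per_component, per_component.sum()

Under these assumptions, the summed value plays the role of the AKD that the paper correlates with word error rate, while the per-component values identify the elements of the feature vector most affected by the noise.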


Bibliographic reference.  Wet, Febe de / Veth, Johan de / Cranen, Bert / Boves, Lou (2002): "Accumulated Kullback divergence for analysis of ASR performance in the presence of noise", In ICSLP-2002, 1069-1072.