Auditory-Visual Speech Processing (AVSP) 2011
We describe a novel framework to detect ball hits in a tennis game by combining audio and visual information. Ball-hit detection is a key step in understanding a game such as tennis, but single-mode approaches are not very successful: audio detection suffers from interfering noise and acoustic mismatch, while video detection is made difficult by the small size of the ball and the complex background of the surrounding environment. Our goal in this paper is to improve detection performance by focusing on high-level information (rather than low-level features), including the detected audio events, the ball's trajectory, and inter-event timing information. Visual information supplies coarse detection of ball-hit events, which is used as a constraint for audio detection. In addition, useful gains in detection performance can be obtained by using inter-ball-hit timing information, which aids prediction of the next ball hit. This method seems to be very effective in reducing the interference present in low-level features. After applying this method to a women's doubles tennis game, we obtained improvements in F-score of about 30% (absolute) for audio detection and about 10% for video detection.
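The fusion idea in the abstract, using coarse video-derived windows and an inter-hit timing prior to filter audio detections, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name, the window representation, and the tolerance value are all assumptions.

```python
def fuse_hits(audio_times, video_windows, expected_interval, tol=0.15):
    """Keep audio hit candidates (seconds) that are consistent with
    coarse video-derived windows or with the inter-hit timing prior.

    Illustrative sketch only: all names and thresholds are assumptions.
    """
    accepted = []
    last_hit = None
    for t in sorted(audio_times):
        # Constraint 1: the candidate falls inside a coarse video window.
        in_video = any(start <= t <= end for start, end in video_windows)
        # Constraint 2: the candidate is near the time predicted by
        # adding the expected inter-hit interval to the last accepted hit.
        near_predicted = (last_hit is not None and
                          abs(t - (last_hit + expected_interval)) <= tol)
        if in_video or near_predicted:
            accepted.append(t)
            last_hit = t
    return accepted
```

For example, with audio candidates at 1.0, 1.4, 2.2, and 3.5 s, video windows (0.9, 1.1) and (2.1, 2.3), and an expected interval of 1.2 s, the spurious 1.4 s candidate is rejected while 3.5 s survives via the timing prior.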
Index Terms. Scene analysis, multimodal information integration
Bibliographic reference. Huang, Qiang / Cox, Stephen / Yan, Fei / Campos, Teo de / Windridge, David / Kittler, Josef / Christmas, William (2011): "Improved detection of ball hit events in a tennis game using multimodal information", In AVSP-2011, 127-130.