Interspeech'2005 - Eurospeech
A new method for tracking the outer lip contour of individual speakers in real-world conditions is presented. For an arbitrary speaker, lip color properties are learned from the current image frame for a Bayes decision, using the nose tip location as a reference point. The estimated outer contour data are fitted to an ellipsoid to further suppress the effect of outliers in the contour. The algorithm, posed as a real-time solution to lip contour tracking in real-world conditions, is made efficient by the use of an online learning method. Demonstrations of lip contour tracking and of its application to mouth movement imitation are presented.
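The Bayes decision on lip color described above can be illustrated with a minimal sketch: two Gaussian color models (lip vs. surrounding skin) are estimated from sample pixels, and each pixel is labeled by comparing class posteriors. All function names, the synthetic colors, and the Gaussian modeling choice are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def fit_gaussian(samples):
    """Mean and covariance of RGB color samples, shape (N, 3).
    Assumption: each color class is modeled as a single Gaussian."""
    mu = samples.mean(axis=0)
    cov = np.cov(samples, rowvar=False) + 1e-6 * np.eye(3)  # regularized
    return mu, cov

def log_likelihood(pixels, mu, cov):
    """Log of the multivariate normal density for each pixel row."""
    d = pixels - mu
    inv = np.linalg.inv(cov)
    maha = np.einsum('ij,jk,ik->i', d, inv, d)  # Mahalanobis distances
    return -0.5 * (maha + np.log(np.linalg.det(cov)) + 3 * np.log(2 * np.pi))

def classify_lip(pixels, lip_model, skin_model, prior_lip=0.5):
    """Bayes decision: label a pixel 'lip' when its posterior wins."""
    ll_lip = log_likelihood(pixels, *lip_model) + np.log(prior_lip)
    ll_skin = log_likelihood(pixels, *skin_model) + np.log(1 - prior_lip)
    return ll_lip > ll_skin

rng = np.random.default_rng(0)
# Synthetic training colors standing in for regions sampled relative to
# the nose tip reference: reddish lip pixels vs. lighter skin pixels.
lip_train = rng.normal([150, 60, 70], 10, size=(200, 3))
skin_train = rng.normal([200, 160, 140], 10, size=(200, 3))
lip_model = fit_gaussian(lip_train)
skin_model = fit_gaussian(skin_train)

# Classify a mixed batch of held-out pixels.
test = np.vstack([rng.normal([150, 60, 70], 10, size=(50, 3)),
                  rng.normal([200, 160, 140], 10, size=(50, 3))])
labels = classify_lip(test, lip_model, skin_model)
```

In an online setting, the per-class mean and covariance would be re-estimated (or incrementally updated) on each new frame, which keeps the decision adapted to the current speaker and lighting.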
Bibliographic reference. Gurbuz, Sabri (2005): "Real-time outer lip contour tracking for HCI applications", In INTERSPEECH-2005, 1217-1220.