EUROSPEECH 2003 - INTERSPEECH 2003
8th European Conference on Speech Communication and Technology

Geneva, Switzerland
September 1-4, 2003


Robust Multi-Class Boosting

Gunnar Rätsch

Fraunhofer FIRST, Germany

Boosting approaches are based on the idea that a high-quality learning algorithm can be formed by repeated application of a "weak learner", which is only required to perform slightly better than random guessing. It is known that Boosting can lead to drastic improvements over the individual weak learner. For two-class problems it has been shown that the original Boosting algorithm, AdaBoost, is quite resistant to overfitting. However, it is also understood that on noisy data AdaBoost can be improved considerably by introducing a regularization technique. Speech-related problems are often multi-class problems, and Boosting formulations have been used successfully to solve them. I review existing multi-class Boosting algorithms, which have been analyzed and explored much less than their two-class counterparts. In this work I extend these methods to derive new Boosting algorithms that are more robust against outliers and noise in the data and are able to exploit prior knowledge about relationships between the classes.
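To make the weak-learner iteration described above concrete, the following is a minimal sketch of standard two-class AdaBoost with decision stumps in plain NumPy. It is not the paper's robust multi-class method, only the AdaBoost baseline the abstract builds on; all function and variable names (fit_adaboost, fit_stump, and so on) are illustrative assumptions.

    # Minimal sketch of two-class AdaBoost with decision stumps (NumPy only).
    # Illustrative baseline, not the regularized multi-class algorithm of the paper.
    import numpy as np

    def stump_predict(X, feature, threshold, polarity):
        """Predict {-1, +1} by thresholding a single feature."""
        return polarity * np.where(X[:, feature] <= threshold, 1.0, -1.0)

    def fit_stump(X, y, w):
        """Exhaustively pick the stump with the lowest weighted error."""
        best, best_err = None, np.inf
        for feature in range(X.shape[1]):
            for threshold in np.unique(X[:, feature]):
                for polarity in (1.0, -1.0):
                    pred = stump_predict(X, feature, threshold, polarity)
                    err = np.sum(w * (pred != y))
                    if err < best_err:
                        best, best_err = (feature, threshold, polarity), err
        return best, best_err

    def fit_adaboost(X, y, n_rounds=50):
        """y must be in {-1, +1}. Returns a list of (alpha, stump) pairs."""
        n = len(y)
        w = np.full(n, 1.0 / n)              # uniform example weights
        ensemble = []
        for _ in range(n_rounds):
            stump, err = fit_stump(X, y, w)
            err = max(err, 1e-10)            # guard against division by zero
            alpha = 0.5 * np.log((1.0 - err) / err)  # weight of this weak learner
            pred = stump_predict(X, *stump)
            w *= np.exp(-alpha * y * pred)   # up-weight misclassified examples
            w /= w.sum()                     # renormalize to a distribution
            ensemble.append((alpha, stump))
        return ensemble

    def predict(ensemble, X):
        """Sign of the weighted vote of all weak learners."""
        score = sum(a * stump_predict(X, *s) for a, s in ensemble)
        return np.sign(score)

The exponential re-weighting step is what makes plain AdaBoost sensitive to outliers and label noise: persistently misclassified examples accumulate very large weights, which is the behavior the regularized variants discussed in the paper are designed to temper.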


Bibliographic reference. Rätsch, Gunnar (2003): "Robust multi-class boosting", in EUROSPEECH-2003, 997-1000.