Statistics and Its Interface

Volume 2 (2009)

Number 3

Multi-class AdaBoost

Pages: 349–360

DOI: https://dx.doi.org/10.4310/SII.2009.v2.n3.a8

Authors

Trevor Hastie (Department of Statistics, Stanford University, Stanford, Calif., U.S.A.)

Saharon Rosset (Department of Statistics, Tel Aviv University, Tel Aviv, Israel)

Ji Zhu (Department of Statistics, University of Michigan, Ann Arbor, Mich., U.S.A.)

Hui Zou (School of Statistics, University of Minnesota, Minneapolis, Minn., U.S.A.)

Abstract

Boosting has been a very successful technique for solving the two-class classification problem. In going from two-class to multi-class classification, however, most algorithms reduce the multi-class problem to multiple two-class problems. In this paper, we develop a new algorithm that directly extends the AdaBoost algorithm to the multi-class case without reducing it to multiple two-class problems. We show that the proposed multi-class AdaBoost algorithm is equivalent to a forward stagewise additive modeling algorithm that minimizes a novel exponential loss for multi-class classification. Furthermore, we show that this exponential loss belongs to a class of Fisher-consistent loss functions for multi-class classification. The new algorithm is extremely easy to implement and is highly competitive in terms of misclassification error rate.
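The algorithm the abstract describes is called SAMME in the paper; its only change to two-class AdaBoost is an extra log(K - 1) term in the classifier weight, which corresponds to minimizing the paper's multi-class exponential loss exp(-(1/K) y.f) under the symmetric label coding y_k = 1 for the true class and -1/(K - 1) otherwise. As a concrete illustration, here is a minimal Python sketch, not the authors' code: it assumes integer class labels 0, ..., K - 1, uses scikit-learn decision stumps as the base learner, and the names samme and samme_predict are ours.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def samme(X, y, K, n_rounds=100):
    """Fit a SAMME-style multi-class AdaBoost ensemble (illustrative sketch)."""
    n = X.shape[0]
    w = np.full(n, 1.0 / n)                 # observation weights, sum to 1
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        miss = stump.predict(X) != y
        err = max(np.dot(w, miss), np.finfo(float).eps)  # weighted error, guarded
        if err >= 1.0 - 1.0 / K:            # no better than random K-class guessing
            break
        # The only departure from two-class AdaBoost: the log(K - 1) term,
        # which keeps alpha positive whenever err < 1 - 1/K.
        alpha = np.log((1.0 - err) / err) + np.log(K - 1.0)
        learners.append(stump)
        alphas.append(alpha)
        w *= np.exp(alpha * miss)           # up-weight misclassified points
        w /= w.sum()                        # renormalize
    return learners, alphas

def samme_predict(X, learners, alphas, K):
    """Predict by a weighted vote over the base learners."""
    votes = np.zeros((X.shape[0], K))
    for stump, alpha in zip(learners, alphas):
        votes[np.arange(X.shape[0]), stump.predict(X)] += alpha
    return votes.argmax(axis=1)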

Keywords

boosting, exponential loss, multi-class classification, stagewise modeling

2010 Mathematics Subject Classification

62H30

Published 1 January 2009