The Dynamics of AdaBoost
This video was recorded at the Machine Learning Summer School (MLSS), Chicago 2005. One of the most successful and popular learning algorithms is AdaBoost, a classification algorithm designed to construct a "strong" classifier from a "weak" learning algorithm. Soon after AdaBoost's development nine years ago, scientists derived margin-based generalization bounds to explain its unexpectedly good performance. Their result predicts that AdaBoost yields the best possible performance if it always achieves a "maximum margin" solution. Yet does AdaBoost achieve a maximum margin solution? Empirical and theoretical studies conducted since then conjectured the answer to be "yes". To answer this question, we look toward AdaBoost's dynamics. We simplify AdaBoost to reveal a nonlinear iterated map, and we then analyze the convergence of AdaBoost in cases where cyclic behavior is found; this cyclic behavior provides the key to answering the question of whether AdaBoost always maximizes the margin. As it turns out, the answer is the opposite of what was thought to be true! In this talk, I will introduce AdaBoost, describe our analysis of AdaBoost viewed as a dynamical system, briefly mention a new boosting algorithm that always maximizes the margin with a fast convergence rate, and, if time permits, reveal a surprising new result about AdaBoost and the problem of bipartite ranking.
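The "nonlinear iterated map" in the abstract refers to AdaBoost's per-round reweighting of the training examples. Below is a minimal illustrative sketch in Python, not the speaker's implementation: the decision-stump weak learner, the function names, and the synthetic data are all assumptions added here for concreteness. The weight vector `w` is the state that the map updates each round, and the last lines report the minimum normalized margin, the quantity the abstract's question is about.

```python
# Minimal AdaBoost sketch with decision-stump weak learners (illustrative only).
# The update of the example-weight vector w each round is the nonlinear
# iterated map discussed in the talk.
import numpy as np

def train_adaboost(X, y, n_rounds=20):
    """AdaBoost on labels y in {-1, +1} with axis-aligned decision stumps."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)  # example weights: the state of the iterated map
    stumps, alphas = [], []
    for _ in range(n_rounds):
        # Weak learner: stump (feature j, threshold thr, sign s) with the
        # lowest weighted error under the current weights w.
        best = None
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = s * np.where(X[:, j] > thr, 1, -1)
                    err = w @ (pred != y)
                    if best is None or err < best[0]:
                        best = (err, j, thr, s)
        err, j, thr, s = best
        if err >= 0.5:  # weak-learning assumption violated; stop
            break
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        pred = s * np.where(X[:, j] > thr, 1, -1)
        # The iterated map: exponentially reweight, then renormalize.
        w = w * np.exp(-alpha * y * pred)
        w /= w.sum()
        stumps.append((j, thr, s))
        alphas.append(alpha)
    return stumps, alphas

def predict(X, stumps, alphas):
    """Sign of the weighted vote of the learned stumps."""
    score = sum(a * s * np.where(X[:, j] > thr, 1, -1)
                for (j, thr, s), a in zip(stumps, alphas))
    return np.sign(score)

# Usage on synthetic data (assumed here purely for demonstration).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
stumps, alphas = train_adaboost(X, y)
print("train accuracy:", (predict(X, stumps, alphas) == y).mean())

# Normalized margin of each example: y_i * F(x_i) / sum of alphas.
margins = y * sum(a * s * np.where(X[:, j] > thr, 1, -1)
                  for (j, thr, s), a in zip(stumps, alphas)) / sum(alphas)
print("minimum margin:", margins.min())
```

Tracking `w` across rounds on small datasets is one way to observe empirically the cyclic behavior the abstract mentions.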