Learn how AdaBoost, a powerful ensemble method for machine learning, combines weak learners into a stronger model in this 18-minute educational video. Explore the fundamental concepts of bagging and boosting before diving into AdaBoost's core mechanics: rescaling the weights of misclassified examples, combining learners effectively, and weighted voting. Understand the relationships between probability, odds, and logit functions in the context of ensemble learning. Follow along with hands-on examples in the provided codelab, which demonstrates practical implementations of random forests and AdaBoost. Additional learning materials are available in the accompanying GitHub repository and the Grokking Machine Learning book.
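The ideas in the description can be sketched in a few lines of Python. This is a minimal illustration, not the video's actual codelab: it assumes scikit-learn is available, uses a synthetic dataset, and the variable names are invented for this example. It contrasts a bagging-style ensemble (random forest) with a boosting ensemble (AdaBoost), and shows how a learner's vote weight relates to the log-odds (logit) of its accuracy:

```python
# Illustrative sketch only (assumes scikit-learn; not the video's codelab).
import math

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Bagging: each tree trains on a bootstrap sample; trees vote equally.
forest = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)

# Boosting: each round increases the weight of the examples the previous
# learner got wrong, so later learners focus on the hard cases.
boost = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)

# Weighted voting: a common weighting (used in the Grokking ML treatment)
# gives each learner a vote weight equal to the logit of its accuracy,
# i.e. ln(accuracy / (1 - accuracy)) = ln(odds of being correct).
accuracy = 0.8
vote_weight = math.log(accuracy / (1 - accuracy))
print(round(vote_weight, 3))  # ln(4) ≈ 1.386
```

Note how the vote weight behaves: a learner at 50% accuracy gets weight 0 (its vote is ignored), better-than-chance learners get positive weight, and worse-than-chance learners get negative weight, so their votes are usefully inverted.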
Understanding AdaBoost - Ensemble Learning with Weighted Classifiers