1. Awesome song and introduction
2. The initial prediction
3. Building an XGBoost Tree for regression
4. Calculating Similarity Scores
5. Calculating Gain to evaluate different thresholds
6. Pruning an XGBoost Tree
7. Building an XGBoost Tree with regularization
8. Calculating output values for an XGBoost Tree
9. Making predictions with XGBoost
10. Summary of concepts and main ideas
11. I say "66", but I meant to say "62.48". However, either way, the conclusion is the same.
12. In the original XGBoost documents they use the epsilon symbol to refer to the learning rate, but in the actual implementation, this is controlled via the "eta" parameter. So, I guess to be consistent… (see the parameter sketch after this list)
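To connect note 12 to the library itself, here is a minimal sketch, assuming the xgboost Python package and NumPy are installed, of where the parameters discussed in the video show up when training a regression model. The dosage and effectiveness numbers are made up purely for illustration.

```python
import numpy as np
import xgboost as xgb

X = np.array([[10.0], [20.0], [25.0], [35.0]])  # made-up dosages (feature)
y = np.array([-10.0, 7.0, 8.0, -7.0])           # made-up effectiveness (target)

params = {
    "objective": "reg:squarederror",  # plain regression, as in this video
    "eta": 0.3,         # the learning rate (epsilon in the original paper)
    "lambda": 1.0,      # regularization term used in the Similarity Score
    "gamma": 0.0,       # pruning threshold applied to Gain
    "max_depth": 2,
}

dtrain = xgb.DMatrix(X, label=y)
model = xgb.train(params, dtrain, num_boost_round=10)
print(model.predict(xgb.DMatrix(X)))
```

In the scikit-learn wrapper (xgboost.XGBRegressor), the same setting is exposed as learning_rate, which is why both names appear in practice.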
Description:
Dive into the first part of a four-part video series on XGBoost, focusing on its application to regression problems. Learn about the unique regression trees used in XGBoost, including initial predictions, tree building, similarity score calculations, gain evaluation for thresholds, tree pruning, regularization, output value calculations, and making predictions. Explore these concepts through clear explanations and visual aids, building on prior knowledge of Gradient Boost for Regression and Regularization. Gain a comprehensive understanding of XGBoost's approach to regression, preparing you for more advanced topics in subsequent videos.
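As a companion to the concepts listed in the description, here is a small worked sketch of the quantities the video derives for squared-error regression: the Similarity Score, the Gain used to compare thresholds, the gamma-based pruning check, a leaf's Output Value, and the prediction update. The residuals below are assumed values for illustration, not the video's example data; 0.5 is XGBoost's default initial prediction.

```python
residuals = [-10.5, 6.5, 7.5, -7.5]  # residuals (observed - predicted) reaching a node
lam = 1.0     # lambda, the regularization parameter
gamma = 0.0   # gamma, the threshold used when pruning
eta = 0.3     # learning rate ("eta")

def similarity(res):
    # Similarity Score = (sum of residuals)^2 / (number of residuals + lambda)
    return sum(res) ** 2 / (len(res) + lam)

# Gain for one candidate threshold: left leaf + right leaf vs. the unsplit node.
left, right = residuals[:1], residuals[1:]
gain = similarity(left) + similarity(right) - similarity(residuals)

# Pruning: remove the split if Gain minus gamma is negative.
prune = (gain - gamma) < 0

# Output Value of a leaf = sum of residuals / (number of residuals + lambda)
output_value = sum(left) / (len(left) + lam)

# New prediction = previous prediction + eta * Output Value
new_prediction = 0.5 + eta * output_value

print(f"Gain={gain:.2f}, prune={prune}, "
      f"output={output_value:.2f}, prediction={new_prediction:.2f}")
```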

XGBoost Part 1 - Regression

StatQuest with Josh Starmer