Chapters:
1. Intro
2. Overview
3. Initial Prediction
4. Similarity Scores
5. Building a Tree
6. Similarity Score
7. Gain
8. Cover
9. Cover for Classification
10. Pruning
11. Classification
12. Logistic Regression
13. Summary
Description:
Dive into the second part of a four-part video series on XGBoost, focusing on classification techniques. Learn how XGBoost trees are constructed for classification problems, building upon the regression concepts covered in part one. Explore key topics such as initial predictions, similarity scores, tree building, gain calculation, cover for classification, pruning, and the application of logistic regression. Gain a deeper understanding of how XGBoost adapts its algorithms for classification tasks, assuming prior knowledge of XGBoost trees for regression, gradient boost for classification, odds and log-odds, and the logistic function.
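As a companion to the topics listed above, here is a minimal sketch (not code from the video) of the core quantities XGBoost uses when building a classification tree: the similarity score, gain, and cover, computed from residuals and previously predicted probabilities. The data values and function names below are illustrative assumptions, but the formulas follow the standard XGBoost classification definitions, with the initial prediction taken as a probability of 0.5 (log-odds 0).

```python
# Sketch of XGBoost classification tree quantities, assuming:
#   similarity score = (sum of residuals)^2 / (sum of p*(1-p) + lambda)
#   gain             = left similarity + right similarity - root similarity
#   cover            = sum of p*(1-p)  (the denominator without lambda)
# The toy data below is made up for illustration.

def similarity(residuals, prev_probs, lam=0.0):
    """Similarity score for a leaf in an XGBoost classification tree."""
    numerator = sum(residuals) ** 2
    denominator = sum(p * (1 - p) for p in prev_probs) + lam
    return numerator / denominator

def cover(prev_probs):
    """Cover for a classification leaf: sum of p*(1-p)."""
    return sum(p * (1 - p) for p in prev_probs)

# Initial prediction is probability 0.5 (log-odds 0),
# so each residual is (observed label - 0.5).
labels = [1, 1, 0, 1]
prev_probs = [0.5] * len(labels)
residuals = [y - p for y, p in zip(labels, prev_probs)]

root_sim = similarity(residuals, prev_probs, lam=0.0)

# Candidate split: first two observations go left, the rest go right.
left_sim = similarity(residuals[:2], prev_probs[:2], lam=0.0)
right_sim = similarity(residuals[2:], prev_probs[2:], lam=0.0)
gain = left_sim + right_sim - root_sim

print(f"root similarity = {root_sim:.2f}")  # (0.5+0.5-0.5+0.5)^2 / 1.0 = 1.00
print(f"gain for split  = {gain:.2f}")      # 2.0 + 0.0 - 1.0 = 1.00
print(f"root cover      = {cover(prev_probs):.2f}")  # 4 * 0.25 = 1.00
```

Candidate splits with larger gain are preferred, and leaves whose cover falls below the minimum child weight are not kept; the video walks through these steps in detail.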

XGBoost Part 2 - Classification

StatQuest with Josh Starmer