1. Intro
2. Overview
3. Greedy Algorithm Limitations
4. Approximate Greedy Algorithm
5. Weighted Quantile Sketch
6. What is a Weighted Quantile
7. Weighted Quantiles in Classification
8. Sparsity-Aware Split Finding
9. Cache-Aware Access
10. Out-of-Core Computation
11. Random Subsets
12. Summary
Description:
Explore advanced optimizations for XGBoost in this fourth and final video of the series. Dive into techniques for handling large training datasets, including the Approximate Greedy Algorithm, Parallel Learning, the Weighted Quantile Sketch, Sparsity-Aware Split Finding, Cache-Aware Access, and Blocks for Out-of-Core Computation. Learn step by step how XGBoost handles missing data by routing it along learned default paths, and how it optimizes performance for massive datasets. Gain insight into the practical applications of these advanced concepts in machine learning and data analysis.
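To make the Weighted Quantile Sketch idea concrete: instead of testing every possible split, XGBoost picks candidate thresholds so that each bucket between candidates holds roughly equal total weight (in XGBoost the weights are the hessians of the loss, which is why classification examples near 0.5 probability count for more). The sketch below is a minimal, stdlib-only illustration of that weighted-quantile idea, not the video's or XGBoost's actual streaming sketch algorithm; the function name and the exact bucketing rule are my own choices for illustration.

```python
def weighted_quantile_splits(values, weights, num_splits):
    """Illustrative weighted quantiles: choose split candidates so each
    bucket between them holds roughly equal total weight.
    (Not XGBoost's real sketch, which works on streams/blocks of data.)"""
    pairs = sorted(zip(values, weights))          # sort by feature value
    total = sum(w for _, w in pairs)
    step = total / (num_splits + 1)               # target weight per bucket
    splits, cum, target = [], 0.0, step
    for v, w in pairs:
        cum += w                                  # cumulative weight so far
        if cum >= target and len(splits) < num_splits:
            splits.append(v)                      # weight quantile crossed
            target += step
    return splits

# Uniform weights: behaves like ordinary quantiles.
print(weighted_quantile_splits([1, 2, 3, 4, 5, 6, 7, 8], [1] * 8, 3))
# → [2, 4, 6]

# A heavily weighted observation pulls a split candidate toward itself.
print(weighted_quantile_splits([1, 2, 3, 4, 5, 6, 7, 8],
                               [4, 1, 1, 1, 1, 1, 1, 1], 3))
# → [1, 3, 6]
```

The second call shows the key point from the video: when one observation carries high weight, the quantile boundaries shift so that it gets a bucket of its own, which is what makes weighted quantiles useful for classification.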

XGBoost Part 4 - Crazy Cool Optimizations

StatQuest with Josh Starmer