1 - Introduction & Overview
2 - The Overwhelming Amount of Optimizers
3 - Compared Optimizers
4 - Default Parameters & Tuning Distribution
5 - Deep Learning Problems Considered
6 - Tuning on Single Seeds
7 - Results & Interpretation
8 - Learning Rate Schedules & Noise
9 - Conclusions & Comments
Description:
Explore a comprehensive analysis of deep learning optimizers in this 41-minute video lecture. Dive into the complex world of optimization algorithms for neural networks, comparing 14 popular methods in a standardized benchmark. Learn about the challenges of selecting the right optimizer, the importance of hyperparameter tuning, and the impact of different algorithms on various deep learning tasks. Gain insights into learning rate schedules, noise effects, and practical recommendations for choosing optimizers. Understand the key findings of the study, including the variability of optimizer performance across tasks and the effectiveness of evaluating multiple optimizers with default parameters. Discover a reduced subset of competitive algorithms that can guide your future deep learning projects.
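One practical takeaway highlighted in the description is that trying several optimizers at their default hyperparameters can be a surprisingly strong baseline. The following is a minimal sketch of that idea, assuming PyTorch; it is a toy illustration, not the benchmark suite used in the paper, and the optimizer defaults shown are torch.optim's library defaults.

```python
# Minimal sketch (not the paper's benchmark code): compare a few popular
# optimizers at library-default hyperparameters on a toy regression task.
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(512, 10)
y = X @ torch.randn(10, 1) + 0.1 * torch.randn(512, 1)

def make_model():
    return nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))

# SGD has no default learning rate in torch.optim, so one is supplied;
# the adaptive methods below are left entirely at their defaults.
optimizers = {
    "SGD":      lambda p: torch.optim.SGD(p, lr=0.01),
    "Momentum": lambda p: torch.optim.SGD(p, lr=0.01, momentum=0.9),
    "Adam":     lambda p: torch.optim.Adam(p),
    "RMSprop":  lambda p: torch.optim.RMSprop(p),
    "AdamW":    lambda p: torch.optim.AdamW(p),
}

loss_fn = nn.MSELoss()
for name, make_opt in optimizers.items():
    model = make_model()
    opt = make_opt(model.parameters())
    for _ in range(200):
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()
    print(f"{name:>8}: final training loss = {loss.item():.4f}")
```

In the same spirit as the study, a fairer comparison would repeat this over multiple random seeds and tasks, since optimizer rankings vary considerably across problems.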

Descending Through a Crowded Valley - Benchmarking Deep Learning Optimizers

Yannic Kilcher