1. Introduction
2. Optimization Problems
3. Average-Case Complexity
4. Randomness
5. Distribution
6. SGD Example
7. SGD Worst Case
8. SGD Congruence Theorem
9. Stepsize Criticality
10. Average-Case Complexity
11. Stochastic Momentum
12. Stochastic Heavy Ball
13. Momentum Parameters
14. Dimension-Dependent Momentum
15. Thank You
16. Logistic Regression
17. Average-Case Analysis
Description:
Explore the intricacies of Stochastic Gradient Descent (SGD) in a comprehensive lecture delivered by Courtney Paquette from McGill University at the Machine Learning Advances and Applications Seminar. Delve into average-case analysis, asymptotics, and stepsize criticality as key components of SGD. Examine optimization problems, average-case complexity, and the role of randomness and distribution in SGD. Investigate the SGD congruence theorem and its implications for worst-case scenarios. Uncover the nuances of stepsize criticality and its impact on average-case complexity. Learn about stochastic momentum, the stochastic heavy ball method, and the significance of momentum parameters. Discover dimension-dependent momentum and its applications in logistic regression. Gain valuable insights into the average-case analysis of SGD and its relevance in machine learning applications.

SGD in the Large - Average-Case Analysis, Asymptotics, and Stepsize Criticality

Fields Institute