1. Introduction
2. What is Optimization & Generalization
3. Classical Machine Learning
4. Deep Learning
5. Content
6. Gradient Flow
7. End-to-End Dynamics
8. Conventional Approach
9. Implicit Preconditioning
10. Gradient Descent
11. Depth
12. Matrix Completion
13. Deep Matrix Factorization
14. Experiments
15. Dynamics of Singular Values
16. Matrix Completion Problem
17. Singular Value Dynamics
18. Recap
19. Nonlinearity
Description:
Explore the dynamics of gradient descent in deep learning in a lecture on optimization and generalization. The talk contrasts classical machine learning with deep learning, focusing on gradient flow and end-to-end dynamics, and examines implicit preconditioning, deep matrix factorization, and the dynamics of singular values in matrix completion problems. It closes with insights into the role of nonlinearity in deep learning optimization and generalization. Presented by Nadav Cohen of Tel-Aviv University as part of the "Learning and Testing in High Dimensions" series at the Simons Institute.

Analyzing Optimization and Generalization in Deep Learning via Dynamics of Gradient Descent

Simons Institute