Chapters:
1. Intro
2. What is optimization
3. Generalization
4. First Order Optimization
5. Training of infinitely wide deep nets
6. Neural Tangent Kernel (NTK)
7. Neural Tangent Kernel Details
8. Kernel Linear Regression
9. Matrix Completion
10. Matrix Inflation
11. Deep Linear Net
12. Great in the Sense
13. Learning Rates
14. Formal Statements
15. Connectivity
16. Conclusions
Description:
Explore a thought-provoking lecture on deep learning theory delivered by Princeton University's Sanjeev Arora at the Institute for Advanced Study. Delve into the question of whether optimization is the most appropriate framework for understanding deep learning. Examine key concepts including generalization, first-order optimization, and the Neural Tangent Kernel (NTK). Investigate the training of infinitely wide deep nets, kernel linear regression, and matrix completion. Analyze deep linear networks, learning rates, and formal statements related to connectivity. Gain insights into the current state and future directions of deep learning theory through this comprehensive exploration of optimization's role in understanding neural networks.

Is Optimization the Right Language to Understand Deep Learning? - Sanjeev Arora

Institute for Advanced Study