1. Intro
2. The Problem
3. Generalization Gap
4. Predicting Generalization
5. What Makes Generalizations
6. Generalization Balance
7. Generalization Correlation
8. Experimental Design
9. High-Level Design
10. Overall Rank Correlation
11. Results
12. Hypothesis
13. Canonical Ordering
14. Path Norm
15. Flatness-Based Measures
16. Protection-Based Measures
17. Optimization
18. Negative Correlation
19. Gradient Noise
20. Summary
21. Conclusion
Description:
Explore a thought-provoking conference talk on the causal analysis of generalization in deep learning, presented by Behnam Neyshabur of Google at the Workshop on Theory of Deep Learning: Where next? Delve into the intricacies of the generalization problem, examining concepts such as the generalization gap, predicting generalization, and the factors influencing generalization balance and correlation. Gain insight into the experimental design, including its high-level structure and overall rank correlation, as well as the results and hypotheses proposed. Investigate a range of measures and concepts, including canonical ordering, path norm, flatness-based measures, protection-based measures, optimization, negative correlation, and gradient noise. Conclude with a summary of the key findings and their implications for the field of deep learning.

Toward a Causal Analysis of Generalization in Deep Learning - Behnam Neyshabur

Institute for Advanced Study