1. Intro
2. Two Fundamental Questions
3. Empirical Observations on Training Loss
4. Over-parameterization
5. Empirical Observations on Generalization
6. Example: Two-layer NN
7. Trajectory-based Analysis
8. The Trajectory of Predictions (Cont'd)
9. Kernel Matrix at the Beginning
10. Kernel Matrix During Training
11. Main Theory
12. Zero Training Error
13. Empirical Results on Generalization
14. Convolutional Neural Tangent Kernel
15. CNTK on CIFAR-10
16. Understanding Global Average Pooling
17. Local Average Pooling
18. UCI Experiment Setup
19. UCI Results
20. Few-shot Learning Setup
21. Few-shot Learning Results
22. Graph NTK for Graph Classification
23. Summary
24. References
Description:
A comprehensive lecture on the modern perspective of the connection between neural networks and kernels. It covers two fundamental questions, empirical observations on training loss and generalization, and over-parameterization in neural networks. The lecture examines trajectory-based analysis, kernel matrices at initialization and during training, and the main theory establishing zero training error. It then presents empirical results on generalization, introduces the convolutional neural tangent kernel (CNTK) and its application to CIFAR-10, and explains global and local average pooling. Further topics include experiments on UCI datasets and few-shot learning, and graph neural tangent kernels for graph classification, concluding with a summary and references.

On the Connection Between Neural Networks and Kernels: A Modern Perspective - Simon Du

Institute for Advanced Study