Outline:
1. Intro
2. Overparameterized models in machine learning
3. An experiment
4. Training overparameterized neural nets
5. Approximation theory perspective
6. Infinite-width two-layer ReLU nets
7. Learning with norm-controlled infinite-width ReLU networks
8. From Two-layer ReLU Nets to Convex Nets
9. Intuition in 1D
10. Intuition in Higher Dimensions
11. The Radon Transform in 2D
12. Radon Transform as Line Detector
13. Key Derivation
14. Example
15. Implications: Comparison to Kernel Learning
16. Implications: Depth Separation Result
17. Open Questions
Description:
Explore a function space perspective on overparameterized neural networks in this 35-minute conference talk by Rebecca Willett of the University of Chicago. Delve into the intriguing phenomenon of vastly overparameterized neural networks generalizing well despite their capacity to fit arbitrary labels. Examine the role of weight magnitude control in regulating model complexity and investigate the functions approximated by neural networks with bounded weight norms. Discover a precise characterization of the functions realizable by two-layer ReLU networks with bounded Euclidean norm weights, drawing surprising connections to the Radon transform used in computational imaging. Learn how Radon transform analysis provides novel insights into learning with two- and three-layer ReLU networks. Gain understanding of topics such as infinite-width ReLU nets, norm-controlled learning, convex nets, and depth separation results. Conclude with open questions in this cutting-edge area of machine learning research, presented at the Alan Turing Institute.

A Function Space View of Overparameterized Neural Networks - Rebecca Willett, University of Chicago

Alan Turing Institute