Chapters:
1. Intro
2. Deep nets and splines
3. Spline approximation
4. Kinds of splines
5. Max-affine spline (MAS)
6. Max-affine spline operator (MASO)
7. Theorem
8. MASO spline partition
9. Learning
10. Geometry of the MASO partition
11. A conclusion
12. Local affine mapping - CNN
13. Deep nets are matched filterbanks
14. Data memorization
15. Deep net complexity
16. Understanding data augmentation
17. Beyond piecewise affine nets
Description:
Explore the connections between deep neural networks and spline theory in this 48-minute lecture by Richard Baraniuk from Rice University. Delve into the fundamentals of deep nets and splines, focusing on max-affine splines (MAS) and max-affine spline operators (MASO). Examine spline approximation techniques and various types of splines. Investigate the MASO spline partition and its role in learning, as well as the geometry of MASO partitions. Discover how convolutional neural networks (CNNs) relate to local affine mappings and how deep nets function as matched filterbanks. Analyze concepts such as data memorization, deep net complexity, and the impact of data augmentation. Gain insights into piecewise affine nets and explore potential future directions in deep learning research.
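To make the central object of the lecture concrete: a max-affine spline (MAS) evaluates the pointwise maximum of several affine functions of its input, and the common ReLU nonlinearity is the simplest case (the max of the zero function and the identity). The sketch below is illustrative only and not taken from the lecture; the function and variable names are our own.

```python
import numpy as np

# A max-affine spline computes z(x) = max_r (<a_r, x> + b_r):
# the pointwise maximum of R affine functions of the input x.
def max_affine_spline(x, A, b):
    """Evaluate a MAS at x, with slope rows A (R x D) and offsets b (R,)."""
    return float(np.max(A @ x + b))

# ReLU is the simplest MAS: the max of the affine functions 0 and x.
A = np.array([[0.0], [1.0]])  # slopes 0 and 1
b = np.array([0.0, 0.0])      # both offsets 0

for x in [-2.0, -0.5, 0.0, 1.5]:
    assert max_affine_spline(np.array([x]), A, b) == max(0.0, x)
```

Applying such an operator coordinate-wise across a layer's outputs gives a max-affine spline operator (MASO), which is the viewpoint the lecture uses to analyze entire deep networks as continuous piecewise-affine mappings.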

Mad Max - Affine Spline Insights into Deep Learning

Simons Institute