– Stride and skip: subsampling and convolution "à trous" (see the sketch after this outline)
– Convolutional net architecture
– Multiple convolutions
– Vintage ConvNets
– How does the brain interpret images?
– Hubel & Wiesel's model of the visual cortex
– Invariance and equivariance of ConvNets
– In the next episode…
– Training time, iteration cycle, and historical remarks
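The stride-versus-dilation distinction named in the first outline item can be made concrete in a few lines. This is a minimal illustrative sketch, assuming PyTorch (which the listing does not name); the tensor sizes and layer settings are invented for illustration. A strided convolution subsamples its output, while a dilated ("à trous") convolution inserts holes between kernel taps, widening the receptive field without reducing output resolution.

    import torch
    import torch.nn as nn

    x = torch.randn(1, 1, 16)  # (batch, channels, length)

    # Stride: the kernel reads consecutive samples, but the output is
    # subsampled, so its length shrinks by roughly the stride factor.
    strided = nn.Conv1d(1, 1, kernel_size=3, stride=2)

    # Dilation ("a trous"): the kernel skips every other input sample,
    # widening the receptive field while keeping the output resolution.
    dilated = nn.Conv1d(1, 1, kernel_size=3, dilation=2)

    print(strided(x).shape)  # torch.Size([1, 1, 7])
    print(dilated(x).shape)  # torch.Size([1, 1, 12])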
Description:
Explore parameter sharing in recurrent and convolutional neural networks in this comprehensive 2-hour lecture by Yann LeCun. Delve into hypernetworks, shared weights, and gradient addition in parameter sharing. Examine recurrent nets, including unrolling in time, vanishing and exploding gradients, and RNN tricks. Investigate memory concepts, LSTM networks, and attention mechanisms for sequence-to-sequence mapping. Study convolutional nets, including motif detection, convolution definitions, backpropagation, and architecture. Learn about vintage ConvNets, how the brain interprets images, and the Hubel & Wiesel model of the visual cortex. Gain insights into ConvNet invariance and equivariance, training time, iteration cycles, and historical remarks in deep learning.
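As a minimal sketch of the "shared weights, and gradient addition" point above (again assuming PyTorch; the scalar example is invented for illustration): when one parameter is reused in several places, backpropagation sums the gradient contributions from every use. This is exactly what happens to the shared weights of an unrolled recurrent net across time steps, or to a convolution kernel across spatial positions.

    import torch

    # One parameter used twice in the same graph: y = w*a + w*b.
    w = torch.tensor(2.0, requires_grad=True)
    a, b = torch.tensor(3.0), torch.tensor(5.0)

    y = w * a + w * b  # two uses of the same shared weight w
    y.backward()

    # Backprop adds the gradient from each use: dy/dw = a + b.
    print(w.grad)  # tensor(8.)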
Parameter Sharing - Recurrent and Convolutional Nets