01 – History and resources
01L – Gradient descent and the backpropagation algorithm
02 – Neural nets: rotation and squashing
02L – Modules and architectures
03 – Tools, classification with neural nets, PyTorch implementation
03L – Parameter sharing: recurrent and convolutional nets
04L – ConvNet in practice
04.1 – Natural signals properties and the convolution
04.2 – Recurrent neural networks, vanilla and gated (LSTM)
05L – Joint embedding method and latent variable energy based models (LV-EBMs)
05.1 – Latent Variable Energy Based Models (LV-EBMs), inference
05.2 – But what are these EBMs used for?
06L – Latent variable EBMs for structured prediction
06 – Latent Variable Energy Based Models (LV-EBMs), training
07L – PCA, AE, K-means, Gaussian mixture model, sparse coding, and intuitive VAE
07 – Unsupervised learning: autoencoding the targets
08L – Self-supervised learning and variational inference
08 – From LV-EBM to target prop to (vanilla, denoising, contractive, variational) autoencoder
09L – Differentiable associative memories, attention, and transformers
09 – AE, DAE, and VAE with PyTorch; generative adversarial networks (GAN) and code
10L – Self-supervised learning in computer vision
10 – Self / cross, hard / soft attention and the Transformer
11L – Speech recognition and Graph Transformer Networks
11 – Graph Convolutional Networks (GCNs)
12L – Low resource machine translation
12 – Planning and control
13L – Optimisation for Deep Learning
13 – The Truck Backer-Upper
14L – Lagrangian backpropagation, final project winners, and Q&A session
14 – Prediction and Planning Under Uncertainty
AI2S Xmas Seminar - Dr. Alfredo Canziani (NYU) - Energy-Based Self-Supervised Learning
Description:
Dive into deep learning with this comprehensive NYU course. It covers the history and foundations of neural networks, including gradient descent and backpropagation; core architectures such as convolutional and recurrent networks, with hands-on PyTorch implementations; and advanced topics including energy-based models, self-supervised learning, and variational inference. Applications span computer vision, speech recognition, and natural language processing, alongside graph convolutional networks, transformers, and attention mechanisms. Later lectures address optimization techniques for deep learning and planning and control under uncertainty, concluding with Lagrangian backpropagation and a Q&A session. The course offers a thorough grounding in deep learning principles and their practical applications across domains.

NYU Deep Learning
