Chapters:
1. intro
2. starter code walkthrough
3. let's fix the learning rate plot
4. pytorchifying our code: layers, containers, torch.nn, fun bugs
5. overview: WaveNet
6. dataset: bump the context size to 8
7. re-running baseline code on block_size 8
8. implementing WaveNet
9. training the WaveNet: first pass
10. fixing BatchNorm1d bug
11. re-training WaveNet with bug fix
12. scaling up our WaveNet
13. experimental harness
14. WaveNet but with "dilated causal convolutions"
15. torch.nn
16. the development process of building deep neural nets
17. going forward
18. improve on my loss! how far can we improve a WaveNet on this data?
Description:
Dive into an in-depth tutorial on building a WaveNet-like convolutional neural network architecture, expanding upon a 2-layer MLP from previous lessons. Explore the process of deepening the network with a tree-like structure, mirroring the WaveNet (2016) architecture from DeepMind. Gain valuable insights into torch.nn, its inner workings, and typical deep learning development processes. Follow along as the instructor navigates through documentation, manages multidimensional tensor shapes, and transitions between Jupyter notebooks and repository code. Learn how to implement and train the WaveNet model, address common bugs, and scale up the architecture. Discover the concept of dilated causal convolutions and their efficient implementation in deep learning models. Conclude with an experimental harness, discussions on improving model performance, and future directions for enhancing WaveNet on the given dataset.
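To make the "tree-like structure" concrete, here is a minimal sketch (not the lecture's exact code, and with BatchNorm omitted for brevity) of a WaveNet-style hierarchical model in PyTorch: rather than flattening all 8 context characters into one vector at once, each layer fuses pairs of consecutive positions, so information is combined progressively. The module name `FlattenConsecutive` and the hyperparameter values are illustrative assumptions.

```python
# Sketch of a hierarchical, WaveNet-like character model.
# Each FlattenConsecutive(2) merges pairs of adjacent positions,
# halving the time dimension while doubling the channel dimension.
import torch
import torch.nn as nn

class FlattenConsecutive(nn.Module):
    """Merge groups of n consecutive embeddings: (B, T, C) -> (B, T//n, n*C)."""
    def __init__(self, n):
        super().__init__()
        self.n = n

    def forward(self, x):
        B, T, C = x.shape
        x = x.view(B, T // self.n, C * self.n)
        if x.shape[1] == 1:        # squeeze away a length-1 time dimension
            x = x.squeeze(1)
        return x

# illustrative sizes: 27-char vocab, context of 8 characters
vocab_size, n_embd, n_hidden, block_size = 27, 10, 68, 8

model = nn.Sequential(
    nn.Embedding(vocab_size, n_embd),                              # (B, 8, 10)
    FlattenConsecutive(2), nn.Linear(2 * n_embd, n_hidden), nn.Tanh(),    # (B, 4, 68)
    FlattenConsecutive(2), nn.Linear(2 * n_hidden, n_hidden), nn.Tanh(),  # (B, 2, 68)
    FlattenConsecutive(2), nn.Linear(2 * n_hidden, n_hidden), nn.Tanh(),  # (B, 68)
    nn.Linear(n_hidden, vocab_size),                               # (B, 27)
)

x = torch.randint(0, vocab_size, (4, block_size))  # a batch of 4 contexts
logits = model(x)
print(logits.shape)  # torch.Size([4, 27])
```

The key design point the lecture builds toward: because each layer only combines a fixed number of neighbors, the same fusion can be implemented efficiently as dilated causal convolutions (dilation 1, 2, 4, ...), which is what the original WaveNet (2016) does.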

Building Makemore - Building a WaveNet

Andrej Karpathy