Build a 2D convolutional neural network, part 1: Getting started
Build a 2D convolutional neural network, part 2: Overview
Build a 2D convolutional neural network, part 3: MNIST digits
Build a 2D convolutional neural network, part 4: Model overview
Build a 2D convolutional neural network, part 5: Pre-trained model results
Build a 2D convolutional neural network, part 6: Examples of successes and failures
Build a 2D convolutional neural network, part 7: Why Cottonwood?
Build a 2D convolutional neural network, part 8: Training code setup
Build a 2D convolutional neural network, part 9: Adding layers
Build a 2D convolutional neural network, part 10: Connecting layers
Build a 2D convolutional neural network, part 11: The training loop
Build a 2D convolutional neural network, part 12: Testing loop
Build a 2D convolutional neural network, part 13: Loss history and text summary
Build a 2D convolutional neural network, part 14: Collecting examples
Build a 2D convolutional neural network, part 15: Rendering examples
Build a 2D convolutional neural network, part 16: Cottonwood code tour
Build a 2D convolutional neural network, part 17: Cottonwood cheatsheet
Description:
Dive into a comprehensive 2.5-hour tutorial on two-dimensional convolution and convolutional neural networks. Learn the fundamentals of convolution operations, explore the Softmax neural network layer, and understand batch normalization. Build a 2D convolutional neural network from scratch, starting with the basics and progressing through a step-by-step implementation using the MNIST digits dataset. Gain insights into the model architecture, examine pre-trained model results, and analyze examples of successes and failures. Discover the advantages of the Cottonwood framework, set up the training code, add and connect layers, and implement the training and testing loops. Explore loss-history visualization and text summaries, and learn how to collect and render examples. Conclude with a Cottonwood code tour and a handy cheatsheet for quick reference.
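The convolution operation at the heart of the course can be sketched in a few lines of plain NumPy. This is not the course's Cottonwood code; it is an illustrative implementation of the sliding-window computation the early parts of the tutorial explain. (Strictly speaking it computes cross-correlation without flipping the kernel, which is the convention most deep-learning frameworks follow; the function name and example arrays here are hypothetical.)

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Slide the kernel across the image ('valid' mode, no padding),
    summing the element-wise products at each position."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    oh, ow = ih - kh + 1, iw - kw + 1  # output shrinks by kernel size - 1
    out = np.zeros((oh, ow))
    for r in range(oh):
        for c in range(ow):
            out[r, c] = np.sum(image[r:r + kh, c:c + kw] * kernel)
    return out

# A tiny 4x4 "image" and a 2x2 kernel that responds to diagonal structure.
image = np.array([
    [1, 2, 3, 0],
    [0, 1, 2, 3],
    [3, 0, 1, 2],
    [2, 3, 0, 1],
], dtype=float)
kernel = np.array([
    [1, 0],
    [0, 1],
], dtype=float)

print(conv2d_valid(image, kernel))
# A 3x3 feature map: each entry is the kernel's match strength at that position.
```

In a CNN the network learns many such kernels per layer, so each one becomes a small feature detector tuned to the training data (here, MNIST digit strokes).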