1. Intro
2. Trickier cases
3. ConvNets match pieces of the image
4. Filtering: The math behind the match
5. Convolution: Trying every possible match
6. Pooling
7. Rectified Linear Units (ReLUs)
8. Fully connected layer
9. Input vector
10. A neuron
11. Squash the result
12. Weighted sum-and-squash neuron
13. Receptive fields get more complex
14. Add an output layer
15. Exhaustive search
16. Gradient descent with curvature
17. Tea drinking temperature
18. Chaining
19. Backpropagation challenge: weights
20. Backpropagation challenge: sums
21. Backpropagation challenge: sigmoid
22. Backpropagation challenge: ReLU
23. Training from scratch
24. Customer data
Description:
Dive deep into the inner workings of convolutional neural networks in this comprehensive one-hour lecture. Explore the fundamental concepts, including filtering, convolution, pooling, and rectified linear units (ReLUs). Learn how ConvNets match pieces of images and understand the mathematics behind the matching process. Discover the role of fully connected layers, input vectors, and neurons in neural networks. Examine the complexities of receptive fields and output layers. Gain insights into training techniques such as gradient descent, backpropagation, and training from scratch. Understand how these concepts apply to real-world scenarios like customer data analysis and tea drinking temperature prediction.
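To give a concrete feel for the filtering → ReLU → pooling pipeline the lecture walks through, here is a minimal NumPy sketch. It is not code from the lecture; the image, kernel, and function names are illustrative assumptions, showing only the core operations (sliding-window filtering, rectification, max pooling).

```python
import numpy as np

def convolve2d_valid(image, kernel):
    """Slide the kernel over every position of the image ('valid' mode),
    taking the sum of elementwise products at each position."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Rectified Linear Unit: negative values become zero."""
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Non-overlapping max pooling: keep the strongest match in each window."""
    h, w = x.shape
    h, w = h - h % size, w - w % size  # trim so windows tile evenly
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# A tiny 4x4 "image" with a diagonal line, and a 2x2 diagonal-matching
# filter (values chosen for illustration only).
image = np.array([[1., 0., 0., 0.],
                  [0., 1., 0., 0.],
                  [0., 0., 1., 0.],
                  [0., 0., 0., 1.]])
kernel = np.array([[ 1., -1.],
                   [-1.,  1.]])

feature_map = relu(convolve2d_valid(image, kernel))  # filtering + rectification
pooled = max_pool(feature_map, size=2)               # downsample the feature map
```

The feature map peaks wherever the diagonal pattern lines up with the image, and pooling shrinks the map while keeping those strongest responses, which is the intuition behind translation tolerance in ConvNets.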

How Convolutional Neural Networks Work, in Depth

Brandon Rohrer