1. Intro
2. Univariate time series
3. Multivariate time series
4. Intuition
5. RNN architecture
6. Unrolling a recurrent layer
7. Data shape
8. Sequence to vector RNN
9. Sequence to sequence RNN
10. Memory cell for simple RNN
11. Why do we use tanh?
12. Backpropagation through time (BPTT)
13. The math behind
14. Issues with simple RNNs
15. What's up next?
Description:
Learn about the inner workings of Recurrent Neural Networks (RNNs) in this comprehensive 29-minute video tutorial. Explore simple RNN units, time series analysis, and the intricacies of Backpropagation Through Time (BPTT). Dive into topics such as univariate and multivariate time series, RNN architecture, unrolling recurrent layers, and various RNN configurations, including sequence-to-vector and sequence-to-sequence. Understand the role of memory cells in simple RNNs, the significance of the tanh activation function, and the mathematical foundations of BPTT. Gain insights into the challenges associated with simple RNNs and prepare for advanced concepts in neural network design.
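The ideas covered in the video (a simple RNN memory cell, unrolling over time, tanh activation, and the sequence-to-vector vs. sequence-to-sequence distinction) can be sketched in a few lines of NumPy. This is a minimal illustration, not the tutorial's own code; the function name `rnn_forward` and the toy dimensions are assumptions.

```python
import numpy as np

def rnn_forward(inputs, Wx, Wh, b, h0):
    """Unroll a simple RNN memory cell over a sequence.

    inputs: (T, input_dim) time series; Wx, Wh, b: cell parameters
    shared across all time steps; h0: initial hidden state.
    Returns the hidden state at every time step, shape (T, hidden_dim).
    """
    h = h0
    states = []
    for x_t in inputs:
        # tanh squashes activations into [-1, 1], keeping the
        # recurrent state from blowing up as the sequence unrolls
        h = np.tanh(x_t @ Wx + h @ Wh + b)
        states.append(h)
    return np.stack(states)

rng = np.random.default_rng(0)
T, input_dim, hidden_dim = 5, 3, 4          # toy sizes (assumed)
x = rng.standard_normal((T, input_dim))      # one univariate-style sample batchless
Wx = rng.standard_normal((input_dim, hidden_dim)) * 0.1
Wh = rng.standard_normal((hidden_dim, hidden_dim)) * 0.1
b = np.zeros(hidden_dim)

states = rnn_forward(x, Wx, Wh, b, np.zeros(hidden_dim))
seq_to_seq = states        # sequence-to-sequence: one output per time step, (5, 4)
seq_to_vec = states[-1]    # sequence-to-vector: keep only the final state, (4,)
```

The same unrolled loop is what BPTT differentiates: gradients flow backwards through every `np.tanh(...)` step, which is also where the vanishing-gradient issues of simple RNNs come from.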

Recurrent Neural Networks Explained Easily

Valerio Velardo - The Sound of AI