1. Introduction
2. Paper Review
3. Attention Mechanism
4. TransformerBlock
5. Encoder
6. DecoderBlock
7. Decoder
8. Putting it together to form The Transformer
9. A Small Example
10. Fixing Errors
11. Ending
Description:
Dive into a comprehensive 57-minute video tutorial on implementing PyTorch Transformers from scratch, based on the groundbreaking "Attention is all you need" paper. Explore the original transformer architecture, starting with a detailed paper review and progressing through key components such as the attention mechanism, transformer blocks, encoder, and decoder. Learn how to assemble these elements to create a complete Transformer model, and gain practical insights through a small example and error-fixing session. Benefit from additional resources, including recommended courses and free materials, to further enhance your understanding of machine learning, deep learning, and natural language processing.
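As background for the attention mechanism the video implements, here is a minimal sketch of the scaled dot-product attention formula from "Attention is all you need", softmax(QK^T / sqrt(d_k))V. It uses NumPy rather than PyTorch for brevity, and the function name and toy shapes are illustrative, not taken from the tutorial's code:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # numerically stable row-wise softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Three toy tokens with head dimension d_k = 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
# out has the same shape as V; each row of w sums to 1
```

The PyTorch version built in the video follows the same formula, with learned linear projections producing Q, K, and V and the computation split across multiple heads.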

Pytorch Transformers from Scratch - Attention Is All You Need

Aladdin Persson