1. Week 12 – Practicum
2. Attention
3. Key-value store
4. Transformer and PyTorch implementation
5. Q&A
Description:
Explore the intricacies of attention mechanisms and the Transformer architecture in this comprehensive 1-hour 18-minute lecture by Alfredo Canziani. Delve into the concept of self-attention and its role in creating hidden layer representations of inputs. Discover the key-value store paradigm and learn how to represent queries, keys, and values as rotations of an input. Gain insights into the Transformer architecture through a detailed walkthrough of a forward pass and compare the encoder-decoder paradigm with sequential architectures. The lecture concludes with a Q&A session, providing an opportunity to clarify complex concepts and deepen understanding of attention mechanisms and their implementation in PyTorch.
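The description mentions representing queries, keys, and values as rotations (linear transformations) of the same input, followed by scaled dot-product attention. As a rough illustration of that idea, here is a minimal plain-Python sketch; the function and variable names are my own, not the lecture's PyTorch code, and the weight matrices here are arbitrary linear maps rather than true rotations.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def matmul(A, B):
    # (n x k) @ (k x m) -> (n x m), using nested lists.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def self_attention(X, Wq, Wk, Wv):
    # Queries, keys, and values are all linear transformations
    # of the same input X -- this is what makes it *self*-attention.
    Q, K, V = matmul(X, Wq), matmul(X, Wk), matmul(X, Wv)
    d = len(K[0])
    # Scaled dot-product scores, row-wise softmax, weighted sum of values.
    scores = [[sum(q * k for q, k in zip(q_row, k_row)) / math.sqrt(d)
               for k_row in K] for q_row in Q]
    A = [softmax(row) for row in scores]
    return matmul(A, V)
```

With identity weight matrices, each output row is a convex combination of the input rows, weighted most heavily toward the row most similar to the query; a real implementation would use `torch.nn` modules and batched tensors instead.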

Attention and the Transformer

Alfredo Canziani