Chapters:
1. Introduction
2. The Problem
3. Time Series
4. Word embeddings
5. Math trick
6. Attention
7. Attention Comparison
8. Transformer Layer
9. Recurrent Neural Networks
10. Transformer Embedding
11. Transformer Demo
12. Questions
Description:
Explore the inner workings of machine learning algorithms used in Rasa, focusing on the transformer's role in replacing RNNs for natural language processing and dialogue handling. This technical talk from EuroPython 2020 delves into why transformers have become integral to many algorithms. Witness a live demonstration comparing typical LSTM errors with transformer performance, and gain insights through clear diagrams with minimal mathematical complexity. Learn about time series, word embeddings, attention mechanisms, transformer layers, and recurrent neural networks. Understand the advantages of transformer embeddings and see their practical application in a demo. Conclude with a Q&A session to address any lingering questions about this powerful machine learning technique.
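The attention mechanism at the heart of the talk can be illustrated with a minimal sketch of scaled dot-product self-attention in NumPy. This is a generic illustration of the technique, not code from the talk or from Rasa; the array shapes and variable names are chosen for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Scaled dot-product attention: each output row is a weighted
    # average of the value vectors, with weights given by the
    # softmax of query-key similarity scores.
    d_k = K.shape[-1]
    weights = softmax(Q @ K.T / np.sqrt(d_k))
    return weights @ V

# Toy self-attention example: 3 tokens, 4-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = attention(X, X, X)  # self-attention: Q = K = V = X
print(out.shape)  # (3, 4)
```

Because every token attends to every other token in a single step, attention avoids the sequential bottleneck of the RNN/LSTM models the talk compares against.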

Why Transformers Work

EuroPython Conference