1. Introduction
2. About the talk
3. How did it start
4. What happens while the customer waits
5. Good old AI
6. Visionary
7. English
8. German
9. Product
10. Customer Feedback
11. Part-of-Speech Tagging
12. Architecture
13. Stanford NLP
14. Reinventing the wheel
15. What is BERT
16. What is supervised training
17. Unsupervised learning
18. Unsupervised training
19. Semi-supervised training
20. Input structure
21. Transformer block
22. What is good
23. What is bad
24. How did we do it
25. BERT parameter tuning
26. Preprocessing
27. Word pieces
28. Morphemes vs morphs
29. Dictionary size
30. How does it look
31. What did we save
32. Training
33. Training Results
34. Future Plans
35. Backend
36. ZukaText
37. What's next
Description:
Explore the journey of implementing BERT, a cutting-edge natural language processing model, in a real-world product development scenario. Dive into the challenges, successes, and lessons learned as a team transforms theoretical concepts into a functional natural language generation application. Learn about the decision-making process behind choosing BERT, alternative approaches considered, and the intricacies of training a custom version of the network. Gain valuable insights into common pitfalls to avoid and unexpected discoveries made during the implementation process. This conference talk provides a comprehensive look at bridging the gap between academic research and practical application in the field of NLP, offering both technical details and strategic considerations for professionals working with advanced language models.

From Paper to Product - How We Implemented BERT

MLCon | Machine Learning Conference