Outline:
1. Many languages are left behind
2. Roadmap
3. Cross-lingual transfer
4. Supporting multiple languages could be tedious
5. Combining the two methods
6. Use case: COVID-19 response
7. Rapid adaptation of massive multilingual models
8. Meta-learning for multilingual training
9. Multilingual NMT
10. Improve zero-shot NMT
11. Align multilingual representations
12. Zero-shot transfer for pretrained representations
13. Massively multilingual training
14. Training data highly imbalanced
15. Heuristic sampling of data
16. Learning to balance data
17. Problem: sometimes underperforms bilingual model
18. Multilingual knowledge distillation
19. Adding language-specific layers
20. Problem: one-to-many transfer
21. Problem: multilingual evaluation
22. Discussion question
Description:
Explore methods for training multilingual systems, zero-shot adaptation, and open problems in multilingual learning in this 40-minute lecture from CMU's CS11-737 "Multilingual Natural Language Processing" course. Delve into topics such as cross-lingual transfer, rapid adaptation of massive multilingual models, meta-learning for multilingual training, and improving zero-shot neural machine translation. Examine challenges like training data imbalance, underperformance of multilingual models compared to bilingual ones, and issues with one-to-many transfer. Learn about techniques including heuristic sampling of data, multilingual knowledge distillation, and adding language-specific layers. Gain insights into the complexities of supporting multiple languages and addressing the needs of underrepresented languages in NLP.
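The heuristic data sampling mentioned above is commonly temperature-based: each language is drawn in proportion to its share of the training data raised to the power 1/T, which flattens the distribution and up-weights low-resource languages. Below is a minimal sketch of that idea, not the lecture's exact recipe; the function name and the example corpus sizes are illustrative, and T around 5 is one value that has been used in massively multilingual NMT.

```python
import numpy as np

def temperature_sampling_probs(corpus_sizes, T=5.0):
    """Per-language sampling probabilities from raw corpus sizes.

    Languages are drawn in proportion to their data share raised to
    1/T; a larger T flattens the distribution toward uniform, giving
    low-resource languages more exposure during training.
    """
    sizes = np.asarray(corpus_sizes, dtype=np.float64)
    q = sizes / sizes.sum()      # empirical data distribution
    p = q ** (1.0 / T)           # temperature-scaled weights
    return p / p.sum()           # renormalize to a distribution

# Illustrative sizes: 1M, 100k, and 1k sentence pairs.
print(temperature_sampling_probs([1_000_000, 100_000, 1_000]))
```

With T=1 this reduces to sampling proportionally to data size; as T grows, the smallest corpus is sampled far more often than its raw share would suggest.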
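Multilingual knowledge distillation, one of the techniques named in the description for cases where a multilingual model underperforms bilingual ones, trains the multilingual student to match the output distributions of bilingual teacher models. A hedged PyTorch sketch of one common loss formulation follows; the function name, the mixing weight alpha, and the softmax temperature are assumptions for illustration, not necessarily the formulation used in the lecture.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, gold_ids,
                      alpha=0.5, T=1.0):
    """Cross-entropy on the reference mixed with a KL term pulling the
    multilingual student toward a bilingual teacher's distribution.

    student_logits, teacher_logits: (batch, seq_len, vocab)
    gold_ids: (batch, seq_len) reference token ids
    """
    # Standard NMT cross-entropy against the gold translation.
    ce = F.cross_entropy(student_logits.transpose(1, 2), gold_ids)
    # KL(teacher || student) on temperature-softened distributions.
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    return (1.0 - alpha) * ce + alpha * kd
```

In this setup each language pair typically gets its own bilingual teacher, and the single multilingual student is trained on all pairs with this mixed objective.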

CMU Multilingual NLP 2020 - Multilingual Training and Cross-Lingual Transfer

Graham Neubig