1. Intro
2. Remember, Neural Nets are Feature Extractors!
3. Reminder: Types of Learning
4. Standard Multi-task Learning
5. Selective Parameter Adaptation: sometimes it is better to adapt only some of the parameters (see the freezing sketch after this outline)
6. Different Layers for Different Tasks (Hashimoto et al. 2017)
7. Multiple Annotation Standards
8. Supervised/Unsupervised Adaptation
9. Supervised Domain Adaptation through Feature Augmentation
10. Unsupervised Learning through Feature Matching
11. Multi-lingual Sequence-to-sequence Models
12. Multi-lingual Pre-training
13. Difficulties in Fully Multi-lingual Learning
14. Data Balancing
15. Cross-lingual Transfer Learning
16. What if languages don't share the same script?
17. Zero-shot Transfer to New Languages
18. Data Creation, Active Learning: to get in-language training data, Active Learning (AL) can be used (an uncertainty-sampling sketch follows this outline)
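To make item 5 concrete: a minimal PyTorch sketch of selective parameter adaptation, freezing a shared encoder and updating only a task-specific head. The `Tagger` module, its names, and all sizes are illustrative assumptions, not taken from the lecture.

```python
import torch
import torch.nn as nn

class Tagger(nn.Module):
    """Toy encoder + task head; all names and sizes are illustrative."""
    def __init__(self, vocab=10_000, dim=256, n_tags=17):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.encoder = nn.LSTM(dim, dim, batch_first=True)
        self.head = nn.Linear(dim, n_tags)

    def forward(self, x):                    # x: (batch, time) token ids
        h, _ = self.encoder(self.embed(x))   # h: (batch, time, dim)
        return self.head(h)                  # per-token tag scores

model = Tagger()

# Selective adaptation: freeze the shared embeddings and encoder,
# update only the task-specific head on the new task/domain.
for name, p in model.named_parameters():
    p.requires_grad = name.startswith("head")

# Give the optimizer only the parameters that remain trainable.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
```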
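And for item 18, a sketch of one common Active Learning criterion, uncertainty sampling: rank unlabeled sentences by the entropy of the model's predictive distribution and send the most uncertain ones to annotators. The helper name and batching interface are hypothetical; the lecture may cover other selection strategies.

```python
import torch
import torch.nn.functional as F

def select_for_annotation(model, unlabeled_batches, k=100):
    # Hypothetical helper: score each unlabeled batch by its mean
    # predictive entropy and return the indices of the k most uncertain.
    scores = []
    model.eval()
    with torch.no_grad():
        for idx, x in enumerate(unlabeled_batches):
            probs = F.softmax(model(x), dim=-1)
            entropy = -(probs * probs.clamp_min(1e-9).log()).sum(dim=-1)
            scores.append((entropy.mean().item(), idx))
    scores.sort(reverse=True)          # most uncertain first
    return [idx for _, idx in scores[:k]]
```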
Description:
Explore multitask and multilingual learning in natural language processing through this comprehensive lecture from CMU's Neural Networks for NLP course. Delve into the fundamentals of multi-task learning, examining various methods and objectives specific to NLP. Investigate multilingual learning techniques, including multi-lingual sequence-to-sequence models and pre-training approaches. Learn about challenges in fully multi-lingual learning, such as data balancing and cross-lingual transfer. Discover strategies for handling languages with different scripts and implementing zero-shot transfer to new languages. Gain insights into data creation and active learning for obtaining in-language training data.
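As a concrete illustration of the standard multi-task setup the description mentions: a shared encoder with one lightweight head per task, sampling a task at each update so all tasks train the shared parameters jointly. A minimal PyTorch sketch; the task names, mean-pooling encoder, and sizes are illustrative assumptions.

```python
import random
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    """Toy shared sentence encoder; sizes are illustrative."""
    def __init__(self, vocab=10_000, dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)

    def forward(self, x):                 # x: (batch, time) token ids
        return self.embed(x).mean(dim=1)  # crude mean-pooled sentence vector

encoder = SharedEncoder()
heads = {                                 # one task-specific head per task
    "intent": nn.Linear(128, 17),
    "sentiment": nn.Linear(128, 2),
}
params = list(encoder.parameters()) + [p for h in heads.values()
                                       for p in h.parameters()]
optimizer = torch.optim.Adam(params, lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def multitask_step(batches):
    """One joint update: sample a task, train shared encoder + its head."""
    task = random.choice(list(batches))
    x, y = batches[task]                  # an (inputs, labels) pair
    loss = loss_fn(heads[task](encoder(x)), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return task, loss.item()
```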
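For the data-balancing challenge, one widely used recipe is temperature-based sampling, which flattens the language distribution so low-resource languages are seen more often than their raw counts would allow. A minimal sketch under that assumption; the exact scheme used in the lecture may differ.

```python
def sampling_weights(sizes, tau=5.0):
    """Temperature-based data balancing: raise each language's empirical
    share to the power 1/tau and renormalize. tau=1 keeps the natural
    distribution; larger tau flattens it toward uniform."""
    total = sum(sizes.values())
    scaled = {lang: (n / total) ** (1.0 / tau) for lang, n in sizes.items()}
    z = sum(scaled.values())
    return {lang: w / z for lang, w in scaled.items()}

# Example: English dwarfs Swahili in raw counts, but the sampling
# distribution is much flatter (counts here are made up).
print(sampling_weights({"en": 1_000_000, "hi": 50_000, "sw": 5_000}))
```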

CMU Neural Nets for NLP 2020 - Multitask and Multilingual Learning

Graham Neubig