Syllabus:
1. Intro
2. Remember, Neural Nets are Feature Extractors!
3. Types of Learning
4. Plethora of Tasks in NLP
5. Rule of Thumb 1: Multitask to Increase Data
6. Rule of Thumb 2
7. Standard Multi-task Learning
8. Examples of Pre-training Encoders
9. Regularization for Pre-training (e.g. Barone et al. 2017)
10. Selective Parameter Adaptation
11. Soft Parameter Tying
12. Supervised/Unsupervised Adaptation
13. Supervised Domain Adaptation through Feature Augmentation
14. Unsupervised Learning through Feature Matching
15. Multilingual Inputs
16. Multilingual Structured Prediction / Multilingual Outputs
17. Teacher-student Networks for Multilingual Adaptation (Chen et al. 2017)
18. Types of Multi-tasking
19. Multiple Annotation Standards
Description:
Explore multilingual and multitask learning in neural networks for natural language processing through this 52-minute lecture by Graham Neubig. Delve into key concepts such as multitask learning, domain adaptation, and multilingual learning. Gain insights into increasing data through multitask approaches, pre-training encoders, and regularization techniques. Examine supervised and unsupervised domain adaptation methods, multilingual inputs and outputs, and teacher-student networks for multilingual adaptation. Understand various types of multi-tasking and multiple annotation standards in NLP tasks. Access the accompanying slides and related course materials for a comprehensive learning experience in advanced NLP techniques.
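The "Standard Multi-task Learning" portion of the lecture centers on one idea: a single shared encoder extracts features that several task-specific output heads consume, so data from every task trains the shared parameters. Below is a minimal sketch of that setup, assuming PyTorch; the module, task, and size names are illustrative placeholders, not taken from the lecture.

```python
# Minimal multi-task sketch: one shared encoder, one linear head per task.
# All names and sizes here are illustrative assumptions.
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    """Shared feature extractor: embeddings followed by a BiLSTM."""
    def __init__(self, vocab_size, emb_dim=64, hid_dim=64):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True,
                            bidirectional=True)

    def forward(self, tokens):                # tokens: (batch, seq_len)
        out, _ = self.lstm(self.emb(tokens))  # (batch, seq_len, 2*hid_dim)
        return out

class MultiTaskModel(nn.Module):
    """Encoder shared across tasks; each task gets its own output head."""
    def __init__(self, vocab_size, task_sizes):
        super().__init__()
        self.encoder = SharedEncoder(vocab_size)
        # 128 = 2 * hid_dim from the bidirectional LSTM above.
        self.heads = nn.ModuleDict(
            {task: nn.Linear(128, n) for task, n in task_sizes.items()}
        )

    def forward(self, tokens, task):
        return self.heads[task](self.encoder(tokens))

# Toy usage: alternate batches between two hypothetical tagging tasks.
model = MultiTaskModel(vocab_size=1000, task_sizes={"pos": 17, "ner": 9})
loss_fn = nn.CrossEntropyLoss()
opt = torch.optim.Adam(model.parameters())
for task in ["pos", "ner"]:
    tokens = torch.randint(0, 1000, (8, 20))  # fake batch of token ids
    labels = torch.randint(0, model.heads[task].out_features, (8, 20))
    logits = model(tokens, task)              # (8, 20, n_labels)
    loss = loss_fn(logits.reshape(-1, logits.size(-1)), labels.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because both tasks backpropagate through the same encoder, each one effectively increases the training data available to the shared feature extractor, which is the "multitask to increase data" rule of thumb the lecture outlines.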

Neural Nets for NLP 2017 - Multilingual and Multitask Learning

Graham Neubig