1. Intro
2. Goal for Today
3. Where would we need/use Sentence Representations?
4. Sentence Classification
5. Paraphrase Identification (Dolan and Brockett 2005): identify whether A and B mean the same thing
6. Textual Entailment (Dagan et al. 2006, Marelli et al. 2014)
7. Model for Sentence Pair Processing
8. Types of Learning
9. Plethora of Tasks in NLP
10. Rule of Thumb 2
11. Standard Multi-task Learning
12. Thinking about Multi-tasking, and Pre-trained Representations
13. General Model Overview
14. Language Model Transfer
15. End-to-end vs. Pre-training
16. Context Prediction Transfer (Skip-thought Vectors) (Kiros et al. 2015)
17. Paraphrase ID Transfer (Wieting et al. 2015)
18. Large Scale Paraphrase Data (ParaNMT-50M) (Wieting and Gimpel 2018)
19. Entailment Transfer (InferSent) (Conneau et al. 2017)
20. Bi-directional Language Modeling Objective (ELMo)
21. Masked Word Prediction (BERT)
Description:
Explore sentence and contextualized word representations in this lecture from CMU's Neural Networks for NLP course. The lecture covers sentence classification, paraphrase identification, and textual entailment; types of learning, multi-task learning, and pre-trained representations; and transfer from language models, context prediction (skip-thought vectors), paraphrase data, and entailment data (InferSent), closing with ELMo's bi-directional language modeling objective and BERT's masked word prediction. Presented by Graham Neubig.
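
The sentence-pair tasks in the outline (paraphrase identification, textual entailment) are typically handled by encoding each sentence into a vector with a shared encoder and classifying a combination of the two vectors. Below is a minimal PyTorch sketch of such a model, using the [u; v; |u - v|; u * v] feature combination from InferSent (Conneau et al. 2017); the BiLSTM encoder, module names, and hyperparameters are illustrative assumptions, not the lecture's implementation.

```python
# Minimal sentence-pair classifier sketch (paraphrase ID / entailment).
# All names and hyperparameters are illustrative, not from the lecture.
import torch
import torch.nn as nn

class SentencePairClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, num_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Shared BiLSTM encoder produces one vector per sentence.
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        # InferSent-style combination: [u; v; |u - v|; u * v].
        self.classifier = nn.Sequential(
            nn.Linear(8 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_classes),
        )

    def encode(self, tokens):
        # tokens: (batch, seq_len) integer ids.
        states, _ = self.encoder(self.embed(tokens))
        # Max-pool over time, as in InferSent.
        return states.max(dim=1).values           # (batch, 2 * hidden_dim)

    def forward(self, tokens_a, tokens_b):
        u, v = self.encode(tokens_a), self.encode(tokens_b)
        features = torch.cat([u, v, (u - v).abs(), u * v], dim=-1)
        return self.classifier(features)          # (batch, num_classes) logits

# Toy usage: two sentence pairs, 3-way entailment labels.
model = SentencePairClassifier(vocab_size=1000)
a = torch.randint(1, 1000, (2, 7))
b = torch.randint(1, 1000, (2, 5))
loss = nn.functional.cross_entropy(model(a, b), torch.tensor([0, 2]))
loss.backward()
```

For transfer learning, the encoder would be pre-trained on another objective (language modeling, paraphrase data, entailment data) before the classifier is trained on the target pair task.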
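The pre-training objectives at the end of the outline can be sketched in the same spirit. Here is a rough illustration of BERT-style masked word prediction: corrupt roughly 15% of input tokens with a [MASK] id and train an encoder to recover the originals at those positions. The tiny Transformer encoder and the mask_id value are assumptions for the sketch; real BERT also sometimes keeps or randomizes the selected tokens rather than always masking them.

```python
# BERT-style masked word prediction objective, as a rough sketch.
# mask_id, mask_prob, and the tiny encoder are illustrative assumptions.
import torch
import torch.nn as nn

def masked_lm_loss(encoder, embed, output_layer, tokens,
                   mask_id, mask_prob=0.15):
    # Choose ~15% of positions to predict.
    mask = torch.rand(tokens.shape) < mask_prob
    corrupted = tokens.clone()
    corrupted[mask] = mask_id                     # replace with [MASK]
    hidden = encoder(embed(corrupted))            # contextual representations
    logits = output_layer(hidden)                 # (batch, seq, vocab)
    # Cross-entropy only at masked positions, against the original tokens.
    return nn.functional.cross_entropy(logits[mask], tokens[mask])

vocab_size, dim = 1000, 64
embed = nn.Embedding(vocab_size, dim)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True),
    num_layers=2)
output_layer = nn.Linear(dim, vocab_size)
tokens = torch.randint(2, vocab_size, (8, 12))    # toy batch; id 1 = [MASK]
loss = masked_lm_loss(encoder, embed, output_layer, tokens, mask_id=1)
loss.backward()
```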

Neural Nets for NLP 2019: Sentence and Contextualized Word Representations

Graham Neubig