Chapters:
1. Introduction
2. Neural Networks
3. Goals
4. Multi-task Learning
5. Level of Variety
6. Multitasking
7. Related Tasks
8. Multi-task Learning
9. Pre-training
10. Pre-training Methods
11. Sentence Representations
12. Sentence Pair Classification
13. Sentence Pair Classification Examples
14. Semantic Similarity/Relatedness
15. Textual Entailment
16. Methods
17. Autoencoder
18. Skip-thought Vectors
19. Paraphrase-based Contrastive Learning
20. Large-scale Paraphrasing
21. Multi-task Entailment
22. Supervised Training
23. Sentence Transformers
24. Context Effect
25. Masking
Description:
Explore advanced natural language processing techniques in this comprehensive lecture on pre-training methods. Delve into multi-task learning concepts, sentence embeddings, BERT and its variants, and alternative language modeling objectives. Gain insights into sentence representations, semantic similarity, textual entailment, and various pre-training approaches, including autoencoders, skip-thought vectors, and paraphrase-based contrastive learning. Examine the impact of context and masking in language models, and understand the applications of these techniques in real-world NLP tasks.

CMU Advanced NLP: Pre-training Methods

Graham Neubig
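
The description above mentions masking in language models. As a loose illustration only (not code from the lecture itself), the minimal sketch below shows BERT-style masked-language-model input corruption with the commonly used 80/10/10 replacement rule; the `mask_tokens` helper and the toy vocabulary are hypothetical names introduced here.

```python
import random

MASK_TOKEN = "[MASK]"
TOY_VOCAB = ["the", "cat", "dog", "sat", "ran", "on", "mat"]  # hypothetical toy vocabulary

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """BERT-style input corruption: each position is selected with probability
    mask_prob; a selected token becomes [MASK] 80% of the time, a random
    vocabulary token 10% of the time, and stays unchanged 10% of the time.
    Returns the corrupted sequence plus (position, original token) targets."""
    rng = random.Random(seed)
    corrupted, targets = list(tokens), []
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            targets.append((i, tok))  # the model is trained to predict the original token here
            roll = rng.random()
            if roll < 0.8:
                corrupted[i] = MASK_TOKEN
            elif roll < 0.9:
                corrupted[i] = rng.choice(TOY_VOCAB)
            # otherwise keep the original token, so the model also sees uncorrupted inputs
    return corrupted, targets

print(mask_tokens("the cat sat on the mat".split(), mask_prob=0.3))
```

In an actual pre-training setup, the corrupted sequence would be fed to a Transformer encoder whose output at each target position is trained to recover the original token.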