1. Introduction
2. NLP Tasks
3. Modeling Long Sequences
4. Separate Encoding
5. Self-attention Transformers
6. Transformer-XL
7. Compressive Transformers
8. Sparse Transformers
9. Adaptive Span Transformers
10. Sparse Span Transformers
11. Reformer Model
12. Low-Rank Approximation
13. Sparse Attention
14. Evaluation
15. Other Methods
16. Questions
17. Components of Coreference Models
18. Mention Pair Models
19. Model
Description:
Explore advanced techniques for modeling long sequences in natural language processing in this comprehensive lecture from CMU's Advanced NLP course. Delve into extracting features from long texts and tackling document-level processing tasks. Learn about transformer architectures for long contexts, including Transformer-XL, Compressive Transformers, and Sparse Transformers. Examine adaptive span and sparse span approaches, as well as the Reformer model. Investigate low-rank approximation and sparse attention methods. Gain insights into evaluation techniques and other relevant methodologies. Conclude with an overview of coreference models, including mention pair models and their components.
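
To illustrate the sparse-attention idea the lecture covers, below is a minimal sketch of causal sliding-window (local) attention, one common sparse pattern. The function names and the dense-mask formulation are assumptions made for clarity, not the course's reference code; practical implementations avoid materializing the full attention matrix precisely so that cost scales with the window size rather than the squared sequence length.

```python
import torch

def sliding_window_mask(seq_len: int, window: int) -> torch.Tensor:
    # mask[i, j] is True iff position i may attend to position j:
    # causal (j <= i) and within the local window (i - j <= window)
    idx = torch.arange(seq_len)
    rel = idx.unsqueeze(1) - idx.unsqueeze(0)  # rel[i, j] = i - j
    return (rel >= 0) & (rel <= window)

def local_attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor,
                    window: int) -> torch.Tensor:
    # q, k, v: (seq_len, d_model); single head, no batching, for clarity
    scores = q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5
    mask = sliding_window_mask(q.shape[0], window)
    scores = scores.masked_fill(~mask, float("-inf"))
    return torch.softmax(scores, dim=-1) @ v

# Toy usage: 8 tokens, 4-dim states; each token attends to itself
# and at most 2 predecessors.
q = k = v = torch.randn(8, 4)
out = local_attention(q, k, v, window=2)
print(out.shape)  # torch.Size([8, 4])
```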

Advanced NLP 2022: Modeling Long Sequences

Graham Neubig