1. Some NLP Tasks we've Handled
2. Some Connections to Tasks over Documents
3. Document Level Language Modeling
4. Remember: Modeling using Recurrent Networks
5. Separate Encoding for Coarse-grained Document Context (Mikolov & Zweig 2012)
6. What Context to Incorporate?
7. Self-attention/Transformers Across Sentences
8. Document Problems: Entity Coreference
9. Mention (Noun Phrase) Detection
10. Components of a Coreference Model
11. Coreference Models: Instances
12. Mention Pair Models
13. Entity Models: Entity-Mention Models
14. Advantages of Neural Network Models for Coreference
15. Coreference Resolution w/ Entity-Level Distributed Representations
16. Deep Reinforcement Learning for Mention-Ranking Coreference Models
17. End-to-End Neural Coreference (Span Model)
18. End-to-End Neural Coreference (Coreference Model)
19. Using Coreference in Neural Models
20. Discourse Parsing w/ Attention-based Hierarchical Neural Networks
21. Uses of Discourse Structure
Description:
Learn about document-level natural language processing in this lecture from CMU's Neural Networks for NLP course. Explore document-level language modeling techniques, including recurrent networks and self-attention mechanisms. Dive into entity coreference resolution, covering mention detection, mention pair models, and entity-level distributed representations. Examine discourse parsing using attention-based hierarchical neural networks. Gain insights into applying these advanced NLP concepts to tasks spanning multiple sentences and entire documents.
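One of the document-context techniques mentioned above, self-attention across sentence boundaries, can be illustrated with a minimal sketch: concatenate the token embeddings of several sentences and let every token attend to the whole document. This is a simplified, hypothetical example (single head, no learned projections), not code from the lecture.

```python
import numpy as np

def self_attention(X):
    """Single-head scaled dot-product self-attention over a token sequence.

    X: (n_tokens, d) embedding matrix. For simplicity, queries, keys,
    and values all use X directly, with no learned projection matrices.
    """
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)                      # token-token similarities
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)      # row-wise softmax
    return weights @ X                                 # context-mixed representations

# Document-level use: concatenate tokens from several sentences so that
# attention can cross sentence boundaries (illustrative random embeddings).
rng = np.random.default_rng(0)
sent1 = rng.normal(size=(5, 16))   # 5 tokens, 16-dim embeddings
sent2 = rng.normal(size=(7, 16))   # 7 tokens from the next sentence
doc = np.concatenate([sent1, sent2], axis=0)
out = self_attention(doc)
print(out.shape)   # (12, 16): each token now mixes in context from both sentences
```

In a real Transformer each head would apply learned query/key/value projections and positional information; the point here is only that nothing in the attention computation itself stops at a sentence boundary.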

Neural Nets for NLP 2020 - Document Level Models

Graham Neubig