1. Intro
2. NLP and Sequential Data
3. Long-distance Dependencies in Language
4. Parameter Tying
5. What Can RNNs Do?
6. e.g. Language Modeling
7. Representing Sentences
8. Representing Contexts
9. Recurrent Neural Networks in DyNet
10. Parameter Initialization
11. Sentence Initialization
12. A Solution: Long Short-term Memory (Hochreiter and Schmidhuber 1997)
13. Other Alternatives
14. Handling Mini-batching
15. Mini-batching Method
16. Handling Long Sequences
17. Example: LM - Sentence Classifier
18. LSTM Structure
Description:
Explore recurrent neural networks in this lecture from CMU's Neural Networks for NLP course. Dive into the fundamentals of recurrent networks and see how LSTMs address challenges such as vanishing gradients. Analyze the strengths and weaknesses of recurrence in sentence modeling and discover pre-training techniques for RNNs. Access accompanying slides and code examples to reinforce your understanding of key concepts, including parameter tying, language modeling, sentence representation, mini-batching, and handling long sequences.
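
Since the lecture covers recurrent networks in DyNet, here is a minimal sketch of the kind of code it discusses: an LSTM reads a sentence one word at a time and its final hidden state is fed to a classifier. This is not the course's own example; the vocabulary size, dimensions, the classify helper, and the toy input are hypothetical, and it assumes the DyNet Python API (dy.LSTMBuilder, initial_state, add_input, output).

    import dynet as dy

    # Hypothetical sizes; not taken from the lecture.
    VOCAB_SIZE, EMB_DIM, HID_DIM, N_CLASSES = 10000, 64, 128, 2

    model = dy.ParameterCollection()
    embeds = model.add_lookup_parameters((VOCAB_SIZE, EMB_DIM))  # word embeddings
    lstm = dy.LSTMBuilder(1, EMB_DIM, HID_DIM, model)            # 1-layer LSTM
    W = model.add_parameters((N_CLASSES, HID_DIM))               # classifier weights
    b = model.add_parameters((N_CLASSES,))                       # classifier bias

    def classify(sentence_ids):
        # Score a sentence (a list of word ids) from the LSTM's final state.
        dy.renew_cg()                      # fresh computation graph per example
        state = lstm.initial_state()
        for wid in sentence_ids:           # read the sentence left to right
            state = state.add_input(embeds[wid])
        h = state.output()                 # final hidden state as sentence vector
        return dy.parameter(W) * h + dy.parameter(b)

    scores = classify([4, 27, 953])        # toy word ids; real ones come from a vocabulary

Note that the same LSTM parameters are reused at every time step (the parameter tying the outline refers to), so the model handles sentences of any length.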

Neural Nets for NLP 2017 - Recurrent Neural Networks

Graham Neubig