1. Intro
2. A First Try: Bag of Words (BOW)
3. Continuous Bag of Words (CBOW)
4. What do Our Vectors Represent?
5. Bag of n-grams
6. Why Bag of n-grams?
7. 2-dimensional Convolutional Networks
8. CNNs for Sentence Modeling
9. Standard conv2d Function
10. Padding
11. Striding
12. Pooling
    • Pooling is like convolution, but calculates some reduction function feature-wise
    • Max pooling: "Did you see this feature anywhere in the range?" (most common)
    • Average pooling: "How prevalent…"
13. Stacked Convolution
14. Dilated Convolution (e.g. Kalchbrenner et al. 2016). Gradually increase the stride every time step (no reduction in length).
15. Iterated Dilated Convolution (Strubell+ 2017). Multiple iterations of the same stack of dilated convolutions.
16. Non-linear Functions
17. Which Non-linearity Should I Use?
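The pooling item above contrasts max and average pooling as feature-wise reductions over a range of positions. A minimal NumPy sketch (the sentence matrix `H` and its values are made up for illustration):

```python
import numpy as np

# Hypothetical sentence representation: 5 time steps, 4 features per step.
H = np.array([
    [0.1, 0.9, 0.0, 0.2],
    [0.4, 0.1, 0.7, 0.3],
    [0.2, 0.5, 0.1, 0.8],
    [0.6, 0.2, 0.3, 0.1],
    [0.0, 0.4, 0.9, 0.5],
])

# Max pooling: for each feature, "did you see it anywhere in the range?"
max_pooled = H.max(axis=0)   # shape (4,)

# Average pooling: how prevalent is each feature over the range?
avg_pooled = H.mean(axis=0)  # shape (4,)

print(max_pooled)  # [0.6 0.9 0.9 0.8]
print(avg_pooled)
```

Both reduce over the time axis only, so the output has one value per feature regardless of sentence length.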
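The dilated-convolution item describes stacking layers whose dilation grows each layer while zero-padding keeps the sequence length fixed. A toy 1-D sketch, assuming a width-3 filter and dilations doubling per layer (the helper `dilated_conv1d` and its inputs are illustrative, not from the lecture):

```python
import numpy as np

def dilated_conv1d(x, w, dilation):
    """1-D convolution with the given dilation, zero-padded so the
    output length equals the input length (no reduction in length)."""
    k = len(w)
    pad = dilation * (k - 1) // 2
    xp = np.pad(x, pad)
    return np.array([
        sum(w[j] * xp[i + j * dilation] for j in range(k))
        for i in range(len(x))
    ])

x = np.arange(8, dtype=float)  # toy "sentence" of 8 positions
w = np.array([1.0, 1.0, 1.0])  # width-3 filter

# Stack layers, doubling the dilation each time (1, 2, 4): the
# receptive field grows exponentially while the length stays 8.
h = x
for d in (1, 2, 4):
    h = dilated_conv1d(h, w, d)
print(len(h))  # still 8
```

Because each layer widens the gaps between taps instead of striding, a few layers cover a long context without ever shortening the sentence.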
Description:
Explore convolutional neural networks for natural language processing in this comprehensive lecture from CMU's Neural Networks for NLP course. Dive into bag of words and n-grams concepts before examining various applications of convolution in language modeling. Learn about context windows, sentence modeling, and structured convolution techniques. Discover the power of stacked and dilated convolutions in processing linguistic data. Investigate convolutional models for analyzing sentence pairs. Gain insights into pooling strategies, non-linear activation functions, and their impact on language processing tasks. Enhance your understanding of advanced CNN architectures tailored for NLP applications through detailed explanations and practical examples.

Neural Nets for NLP 2019 - Convolutional Neural Networks for Language

Graham Neubig