An Example Prediction Problem: Sentence Classification
A First Try: Bag of Words (BOW)
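As a companion to the bag-of-words item above, here is a minimal sketch of a BOW sentence classifier: each word contributes an independent score vector for the labels, and word order is ignored. The toy vocabulary, label set, and random weights are illustrative only, not taken from the lecture.

```python
import numpy as np

# Toy vocabulary and label set (illustrative, not from the lecture)
vocab = {"i": 0, "hate": 1, "love": 2, "this": 3, "movie": 4}
num_labels = 2  # e.g. negative / positive

# One weight vector per word: each word votes independently for each label
rng = np.random.default_rng(0)
W = rng.normal(size=(len(vocab), num_labels))
b = np.zeros(num_labels)

def bow_scores(sentence):
    """Sum the per-word score vectors; word order is ignored."""
    scores = b.copy()
    for word in sentence.split():
        scores += W[vocab[word]]
    return scores

print(bow_scores("i love this movie"))
```

Because the model only sums per-word scores, any permutation of the same words gets the same prediction, which is exactly the weakness later slides address.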
Continuous Bag of Words (CBOW)
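The CBOW model replaces per-word score vectors with dense embeddings: look up one embedding per word, sum them into a single continuous vector, then score labels with a linear layer. The sizes and random parameters in this sketch are illustrative assumptions.

```python
import numpy as np

# CBOW sketch: embed each word, sum the embeddings, score with one
# linear layer. Vocabulary and dimensions are illustrative.
vocab = {"i": 0, "hate": 1, "love": 2, "this": 3, "movie": 4}
emb_dim, num_labels = 8, 2
rng = np.random.default_rng(1)
E = rng.normal(size=(len(vocab), emb_dim))   # word embeddings
W = rng.normal(size=(emb_dim, num_labels))   # output layer
b = np.zeros(num_labels)

def cbow_scores(sentence):
    ids = [vocab[w] for w in sentence.split()]
    h = E[ids].sum(axis=0)  # continuous "bag": sum of embeddings
    return h @ W + b

print(cbow_scores("i love this movie"))
```

Like plain BOW, this is still order-invariant; the gain is that similar words can share structure through their embeddings.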
What do Our Vectors Represent?
Why Bag of n-grams?
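Bags of n-grams recover some local word order by treating short contiguous spans as features. A minimal extractor (the helper name `ngrams` is my own):

```python
def ngrams(tokens, n):
    """Return all contiguous n-grams of a token list as tuples."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

toks = "i do not love this movie".split()
print(ngrams(toks, 2))  # bigrams such as ('not', 'love')
```

A bigram like `('not', 'love')` lets a linear model capture negation that unigram BOW cannot, at the cost of a feature space that grows sharply with n.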
What Problems w/ Bag of n-grams?
Time Delay Neural Networks (Waibel et al. 1989)
Convolutional Networks (LeCun et al. 1997)
Standard conv2d Function
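For text, the standard 2D convolution reduces to a 1D convolution over the time axis of the sentence's embedding matrix: each width-w window of word vectors is matched against a filter bank. This numpy sketch (the function name, shapes, and random inputs are my own illustrative choices) computes a narrow convolution:

```python
import numpy as np

def conv1d(X, F):
    """Narrow 1D convolution over time.
    X: (seq_len, emb_dim) embedding matrix for one sentence.
    F: (width, emb_dim, out_channels) filter bank.
    Returns a (seq_len - width + 1, out_channels) feature map."""
    width = F.shape[0]
    T = X.shape[0] - width + 1
    out = np.empty((T, F.shape[2]))
    for t in range(T):
        # contract each window against all filters at once
        out[t] = np.tensordot(X[t:t + width], F, axes=([0, 1], [0, 1]))
    return out

rng = np.random.default_rng(2)
X = rng.normal(size=(7, 8))      # 7 words, 8-dim embeddings
F = rng.normal(size=(3, 8, 16))  # width-3 filters, 16 output channels
H = conv1d(X, F)
print(H.shape)  # (5, 16)
```

A pooling step (e.g. max over the time dimension of `H`) would then yield a fixed-size sentence vector regardless of sentence length.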
Stacked Convolution
Dilated Convolution (e.g. Kalchbrenner et al. 2016)
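Dilated convolutions (as in ByteNet-style stacks, Kalchbrenner et al. 2016) space the filter taps `dilation` steps apart, so stacking layers with doubling dilations grows the receptive field exponentially while each layer stays cheap. A toy scalar-sequence sketch (all names and sizes are my own) demonstrates this with an impulse input:

```python
import numpy as np

def dilated_conv1d(x, w, dilation):
    """Causal 1D dilated convolution on a scalar sequence.
    x: (seq_len,) inputs, w: (width,) filter; taps are `dilation` apart."""
    width = len(w)
    span = (width - 1) * dilation
    out = np.zeros_like(x)
    for t in range(span, len(x)):
        out[t] = x[t - span : t + 1 : dilation] @ w
    return out

x = np.zeros(16)
x[8] = 1.0          # impulse: see which outputs it reaches
h = x
for d in (1, 2, 4):  # doubling dilations across three width-2 layers
    h = dilated_conv1d(h, np.ones(2), d)
print(np.nonzero(h)[0])  # 8 positions: receptive field is 2**3
```

Three width-2 layers with dilations 1, 2, 4 give each output a view of 8 inputs; with plain (dilation-1) convolutions the same depth would cover only 4.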
An Aside: Nonlinear Functions
• Proper choice of a non-linear function is essential in stacked networks
Why (Dilated) Convolution for Modeling Sentences?
• In contrast to recurrent neural networks (next class)
Example: Dependency Structure
Why Model Sentence Pairs?
Siamese Network (Bromley et al. 1993)
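The defining trait of a Siamese network is weight sharing: the *same* encoder embeds both sentences, and a distance between the two vectors scores the pair. In this sketch a simple embedding-average encoder stands in for the convolutional encoder discussed in the lecture; the vocabulary and random embeddings are illustrative assumptions.

```python
import numpy as np

# Siamese sketch: one shared encoder for both inputs, then a
# similarity between the two encodings.
vocab = {"i": 0, "love": 1, "like": 2, "this": 3, "movie": 4, "film": 5}
rng = np.random.default_rng(3)
E = rng.normal(size=(len(vocab), 8))

def encode(sentence):
    """Toy shared encoder: average of word embeddings.
    (A real Siamese model would use the same CNN for both sides.)"""
    ids = [vocab[w] for w in sentence.split()]
    return E[ids].mean(axis=0)

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

a = encode("i love this movie")
b = encode("i like this film")
print(cosine(a, b))
```

Because the encoder is shared, paraphrases land near each other in the same space, which is what makes the pairwise distance meaningful for tasks like paraphrase detection.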
Description:
Explore convolutional networks for text processing in this comprehensive lecture from CMU's Neural Networks for NLP course. Dive into bag-of-words models, n-grams, and convolution techniques. Learn about context windows, sentence modeling, and advanced concepts like stacked and dilated convolutions. Discover structured convolution methods and approaches for modeling sentence pairs. Gain insights into visualizing convolutional neural networks for text analysis. Access accompanying slides and code examples to reinforce your understanding of these powerful techniques in natural language processing.
Neural Nets for NLP 2017 - Convolutional Networks for Text