1. Intro
2. Outline
3. An Example Prediction Problem: Sentiment Classification
4. Continuous Bag of Words (CBOW)
5. Deep CBOW
6. Why Bag of n-grams?
7. What Problems with Bag of n-grams?
8. Neural Sequence Models
9. Definition of Convolution
10. Intuitive Understanding
11. Priors Entailed by CNNs
12. Concept: 2D Convolution
13. Concept: Stride
14. Concept: Padding
15. Three Types of Convolutions
16. Concept: Multiple Filters
17. Concept: Pooling
18. Overview of the Architecture (see the sketch after this outline)
19. Embedding Layer
20. Convolutional Layer
21. Pooling Layer
22. Output Layer
23. Dynamic Filter CNN (e.g. De Brabandere et al. 2016)
24. CNN Applications
25. NLP (Almost) from Scratch (Collobert et al. 2011)
26. CNN-RNN-CRF for Tagging (Ma et al. 2016): a classic framework and de-facto standard for sequence labeling
27. Why Structured Convolution?
28. Understand the design philosophy of a model
29. Structural Bias
30. What biases does each component entail?
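
The generic architecture covered in items 18-22 (embedding layer, convolutional layer, pooling layer, output layer) can be summarized in a short sketch. The following is a minimal illustration in PyTorch, not the course's own code; the class name and all sizes (vocab_size, emb_dim, num_filters, window_size, num_classes) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class TextCNN(nn.Module):
    """Sketch of the embedding -> convolution -> pooling -> output pipeline."""
    def __init__(self, vocab_size=10000, emb_dim=128, num_filters=100,
                 window_size=3, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        # 1D convolution over time: embedding dimensions act as input
        # channels, and a window of `window_size` words slides over the
        # sentence; padding keeps the output the same length as the input.
        self.conv = nn.Conv1d(emb_dim, num_filters, kernel_size=window_size,
                              padding=window_size // 2)
        self.output = nn.Linear(num_filters, num_classes)

    def forward(self, token_ids):            # token_ids: (batch, seq_len)
        x = self.embedding(token_ids)        # (batch, seq_len, emb_dim)
        x = x.transpose(1, 2)                # (batch, emb_dim, seq_len)
        x = torch.relu(self.conv(x))         # (batch, num_filters, seq_len)
        x, _ = x.max(dim=2)                  # max-pooling over time
        return self.output(x)                # (batch, num_classes)

model = TextCNN()
logits = model(torch.randint(0, 10000, (4, 20)))  # 4 sentences of 20 tokens
print(logits.shape)  # torch.Size([4, 2])
```

Max-pooling over time is what makes the model length-independent: each filter reports only its strongest match anywhere in the sentence.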
Description:
Explore convolutional neural networks for text processing in this comprehensive lecture from CMU's Neural Networks for NLP course. Delve into bag-of-words models, convolution applications for context windows and sentence modeling, advanced techniques like stacked and dilated convolutions, and structured convolution. Examine various applications of convolutional models in natural language processing and consider the inductive biases they introduce. Learn about sentiment classification, continuous bag of words, deep CBOW, and neural sequence models. Understand key concepts such as 2D convolution, stride, padding, multiple filters, and pooling. Discover the architecture of CNN models for NLP tasks, including embedding layers, convolutional layers, pooling layers, and output layers. Investigate specific applications like dynamic filter CNNs, NLP from scratch, and CNN-RNN-CRF models for tagging. Gain insights into the design philosophy and structural biases of convolutional models for text processing.
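
For contrast with the convolutional sketch above, the CBOW and deep CBOW baselines mentioned in the description can be sketched as follows. This is a hedged illustration under the usual reading of these models (sum word embeddings, then score, with deep CBOW adding nonlinear layers); names and sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class DeepCBOW(nn.Module):
    """Sketch of (deep) CBOW for sentence classification."""
    def __init__(self, vocab_size=10000, emb_dim=64, hidden_dim=64,
                 num_classes=2, deep=True):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        layers = []
        if deep:
            # Deep CBOW: nonlinear layers on top of the bag of words.
            layers += [nn.Linear(emb_dim, hidden_dim), nn.Tanh(),
                       nn.Linear(hidden_dim, hidden_dim), nn.Tanh(),
                       nn.Linear(hidden_dim, num_classes)]
        else:
            # Plain CBOW: a single linear scoring of the summed embeddings.
            layers.append(nn.Linear(emb_dim, num_classes))
        self.scorer = nn.Sequential(*layers)

    def forward(self, token_ids):                   # (batch, seq_len)
        bag = self.embedding(token_ids).sum(dim=1)  # order-insensitive sum
        return self.scorer(bag)                     # (batch, num_classes)

model = DeepCBOW()
scores = model(torch.randint(0, 10000, (4, 20)))  # 4 sentences of 20 tokens
```

Because the sum discards word order, CBOW cannot distinguish "not good" from "good not"; that limitation is exactly what motivates the n-gram and convolutional models in the outline.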

Neural Nets for NLP 2020 - Convolutional Neural Networks for Text

Graham Neubig