1. Intro
2. Problems in Speech and Natural Language
3. The First (Statistical) Revolution
4. The Second (Neural) Revolution
5. A Personal View: the Parsing Problem
6. Kernel Methods
7. Word Embeddings
8. Natural Language Syntax, and the Parsing Problem
9. Shift Actions
10. Predicting Actions
11. The Natural Questions Data
12. Transformers (continued)
13. Multi-Head Transformers
14. This Talk: Three Problems, Three Architectures
Description:
Explore the evolution and challenges of neural models in speech and language processing through this insightful lecture by Michael Collins from Google Research and Columbia University. Delve into the statistical and neural revolutions in natural language processing, examining key concepts such as kernel methods, word embeddings, and parsing problems. Learn about innovative architectures like Transformers and Multi-Head Transformers, and their applications in solving complex language tasks. Gain a comprehensive understanding of three significant problems in the field and the corresponding architectures designed to address them.

Successes and Challenges in Neural Models for Speech and Language - Michael Collins

Institute for Advanced Study