1. Intro
2. What do we want to know about words?
3. A Manual Attempt: WordNet
4. An Answer (?): Word Embeddings!
5. Word Embeddings are Cool! (An Obligatory Slide)
6. How to Train Word Embeddings?
7. Distributional Representations (see Goldberg 10.4.1)
8. Count-based Methods
9. Word Embeddings from Language Models
10. Context Window Methods
11. Skip-gram (Mikolov et al. 2013) • Predict each word in the context given the word (a minimal sketch follows this outline)
12. Count-based and Prediction-based Methods
13. GloVe (Pennington et al. 2014)
14. What Contexts?
15. Types of Evaluation
16. Extrinsic Evaluation: Using Word Embeddings in Systems
17. Intrinsic Evaluation of Embeddings (categorization from Schnabel et al. 2015)
18. Limitations of Embeddings
19. Sub-word Embeddings (1)
20. Multi-prototype Embeddings • Simple idea: words with multiple meanings should have different embeddings (Reisinger and Mooney 2010; a minimal sketch follows the description)
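
The skip-gram entry (item 11) condenses the whole objective into one line: predict each context word from the center word. For study purposes, here is a minimal NumPy sketch of that objective. The toy corpus, embedding size, window, learning rate, and the full-softmax update are all illustrative assumptions, not details taken from the lecture slides.

```python
# Minimal skip-gram sketch: maximize log p(context word | center word).
# All hyperparameters here (dim=16, window=2, lr=0.05) are assumed for the demo.
import numpy as np

rng = np.random.default_rng(0)
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D, window, lr = len(vocab), 16, 2, 0.05

W_in = rng.normal(0, 0.1, (V, D))   # center-word ("input") embeddings
W_out = rng.normal(0, 0.1, (V, D))  # context-word ("output") embeddings

def softmax(z):
    z = z - z.max()                 # numerical stability
    e = np.exp(z)
    return e / e.sum()

for epoch in range(200):
    for pos, word in enumerate(corpus):
        c = idx[word]
        for off in range(-window, window + 1):
            if off == 0 or not (0 <= pos + off < len(corpus)):
                continue
            o = idx[corpus[pos + off]]
            h = W_in[c].copy()                    # pre-update center vector
            p = softmax(W_out @ h)                # p(context | center)
            grad = p.copy()
            grad[o] -= 1.0                        # dL/dlogits for -log p(o)
            W_in[c] -= lr * (W_out.T @ grad)      # update center embedding
            W_out -= lr * np.outer(grad, h)       # update output matrix

# Nearest neighbors by cosine similarity, in the spirit of the intrinsic
# evaluation discussed in items 15-17.
def neighbors(w, k=3):
    v = W_in[idx[w]]
    sims = (W_in @ v) / (np.linalg.norm(W_in, axis=1) * np.linalg.norm(v))
    return [vocab[i] for i in np.argsort(-sims)[1:k + 1]]

print(neighbors("cat"))
```

On this toy corpus, a few hundred epochs tend to pull words that share contexts (e.g. "cat" and "dog") toward each other; the actual word2vec implementation replaces the full softmax with negative sampling for efficiency.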
Description:
Explore word vectors in natural language processing through this comprehensive lecture from CMU's Neural Networks for NLP course. Dive into techniques for describing words by their context, including counting and prediction methods like skip-grams and CBOW. Learn how to evaluate and visualize word vectors, and discover advanced methods for creating more nuanced word representations. Examine the limitations of traditional embeddings and explore solutions like sub-word and multi-prototype embeddings. Gain insights into both intrinsic and extrinsic evaluation methods for word embeddings, and understand their practical applications in NLP systems.
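
The description's closing point about multi-prototype embeddings (item 20) can also be made concrete: represent each occurrence of an ambiguous word by a bag-of-words vector of its context, cluster those occurrences, and keep one vector per cluster. The sketch below is a loose illustration of the idea from Reisinger and Mooney (2010) under assumed simplifications (toy corpus, K=2 clusters, plain k-means over raw context counts), not the paper's exact pipeline.

```python
# Multi-prototype sketch: one embedding per sense-like cluster of contexts.
# Corpus, window=2, and K=2 are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
corpus = ("bank of the river bank near the river "
          "bank lends money the bank holds money").split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, window, K = len(vocab), 2, 2

def context_vector(pos):
    """Bag-of-words count vector of the words around position pos."""
    v = np.zeros(V)
    for off in range(-window, window + 1):
        if off != 0 and 0 <= pos + off < len(corpus):
            v[idx[corpus[pos + off]]] += 1.0
    return v

# Collect every context in which "bank" occurs.
contexts = np.array([context_vector(p)
                     for p, w in enumerate(corpus) if w == "bank"])

# Tiny k-means: each resulting centroid acts as one "prototype" for "bank".
centroids = contexts[rng.choice(len(contexts), K, replace=False)]
for _ in range(20):
    dists = ((contexts[:, None] - centroids) ** 2).sum(-1)
    labels = np.argmin(dists, axis=1)
    for k in range(K):
        if (labels == k).any():
            centroids[k] = contexts[labels == k].mean(axis=0)

print("prototype assignment per 'bank' occurrence:", labels)
```

The river-side and money-side occurrences of "bank" land in different clusters, so the word ends up with one vector per sense rather than a single averaged embedding, which is exactly the limitation of standard embeddings that item 18 raises.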

Neural Nets for NLP 2019 - Word Vectors

Graham Neubig