1. Introduction
2. Evaluation
3. Language Models
4. Featurized Models
5. Example
6. Converting Scores to Probabilities
7. Computation Graph
8. Lookup
9. Loss Function
10. Parameter Update
11. Unknown Words
12. Vocabulary
13. Unfair Advantage
14. Problems of Previous Model
15. Neural Language Models
16. Code
17. Input/Output Embedding
18. Training Tricks
19. Learning Rate Decay
Description:
Explore a comprehensive lecture on neural networks for natural language processing, focusing on predicting the next word in a sentence. Topics include describing words by their context, counting versus prediction techniques, the skip-gram and Continuous Bag of Words (CBOW) models, and methods for evaluating and visualizing word vectors. The lecture also covers advanced techniques for word vector creation and provides practical insights through accompanying slides and code examples. Part of CMU's CS 11-747 course, it offers a deep dive into the fundamentals of word embeddings and their applications in NLP tasks.

Neural Nets for NLP 2017 - A Simple Exercise - Predicting the Next Word in a Sentence

Graham Neubig