Neural Nets for NLP 2017 - Conditioned Generation
Graham Neubig

Outline:
1. Intro
2. Language Models: language models are generative models of text
3. Conditioned Language Models
4. Conditional Language Models
5. One Type of Conditional Language Model (Sutskever et al., 2014)
6. The Generation Problem
7. Ancestral Sampling (see the decoding sketch after this list)
8. Greedy Search (see the decoding sketch after this list)
9. Ensembling: combine predictions from multiple models (sketch below)
10. Log-linear Interpolation: weighted combination of log probabilities, then normalize (sketch below)
11. Linear or Log-Linear?
12. Parameter Averaging (sketch below)
13. Stacking
14. Basic Evaluation Paradigm
15. Human Evaluation
16. Perplexity (sketch below)
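
Items 7 and 8 differ only in how the next token is chosen from the model's distribution. Below is a minimal Python sketch, assuming a toy next_token_dist function standing in for P(y_t | y_<t, X); a real conditional model would compute this distribution from the decoder state and the encoded input X.

    import random

    EOS = "</s>"

    def next_token_dist(prefix):
        # Hypothetical stand-in for P(y_t | y_<t, X): a fixed toy
        # distribution, independent of the prefix for simplicity.
        return {"a": 0.5, "b": 0.3, EOS: 0.2}

    def ancestral_sample(max_len=10):
        # Ancestral sampling: draw y_t ~ P(y_t | y_<t, X) left to right,
        # stopping once the end-of-sentence symbol is drawn.
        out = []
        for _ in range(max_len):
            dist = next_token_dist(out)
            tokens, probs = zip(*dist.items())
            tok = random.choices(tokens, weights=probs, k=1)[0]
            if tok == EOS:
                break
            out.append(tok)
        return out

    def greedy_search(max_len=10):
        # Greedy search: take the single most probable token at each step.
        # Deterministic, but not guaranteed to find the best full sentence.
        out = []
        for _ in range(max_len):
            dist = next_token_dist(out)
            tok = max(dist, key=dist.get)
            if tok == EOS:
                break
            out.append(tok)
        return out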
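
Items 9-11 contrast two ways of combining the predictions of several models over the same vocabulary. A sketch, assuming each model's prediction is given as a dict of nonzero probabilities: linear interpolation averages probabilities directly (a token needs only one model to like it), while log-linear interpolation averages log probabilities and renormalizes (every model must assign the token some probability).

    import math

    def linear_interp(dists, weights):
        # P(y) = sum_m w_m * P_m(y). With weights summing to one,
        # the result is already a normalized distribution.
        return {y: sum(w * d[y] for w, d in zip(weights, dists))
                for y in dists[0]}

    def log_linear_interp(dists, weights):
        # score(y) = exp(sum_m w_m * log P_m(y)), then normalize.
        # Requires every P_m(y) > 0, or math.log raises an error.
        scores = {y: math.exp(sum(w * math.log(d[y])
                                  for w, d in zip(weights, dists)))
                  for y in dists[0]}
        z = sum(scores.values())
        return {y: s / z for y, s in scores.items()}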
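
Parameter averaging (item 12) captures some of the benefit of ensembling without the decode-time cost: rather than running several models, average their parameters (typically checkpoints from one training run) and decode with the single resulting model. A sketch with plain lists standing in for parameter tensors:

    def average_parameters(checkpoints):
        # Element-wise mean over parameter vectors from several checkpoints.
        # Decoding with the averaged model costs the same as one model.
        n = len(checkpoints)
        return [sum(values) / n for values in zip(*checkpoints)]

    # e.g. three checkpoints of a two-parameter model:
    # average_parameters([[1.0, 2.0], [3.0, 4.0], [2.0, 0.0]]) -> [2.0, 2.0]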
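
Perplexity (item 16) is the exponentiated average negative log-likelihood per token, PPL = exp(-(1/N) * sum_t log P(y_t | y_<t, X)); lower is better, and a uniform guess over k choices scores exactly k. A minimal sketch:

    import math

    def perplexity(token_log_probs):
        # exp of the average negative log-likelihood per token.
        n = len(token_log_probs)
        return math.exp(-sum(token_log_probs) / n)

    # A model assigning probability 0.25 to each of four tokens is as
    # "surprised" as a uniform 4-way guess:
    # perplexity([math.log(0.25)] * 4) -> 4.0 (up to float rounding)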
Description:
Explore a comprehensive lecture on conditioned generation in neural networks for natural language processing. Delve into encoder-decoder models, conditional generation techniques, and search algorithms. Learn about ensembling methods, evaluation strategies, and various types of data used for conditioning. Access accompanying slides and code examples for hands-on learning. Gain insights into language models, the generation problem, and evaluation paradigms, including human evaluation and perplexity. Part of CMU's Neural Networks for NLP course, this lecture provides essential knowledge for understanding and implementing advanced NLP techniques.
