1. Language Models: generative models of text
2. Conditioned Language Models
3. Conditional Language Models
4. One Type of Conditional Language Model (Sutskever et al. 2014)
5. How to Pass Hidden State?
6. The Generation Problem
7. Ancestral Sampling
8. Greedy Search
9. Beam Search
10. Log-linear Interpolation: weighted combination of log probabilities, normalized
11. Linear or Log-linear?
12. Stacking
13. Basic Evaluation Paradigm
14. Human Evaluation
15. Perplexity
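The outline names greedy and beam search as decoding strategies. A minimal sketch of the beam-search idea over a toy next-token table (the `PROBS` distribution and function names are illustrative assumptions, not from the lecture; a real conditional LM would score tokens given the full source and target prefix):

```python
import math

# Toy next-token distribution: probability of the next token given only the
# previous token (purely illustrative).
PROBS = {
    "<s>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "dog": 0.3, "</s>": 0.2},
    "a":   {"cat": 0.3, "dog": 0.3, "</s>": 0.4},
    "cat": {"</s>": 1.0},
    "dog": {"</s>": 1.0},
}

def beam_search(beam_size=2, max_len=5):
    """Keep the beam_size highest-scoring partial hypotheses at each step.

    Scores are sums of log probabilities; beam_size=1 reduces to greedy search.
    """
    beams = [(0.0, ["<s>"])]  # (log-probability, token sequence)
    finished = []
    for _ in range(max_len):
        # Expand every live hypothesis by every possible next token.
        candidates = []
        for score, seq in beams:
            for tok, p in PROBS[seq[-1]].items():
                candidates.append((score + math.log(p), seq + [tok]))
        # Prune to the top beam_size candidates.
        candidates.sort(key=lambda c: c[0], reverse=True)
        beams = []
        for score, seq in candidates[:beam_size]:
            if seq[-1] == "</s>":
                finished.append((score, seq))
            else:
                beams.append((score, seq))
        if not beams:
            break
    return max(finished, key=lambda f: f[0])

print(beam_search(beam_size=2))  # highest-scoring finished hypothesis
```

With `beam_size=1` the same loop performs greedy search, committing to the single best token at each step; larger beams trade compute for a wider search of the hypothesis space.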
Description:
Explore conditioned generation in neural networks for natural language processing through this comprehensive lecture from CMU's CS 11-747 course. Delve into encoder-decoder models, conditional generation techniques, and search algorithms. Examine ensembling methods, evaluation metrics, and various types of data used for conditioning. Learn about language models, including conditional and generative variants, and their applications. Understand the generation problem, sampling methods, and search strategies like greedy and beam search. Investigate log-linear interpolation, stacking, and evaluation paradigms including human evaluation and perplexity. Gain insights into the intricacies of passing hidden states and the differences between linear and log-linear approaches in neural network architectures for NLP tasks.
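The description closes with perplexity as an evaluation metric. As a rough sketch of how it is computed (the function name is an illustrative assumption): perplexity is the exponentiated average negative log-likelihood per token, so a model that assigns uniform probability 1/k to every token has perplexity k.

```python
import math

def perplexity(token_log_probs):
    """Exp of the average negative log-likelihood per token.

    token_log_probs: natural-log probabilities the model assigned to each
    token in the evaluation text. Lower perplexity means the model assigns
    higher probability to the text.
    """
    n = len(token_log_probs)
    return math.exp(-sum(token_log_probs) / n)

# A model assigning probability 0.25 to each of 4 tokens has perplexity 4:
print(perplexity([math.log(0.25)] * 4))
```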

Neural Nets for NLP 2020: Conditioned Generation

Graham Neubig