Prediction-based Pruning Methods (e.g. Stern et al. 2017)
Backtracking-based Pruning Methods
What beam size should we use?
Variable-length output sequences: in many tasks (e.g. MT), the output sequences will be of variable length (see the beam-search sketch after this outline)
More complicated normalization (Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation; Wu et al. 2016)
Predict the output length (Eriguchi et al. 2016)
Why do Bigger Beams Hurt, pt. 2
Dealing with disparity in actions (Effective Inference for Generative Neural Parsing; Mitchell Stern et al., 2017)
Solution
Improving Diversity in top N Choices
Improving Diversity through Sampling
Sampling without Replacement (cont'd; see the Gumbel-top-k sketch after this outline)
Monte Carlo Tree Search (Human-like Natural Language Generation Using Monte Carlo Tree Search)
More beam search in training (A Continuous Relaxation of Beam Search for End-to-end Training of Neural Sequence Models; Goyal et al., 2017)
Adaptation to neural networks: CCG Parsing
Is the heuristic admissible? (Lee et al. 2016)
Estimating future costs (Li et al., 2017)
Actor Critic (Bahdanau et al., 2017)
Actor Critic (continued)
A* search: benefits and drawbacks
Particle Filters (Buys et al., 2015)
Reranking (Dyer et al. 2016)
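
The items on beam size, variable-length outputs, and normalization all revolve around one decoding loop. The sketch below is a minimal, framework-free illustration, not code from the lecture: the next_token_log_probs function stands in for one decoder step, and the GNMT-style length penalty ((5 + |Y|) / 6) ** alpha with alpha = 0.6 is the normalization the Wu et al. 2016 paper uses.

    import heapq
    import math

    def length_penalty(length, alpha=0.6):
        # GNMT-style penalty: ((5 + |Y|) / 6) ** alpha
        return ((5.0 + length) / 6.0) ** alpha

    def beam_search(next_token_log_probs, bos, eos, beam_size=5, max_len=50, alpha=0.6):
        # Each hypothesis is (cumulative log-probability, token list).
        beam = [(0.0, [bos])]
        finished = []
        for _ in range(max_len):
            candidates = []
            for logp, prefix in beam:
                for token, token_logp in next_token_log_probs(prefix):
                    candidates.append((logp + token_logp, prefix + [token]))
            # Prune to the top `beam_size` expansions by raw log-probability.
            beam = heapq.nlargest(beam_size, candidates, key=lambda c: c[0])
            # Hypotheses ending in EOS are complete; the rest stay on the beam.
            finished += [c for c in beam if c[1][-1] == eos]
            beam = [c for c in beam if c[1][-1] != eos]
            if not beam:
                break
        # Rank completed hypotheses by length-normalized score so that
        # shorter outputs are not automatically preferred over longer ones.
        finished = finished or beam
        return max(finished, key=lambda c: c[0] / length_penalty(len(c[1]), alpha))

    # Toy usage: a "model" that always predicts the same next-token distribution.
    step = lambda prefix: [("a", math.log(0.5)), ("b", math.log(0.3)), ("</s>", math.log(0.2))]
    print(beam_search(step, bos="<s>", eos="</s>", beam_size=3, max_len=10))

Without the length penalty the highest raw log-probability is always a short hypothesis, which is one reason larger beams can hurt; normalization rescores completed hypotheses on a comparable footing.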
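For the items on improving diversity through sampling and sampling without replacement, one standard construction is the Gumbel-top-k trick: perturb each log-probability with independent Gumbel(0, 1) noise and keep the k largest, which gives k distinct draws without replacement. This NumPy sketch is an illustrative version under that assumption, not the lecture's code.

    import numpy as np

    def gumbel_topk(log_probs, k, rng=None):
        # Indices of the k largest Gumbel-perturbed log-probabilities form a
        # sample of size k drawn without replacement from the distribution.
        rng = np.random.default_rng() if rng is None else rng
        perturbed = np.asarray(log_probs) + rng.gumbel(size=len(log_probs))
        return np.argsort(perturbed)[::-1][:k]

    # Example: draw three distinct next-token choices from a 5-word distribution.
    log_probs = np.log([0.4, 0.3, 0.15, 0.1, 0.05])
    print(gumbel_topk(log_probs, k=3))

Because every returned index is distinct, the top-N list cannot collapse onto near-duplicates of the single most probable continuation, which is the diversity problem the outline points at.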
Description:
Explore advanced search algorithms for natural language processing in this lecture from CMU's Neural Networks for NLP course. Dive into beam search, A* search, and search with future costs. Learn about pruning methods, variable length output sequences, and techniques for improving diversity in top choices. Examine the challenges of larger beam sizes and strategies to address them. Discover applications in machine translation, generative parsing, and CCG parsing. Investigate methods like Monte Carlo Tree Search, actor-critic models, and particle filters for enhancing search performance. Gain insights into reranking techniques and the benefits and drawbacks of various search algorithms in NLP tasks.
Neural Nets for NLP 2019 - Advanced Search Algorithms