Variation: Many Languages, One Parser (Ammar et al., TACL 2016)
Stack LSTM MALOPA
Tiny Target Treebank: Results
Zero Target Treebank: Results
Variation: Add Semantics (Swayamdipta et al., CoNLL 2016)
Linguistic Structure Example: Semantic Dependencies
Variation: RNN Grammars (Dyer et al., NAACL 2016; Kuncoro et al., EACL 2017)
Another Linguistic Structure Example: Phrase-Structure Tree
Better Dependency Parsers?
Tree & String Generation with a Stack
Additional Details
Conclusions
Description:
Explore continuous state machines and grammars for linguistic structure prediction in this lecture from the Simons Institute. Delve into dependency parsing, recurrent neural networks, and stack LSTM parsers. Learn about global and greedy parsing paradigms, token and tree representations, and multilingual parsing approaches. Examine variations like adding semantics to parsers and using RNN grammars. Investigate phrase-structure trees and techniques for tree and string generation. Gain insights into cutting-edge research in natural language processing and computational linguistics through practical examples and experimental results presented by Noah Smith from the University of Washington.
Continuous State Machines and Grammars for Linguistic Structure Prediction
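The description mentions greedy, transition-based dependency parsing, the setting in which stack LSTM parsers operate. The sketch below (hypothetical function and names, not the lecture's implementation) illustrates the underlying arc-standard transition system: SHIFT moves a word from the buffer to the stack, while LEFT-ARC and RIGHT-ARC attach the top two stack items; in a stack LSTM parser the action sequence would be predicted by the model rather than hand-picked as here.

```python
# Minimal sketch of arc-standard transition-based dependency parsing.
# The action sequence is hand-picked for illustration; a stack LSTM
# parser would choose each action greedily from the current state.

def arc_standard(words, actions):
    """Apply SHIFT / LEFT-ARC / RIGHT-ARC actions; return (head, dependent) arcs."""
    stack, buffer, arcs = [], list(range(len(words))), []
    for act in actions:
        if act == "SHIFT":
            # Move the next buffer word onto the stack.
            stack.append(buffer.pop(0))
        elif act == "LEFT-ARC":
            # Second-from-top becomes a dependent of the top item.
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif act == "RIGHT-ARC":
            # Top becomes a dependent of the second-from-top item.
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

words = ["flights", "from", "Seattle"]
actions = ["SHIFT", "SHIFT", "SHIFT", "RIGHT-ARC", "RIGHT-ARC"]
# After three shifts the stack holds all three words; the two RIGHT-ARCs
# attach "Seattle" under "from" and "from" under "flights".
print(arc_standard(words, actions))  # [(1, 2), (0, 1)]
```

One word remains on the stack at the end (the root of the tree); a full parser would also predict arc labels and stop when the buffer is empty and the stack holds only the root.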