Problem: Embeddings May Not Be Indicative of Syntax
Normalizing Flow (Rezende and Mohamed 2015)
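As a quick reference for this section, the planar flow from Rezende and Mohamed (2015) transforms a sample z with f(z) = z + u·tanh(w·z + b) and tracks the change-of-variables log-determinant. The sketch below is a minimal pure-Python illustration of that one transform; the parameter values in the usage line are arbitrary, not trained.

```python
import math

def planar_flow(z, u, w, b):
    """One planar-flow step f(z) = z + u * tanh(w.z + b).

    Returns the transformed point and the log|det Jacobian| correction
    log|1 + u.psi(z)|, where psi(z) = (1 - tanh(w.z + b)^2) * w.
    """
    a = sum(wi * zi for wi, zi in zip(w, z)) + b      # w.z + b
    h = math.tanh(a)
    f = [zi + ui * h for zi, ui in zip(z, u)]
    psi = [(1.0 - h * h) * wi for wi in w]            # h'(a) * w
    log_det = math.log(abs(1.0 + sum(ui * pi for ui, pi in zip(u, psi))))
    return f, log_det

# Arbitrary illustrative parameters: transform one 2-D point.
f, log_det = planar_flow([0.5, -1.0], u=[0.1, 0.2], w=[1.0, 0.0], b=0.0)
```

Stacking several such steps gives a flexible posterior while keeping the total log-density correction as a sum of per-step log-determinants.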
Cross-lingual Application of Unsupervised Models (He et al. 2019)
Soft vs. Hard Tree Structure
One Other Paradigm: Weak Supervision
Gated Convolution (Cho et al. 2014)
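The gated recursive convolution of Cho et al. (2014) builds each parent node as a gated mixture of a fresh candidate and its two children, with the gates produced by a 3-way softmax. The toy sketch below illustrates one such node; the weight matrices W and G are hypothetical stand-ins, not the paper's trained parameters.

```python
import math

def gated_combine(h_left, h_right, W, G):
    """One node of a gated recursive convolution (in the spirit of Cho et al. 2014).

    A tanh candidate is computed from the concatenated children, then a
    3-way softmax gate mixes candidate, left child, and right child.
    W (d x 2d) and G (3 x 2d) are illustrative weights.
    """
    x = h_left + h_right                               # concatenated children
    cand = [math.tanh(sum(wij * xj for wij, xj in zip(row, x))) for row in W]
    scores = [sum(gij * xj for gij, xj in zip(row, x)) for row in G]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    wc, wl, wr = (e / z for e in exps)                 # gates sum to 1
    return [wc * c + wl * l + wr * r for c, l, r in zip(cand, h_left, h_right)]

# Illustrative call: combine two 2-D children with zero (untrained) weights.
out = gated_combine([3.0, 0.0], [0.0, 3.0],
                    W=[[0.0] * 4, [0.0] * 4],
                    G=[[0.0] * 4] * 3)
```

Because the gates are a proper softmax, the network can softly choose, per node, whether to copy a child or compose the pair, which is one way latent tree-like structure emerges without tree supervision.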
Learning with RL (Yogatama et al. 2016)
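When structure decisions are discrete (e.g. the shift/reduce choices in Yogatama et al. 2016), gradients are typically estimated with the score-function (REINFORCE) trick: weight the gradient of the log-probability of the sampled action by the reward minus a baseline. A minimal sketch of that estimator for a single decision, with hypothetical toy inputs:

```python
import math

def reinforce_grad(logits, action, reward, baseline=0.0):
    """REINFORCE gradient estimate w.r.t. the logits of one sampled action.

    Uses d log softmax(logits)[action] / d logit_i = 1[i == action] - p_i,
    scaled by (reward - baseline). Toy illustration, not the paper's code.
    """
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    z = sum(exps)
    probs = [e / z for e in exps]
    return [(reward - baseline) * ((1.0 if i == action else 0.0) - p)
            for i, p in enumerate(probs)]

# Two equally likely actions, action 0 sampled, reward 1.0.
g = reinforce_grad([0.0, 0.0], action=0, reward=1.0)
```

The baseline does not change the estimator's expectation but can sharply reduce its variance, which is one of the practical difficulties this lecture discusses for learning latent structure with RL.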
Difficulties in Learning Latent Structure
Description:
Explore a comprehensive lecture on unsupervised and semi-supervised learning of structure in neural networks for natural language processing. Delve into the distinctions between learning features and learning structure, examine various semi-supervised and unsupervised learning methods, and understand key design decisions for unsupervised models. Gain insights into practical examples of unsupervised learning, including cross-lingual applications and the challenges of learning latent structure. Learn about advanced concepts such as normalizing flow, gated convolution, and reinforcement learning approaches in the context of NLP. Discover how to leverage weak supervision and navigate the complexities of soft vs. hard tree structures in neural network architectures.
Neural Nets for NLP 2020 - Unsupervised and Semi-supervised Learning of Structure