Explore effective transfer learning techniques for Natural Language Processing in this 44-minute conference talk from ODSC East 2018. Dive into the challenges and opportunities of applying transfer learning to NLP tasks, comparing its success in computer vision to its limited gains in language processing. Learn about innovative approaches using sequence representations instead of fixed-length document vectors, and discover how these methods can improve performance on real-world tasks with limited training data. Gain insights into parameter and data-efficient mechanisms for transfer learning, and get introduced to Enso, an open-source library for benchmarking transfer learning methods. Understand practical recommendations for implementing transfer learning in NLP, including the importance of quality embeddings, source tasks, and feature engineering. Examine the workflow, visualization, and documentation aspects of machine learning research, and explore cutting-edge concepts like deep contextualized word representations.
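To make the sequence-representation point concrete: collapsing a document into one fixed-length vector (for example, by mean-pooling word vectors) discards word order, while a sequence representation keeps one vector per token for a downstream model to use. The toy embedding table below is purely illustrative and is not taken from the talk or the Enso library.

```python
# Hypothetical two-word embedding table, for illustration only.
TOY_EMBEDDINGS = {
    "not": [1.0, 0.0],
    "good": [0.0, 1.0],
}

def sequence_representation(tokens):
    """One vector per token; word order is preserved."""
    return [TOY_EMBEDDINGS[t] for t in tokens]

def fixed_length_vector(tokens):
    """Mean-pool token vectors into a single document vector."""
    seq = sequence_representation(tokens)
    dim = len(seq[0])
    return [sum(vec[d] for vec in seq) / len(seq) for d in range(dim)]

# "not good" and "good not" pool to the identical document vector,
# but their sequence representations differ -- exactly the information
# a transfer-learned downstream model can exploit.
a, b = ["not", "good"], ["good", "not"]
print(fixed_length_vector(a) == fixed_length_vector(b))          # True
print(sequence_representation(a) == sequence_representation(b))  # False
```

The contrast shows why methods that transfer full token-level representations can outperform those built on a single document embedding when training data is limited.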
Effective Transfer Learning for NLP - Madison May - ODSC East 2018