1. Introduction
2. Training Data Requirements
3. Representation Learning
4. Word2Vec
5. Simple Vector Arithmetic
6. Deep Learning
7. Deep Learning Problems
8. Small Data Problems
9. Transfer Learning
10. Transfer Learning Diagram
11. Practical Recommendations
12. Quality of Embeddings
13. Source Tasks
14. Logistic Regression
15. Second Order Optimization
16. Measuring Variance
17. Class Balance
18. Feature Engineering
19. Enso
20. Workflow
21. Visualization
22. Documentation
23. Machine Learning Research
24. Good Papers
25. Deep contextualized word representations
26. Source Model
27. Average Representations
28. Data Problems
29. Questions
Description:
Explore effective transfer learning techniques for Natural Language Processing in this 44-minute conference talk from ODSC East 2018. Dive into the challenges and opportunities of applying transfer learning to NLP tasks, comparing its success in computer vision to its limited gains in language processing. Learn about innovative approaches using sequence representations instead of fixed-length document vectors, and discover how these methods can improve performance on real-world tasks with limited training data. Gain insights into parameter and data-efficient mechanisms for transfer learning, and get introduced to Enso, an open-source library for benchmarking transfer learning methods. Understand practical recommendations for implementing transfer learning in NLP, including the importance of quality embeddings, source tasks, and feature engineering. Examine the workflow, visualization, and documentation aspects of machine learning research, and explore cutting-edge concepts like deep contextualized word representations.
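The distinction the description draws between fixed-length document vectors and sequence representations can be sketched with toy word embeddings. This is an illustrative sketch only, not code from the talk: the four-dimensional vectors and the `average_representation`/`sequence_representation` helpers are made up for demonstration (real embeddings such as word2vec or ELMo have hundreds of dimensions).

```python
import numpy as np

# Toy 4-dimensional "embeddings", for illustration only.
embeddings = {
    "transfer": np.array([0.2, 0.1, 0.7, 0.0]),
    "learning": np.array([0.1, 0.8, 0.3, 0.1]),
    "for":      np.array([0.0, 0.0, 0.1, 0.9]),
    "nlp":      np.array([0.9, 0.2, 0.4, 0.1]),
}

def average_representation(tokens):
    """Collapse a variable-length token sequence into one fixed-length
    document vector by averaging its word embeddings."""
    return np.mean([embeddings[t] for t in tokens], axis=0)

def sequence_representation(tokens):
    """Keep one vector per token (shape: len(tokens) x dim), preserving
    the order information the average throws away."""
    return np.stack([embeddings[t] for t in tokens])

doc = ["transfer", "learning", "for", "nlp"]
print(average_representation(doc).shape)   # (4,)  -- fixed length
print(sequence_representation(doc).shape)  # (4, 4) -- grows with the document
```

The averaged vector has the same size for any document, which makes it easy to feed into a simple classifier such as logistic regression, but any permutation of the tokens produces the identical vector; the sequence representation keeps per-token detail at the cost of variable length.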

Effective Transfer Learning for NLP - Madison May - ODSC East 2018

Open Data Science