Chapters:
1. Introduction
2. The beginning of HuggingFace
3. GPT
4. Why Transfer Learning in NLP
5. Open Source Models
6. BERT in Python
7. PyTorch Pretrained BERT
8. Transformers
9. Modern Architecture
10. Datasets
11. Community Event
12. Data Hub
13. Open Science: BigScience
14. Scaling
15. Reproduction
16. Problems
17. Supercomputers
18. BigScience Papers
19. What's next
20. Questions
Description:
Discover the origins of HuggingFace and the Transformers library in this insightful 43-minute talk by Thomas Wolf, co-founder and Chief Science Officer of HuggingFace. Explore the concept of transfer learning in Natural Language Processing (NLP) and why it matters to the field. Learn about open-source models, modern architectures, and the importance of datasets in AI development. Gain insights into community-driven initiatives, the Data Hub, and the BigScience open-science project. Delve into scaling challenges, reproducibility issues, and the role of supercomputers in AI research. Understand the impact of the BigScience papers and get a glimpse of future developments in the field. Engage with thought-provoking questions and answers to deepen your understanding of HuggingFace's contributions to the AI and NLP landscape.
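
The talk's central theme, transfer learning with pretrained models, can be illustrated with a minimal sketch using the Transformers library. The checkpoint name and example sentence below are illustrative assumptions, not taken from the talk:

```python
# Minimal transfer-learning sketch with the HuggingFace Transformers library.
# "bert-base-uncased" and the example text are illustrative choices.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Reuse a pretrained BERT encoder and attach a fresh 2-class head:
# the encoder weights are transferred; only the head starts untrained.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Forward pass on one sentence; real use would fine-tune on labeled data first.
inputs = tokenizer("Transfer learning works for NLP too.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.shape)  # torch.Size([1, 2])
```

Fine-tuning then updates these weights on a small labeled set, the transfer-learning recipe the talk traces from GPT and BERT through to the library itself.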

Transfer Learning in NLP and the Birth of the Transformers Library

HuggingFace