Chapters:
1. Intro
2. Semantic Web and Other Uses
3. Why GPL?
4. How GPL Works
5. Query Generation
6. CORD-19 Dataset and Download
7. Query Generation Code
8. Query Generation is Not Perfect
9. Negative Mining
10. Negative Mining Implementation
11. Negative Mining Code
12. Pseudo-Labeling
13. Pseudo-Labeling Code
14. Importance of Pseudo-Labeling
15. Margin MSE Loss
16. MarginMSE Fine-tune Code
17. Choosing Number of Steps
18. Fast Evaluation
19. What's Next for Sentence Transformers?
Description:
Dive deep into Generative Pseudo-Labeling (GPL) and its potential impact on sentence transformers in this comprehensive video tutorial. Explore the challenges of training sentence transformers and how GPL offers a promising solution for fine-tuning high-performance bi-encoder models using unlabeled text data. Learn about the core concepts of GPL, including query generation, negative mining, and pseudo-labeling, with practical code examples using the CORD-19 dataset. Discover the importance of these techniques in building intelligent language models capable of understanding and responding to natural language queries. Gain insights into the implementation of GPL, including the use of Margin MSE Loss and fine-tuning strategies. Conclude with a discussion on the future of sentence transformers and the potential applications of GPL across various industries.
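The Margin MSE objective mentioned in the description can be sketched in plain Python. This is a minimal illustration of the idea, not the sentence-transformers implementation: the student (bi-encoder) is trained to reproduce the teacher's (cross-encoder's) score margin between a positive and a negative passage; all scores below are toy values.

```python
def margin_mse_loss(teacher_pos, teacher_neg, student_pos, student_neg):
    """Mean squared error between teacher and student score margins.

    Each argument is a list of relevance scores for (query, passage)
    pairs: teacher_* come from the cross-encoder used for
    pseudo-labeling, student_* from the bi-encoder being fine-tuned.
    """
    losses = [
        ((tp - tn) - (sp - sn)) ** 2
        for tp, tn, sp, sn in zip(teacher_pos, teacher_neg,
                                  student_pos, student_neg)
    ]
    return sum(losses) / len(losses)

# Toy example: two training pairs with made-up scores.
# Teacher margins: 4.0 and 2.0; student margins: 3.0 and 0.5.
loss = margin_mse_loss([5.0, 4.0], [1.0, 2.0], [4.5, 3.0], [1.5, 2.5])
print(loss)  # → 1.625
```

Because the loss depends only on the *difference* between positive and negative scores, the student is free to use its own score scale, which is part of why pseudo-labels from a cross-encoder transfer well to a bi-encoder.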

Is GPL the Future of Sentence Transformers - Generative Pseudo-Labeling Deep Dive

James Briggs