1. Intro
2. NLI Training Data
3. Preprocessing
4. SBERT Finetuning Visuals
5. MNR Loss Visual
6. MNR in PyTorch
7. MNR in Sentence Transformers
8. Results
9. Outro
Description:
Explore the process of fine-tuning high-performance sentence transformers using Multiple Negatives Ranking (MNR) loss in this 37-minute video tutorial. Learn about the evolution of transformer-produced sentence embeddings, from BERT cross-encoders to SBERT and beyond, and see how MNR loss has enabled newer models to quickly outperform their predecessors. The tutorial walks through implementing MNR loss for fine-tuning sentence transformers in two ways: a detailed, from-scratch approach in PyTorch and a simplified method using the sentence-transformers library. It also covers NLI training data, preprocessing techniques, and visual explanations of SBERT fine-tuning and MNR loss, before comparing results to show the impact of this technique on sentence embedding quality.
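The core idea behind MNR loss, as covered in the tutorial, is that each (anchor, positive) sentence pair in a batch treats every other positive as an in-batch negative; the model is trained to rank the true positive highest. Below is a minimal sketch of how this can be written in plain PyTorch. The function name, the `scale` value, and the use of cosine similarity are illustrative assumptions, not taken verbatim from the video:

```python
import torch
import torch.nn.functional as F

def mnr_loss(anchors: torch.Tensor, positives: torch.Tensor,
             scale: float = 20.0) -> torch.Tensor:
    """Multiple Negatives Ranking loss (illustrative sketch).

    anchors, positives: (batch, dim) sentence embeddings, where
    positives[i] is the true match for anchors[i]. Every other
    positives[j] in the batch acts as an in-batch negative.
    """
    # Cosine similarity between every anchor and every positive
    a = F.normalize(anchors, dim=-1)
    p = F.normalize(positives, dim=-1)
    scores = a @ p.T * scale  # (batch, batch), scaled similarities

    # The matching positive for anchor i sits on the diagonal,
    # so the "correct class" for row i is index i.
    labels = torch.arange(scores.size(0), device=scores.device)
    return F.cross_entropy(scores, labels)
```

Because the negatives come for free from the rest of the batch, larger batch sizes give harder ranking problems and typically better embeddings, which is one reason MNR works so well in practice.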

Fine-Tune High Performance Sentence Transformers With Multiple Negatives Ranking

James Briggs