1. Why Language Embedding Matters
2. Supervised Methods
3. Natural Language Inference
4. Semantic Textual Similarity
5. Multilingual Training
6. TSDAE Unsupervised
7. Data Preparation
8. Initialize Model
9. Model Training
10. NLTK Error
11. Evaluation
12. TSDAE vs Supervised Methods
13. Why TSDAE is Cool
Description:
Explore the world of unsupervised sentence transformers in this comprehensive 44-minute video tutorial. Dive into the challenges of adapting pretrained transformers to produce meaningful sentence vectors, especially in domains and languages with limited labeled data. Learn about the Transformer-based Sequential Denoising Auto-Encoder (TSDAE) as an alternative to supervised fine-tuning methods. Discover the process of data preparation, model initialization, training, and evaluation for TSDAE. Compare the effectiveness of TSDAE with supervised methods and understand its advantages when labeled data is scarce. Gain insights into why language embedding matters, supervised techniques such as Natural Language Inference and Semantic Textual Similarity, and the potential of multilingual training.
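The data-preparation step the tutorial covers centers on corrupting each input sentence so the model must reconstruct it from a pooled sentence vector; in the TSDAE paper the best-performing noise is random token deletion at a ratio of about 0.6. As a minimal sketch of that noising step (the function name and deletion ratio here are illustrative, not the video's exact code):

```python
import random

def delete_noise(tokens, del_ratio=0.6):
    """Damage a tokenized sentence by randomly deleting ~del_ratio of its
    tokens, preserving order and keeping at least one token so the
    denoising auto-encoder always has some input to reconstruct from."""
    kept = [t for t in tokens if random.random() > del_ratio]
    # Guard against deleting everything: fall back to one random token.
    return kept if kept else [random.choice(tokens)]

sentence = "unsupervised training needs no labeled data at all".split()
noisy = delete_noise(sentence)  # e.g. a short, order-preserving subset
```

In the sentence-transformers library this corruption is handled for you by `DenoisingAutoEncoderDataset`, paired with `DenoisingAutoEncoderLoss` during training, so in practice you rarely write the noise function yourself.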

Today Unsupervised Sentence Transformers, Tomorrow Skynet - How TSDAE Works

James Briggs