Outline:
1. Introduction
2. Knowledge Distillation
3. Soft and Hard Labels
4. Self-Training and Distillation / Noisy Student
5. Algorithm: Noisy Student Training
6. Noise Effect
7. Pseudo Labels
8. Architecture
9. Experimental Results
10. Robustness Results
Description:
Explore an innovative approach to improving ImageNet classification through self-training with Noisy Student in this 22-minute Launchpad video. Delve into key concepts such as knowledge distillation, soft and hard labels, and the interplay between self-training and distillation. Examine the Noisy Student training algorithm, understand the effects of noise, and learn about pseudo labels. Discover the architecture behind this method and analyze experimental results, including its impact on robustness. Gain valuable insights into this cutting-edge technique for enhancing image classification performance.
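Since the video leans on knowledge distillation with soft and hard labels, a minimal sketch of a blended distillation loss may help. This is an illustrative implementation, not the video's code: the temperature `T` and mixing weight `alpha` are assumed hyperparameters, and the function names are hypothetical.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T produces a softer
    # (more uniform) distribution over classes.
    z = [l / T for l in logits]
    m = max(z)  # subtract the max for numerical stability
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def distillation_loss(student_logits, teacher_logits, hard_label,
                      T=2.0, alpha=0.5):
    # Soft-label term: cross-entropy between the teacher's and the
    # student's temperature-softened distributions.
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    soft_loss = -sum(pt * math.log(ps)
                     for pt, ps in zip(p_teacher, p_student))
    # Hard-label term: ordinary cross-entropy with the one-hot label.
    hard_loss = -math.log(softmax(student_logits)[hard_label])
    # Blend the two; T**2 conventionally rescales the soft-target term.
    return alpha * (T ** 2) * soft_loss + (1 - alpha) * hard_loss
```

With `alpha=1.0` the student learns purely from the teacher's soft labels; with `alpha=0.0` it reduces to standard supervised training on hard labels.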

Self-Training With Noisy Student Improves ImageNet Classification
