1 – Welcome to class
2 – Training of an EBM
3 – Contrastive vs. regularised / architectural methods
4 – General margin loss
5 – List of loss functions
6 – Generalised additive margin loss
7 – Joint embedding architectures
8 – Wav2Vec 2.0
9 – XLSR: multilingual speech recognition
10 – Generative adversarial networks (GANs)
11 – Mode collapse
12 – Non-contrastive methods
13 – BYOL: bootstrap your own latent
14 – SwAV
15 – Barlow twins
16 – SEER
17 – Latent variable models in practice
18 – DETR
19 – Structured prediction
20 – Factor graphs
21 – Viterbi algorithm: whiteboard time
22 – Graph transformer networks
23 – Graph composition, transducers
24 – Final remarks
Description:
Explore an in-depth lecture on latent variable Energy-Based Models (EBMs) for structured prediction, delivered by renowned AI researcher Yann LeCun. Dive into various topics including training EBMs, contrastive and regularized methods, general margin loss, joint embedding architectures, and generative adversarial networks (GANs). Learn about cutting-edge techniques such as Wav2Vec 2.0, XLSR for multilingual speech recognition, and non-contrastive methods like BYOL and SwAV. Discover practical applications of latent variable models, including DETR for object detection. Gain insights into structured prediction, factor graphs, and the Viterbi algorithm through detailed explanations and whiteboard demonstrations. Conclude with an exploration of graph transformer networks and their compositions, providing a comprehensive overview of advanced machine learning concepts and techniques.
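The Viterbi chapter applies dynamic programming to find the lowest-energy path through a sequence of states, the inference step in EBM structured prediction. A minimal sketch of that idea (an illustration only, not the lecture's whiteboard derivation; the dictionary-based energy tables here are hypothetical):

```python
def viterbi(unary, transition):
    """Find the minimum-energy state sequence.

    unary: list of dicts, one per time step, mapping state -> energy.
    transition: dict mapping (prev_state, cur_state) -> energy.
    Returns (best_path, total_energy).
    """
    # Initialise each state with its first-step unary energy.
    best = {s: (e, [s]) for s, e in unary[0].items()}
    for step in unary[1:]:
        new_best = {}
        for s, e in step.items():
            # Pick the predecessor that minimises accumulated energy.
            prev, (prev_energy, path) = min(
                best.items(),
                key=lambda kv: kv[1][0] + transition[(kv[0], s)],
            )
            new_best[s] = (prev_energy + transition[(prev, s)] + e, path + [s])
        best = new_best
    # Lowest total energy over final states gives the Viterbi path.
    _, (energy, path) = min(best.items(), key=lambda kv: kv[1][0])
    return path, energy
```

The same minimisation appears in the graph transformer network chapters, where paths through composed graphs are scored by summed arc energies.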

Latent Variable EBMs for Structured Prediction

Alfredo Canziani