1. Introduction
2. Outline
3. Divergence measures
4. The critic function
5. Variational form
6. Lower bound
7. Topological properties
8. Disadvantages of KL
9. Generalized energy-based models
10. The generator
11. Generalised energy-based models
12. Generalised likelihood
13. Graphical example
14. Energy function
15. Sampling
16. Realistic
17. Neural net divergence
18. How close is Q to P
19. Will I hit P
20. Smoothness properties
21. Jumping from mode to mode
22. I was happy to see it go from mode to mode
23. Risk of memorization
24. Generalized energy-based model
25. Generalized likelihood
26. Multimodality
27. Smooth functions
28. The kernel beer
29. Mark
Description:
Explore the critic function of implicit generative models in this comprehensive seminar on Theoretical Machine Learning. Delve into divergence measures, variational forms, and topological properties as Arthur Gretton from University College London discusses generalized energy-based models and their applications. Examine the advantages and disadvantages of various approaches, including neural net divergence and generalized likelihood. Gain insights into smoothness properties, multimodality, and the challenges of jumping between modes in generative models. Understand the risks of memorization and the importance of realistic sampling in machine learning applications.

On the Critic Function of Implicit Generative Models - Arthur Gretton

Institute for Advanced Study