1. Introduction
2. What is a generative model
3. Diffusion process
4. Continuous-time neural nets
5. Questions
6. Coming up
7. Exact Sampling
8. Parabolic PDEs
9. Optimal Control
10. Schrödinger Bridge Problem
11. Variational Inference
12. Nonparametric Sampling
13. Proof
14. Empirical Process Techniques
Description:
Explore the diffusion limit of deep generative models in this comprehensive lecture. Delve into the unified perspective on sampling and variational inference through stochastic control. Learn how to quantify the expressiveness of diffusion-based generative models and discover efficient sampling techniques for a wide class of terminal target distributions. Examine the proof of sampling accuracy using Kullback-Leibler divergence and investigate an unbiased, finite-variance simulation scheme implementable as a deep generative model with a random number of layers. Cover topics such as continuous-time neural nets, parabolic PDEs, optimal control, the Schrödinger Bridge Problem, and nonparametric sampling, while gaining insights into empirical process techniques.
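
To make the connection between deep generative networks and diffusions concrete, below is a minimal illustrative sketch (not the lecture's exact construction) of Euler–Maruyama simulation of a neural SDE dX_t = b_θ(X_t, t) dt + σ dW_t, where each discretization step acts like one residual layer of a deep generative model. The drift network, dimensions, and constants are hypothetical choices made only for the demo.

```python
# Minimal sketch: a neural SDE sampler via Euler--Maruyama.
# Each time step is a noise-perturbed residual update, so refining the step size
# recovers the diffusion limit of a deep generative model with many layers.
# All shapes and parameters below are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

DIM = 2          # state dimension (hypothetical)
HIDDEN = 64      # width of the drift network (hypothetical)
SIGMA = 0.5      # constant diffusion coefficient (hypothetical)

# Randomly initialized two-layer drift network b_theta(x, t).
W1 = rng.normal(scale=0.3, size=(HIDDEN, DIM + 1))
W2 = rng.normal(scale=0.3, size=(DIM, HIDDEN))

def drift(x, t):
    """Drift b_theta(x, t): a small MLP taking the state and the time as input."""
    inp = np.concatenate([x, [t]])
    return W2 @ np.tanh(W1 @ inp)

def sample(n_steps=100, T=1.0):
    """Euler--Maruyama simulation: one residual 'layer' per discretization step."""
    dt = T / n_steps
    x = rng.normal(size=DIM)                     # initial latent draw X_0 ~ N(0, I)
    for k in range(n_steps):
        t = k * dt
        dw = rng.normal(scale=np.sqrt(dt), size=DIM)
        x = x + drift(x, t) * dt + SIGMA * dw    # drift step plus diffusion noise
    return x                                     # approximate draw from the terminal law

samples = np.array([sample() for _ in range(5)])
print(samples)
```

Taking n_steps large corresponds to a deep network in the diffusion limit; the lecture's simulation scheme with a random number of layers is a refinement of this basic idea.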

Neural SDEs - Deep Generative Models in the Diffusion Limit - Maxim Raginsky

Institute for Advanced Study