Generative Modeling by Estimating Gradients of the Data Distribution - Stefano Ermon

Implicit Generative Models: implicit models directly represent the sampling process
Representation of Probability Distributions
Learning Deep Energy-Based Models using Scores
Learning with Sliced Score Matching
Experiments: Scalability and Speed
Experiments: Fitting Deep Kernel Exponential Families
From Score Estimation to Sample Generation
Pitfall 1: Manifold Hypothesis
Pitfall 2: Inaccurate Score Estimation in Low Data-Density Regions
Data Modes
Gaussian Perturbation
Annealed Langevin Dynamics
Joint Score Estimation
Experiments: Sampling
Description:
Explore a comprehensive seminar on generative modeling focused on estimating gradients of the data distribution. Delve into implicit generative models, deep energy-based models, and score estimation methods presented by Stanford University's Stefano Ermon at the Institute for Advanced Study. Learn about progress in generative models for text, the representation of probability distributions, and scalable learning using sliced score matching. Discover potential pitfalls in sample generation, including the manifold hypothesis and inaccurate score estimation in low data-density regions, and examine the proposed remedies: Gaussian perturbation of the data and annealed Langevin dynamics. Gain insights into joint score estimation across noise levels and experiments demonstrating the effectiveness of these approaches in various sampling scenarios.
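The "Learning with Sliced Score Matching" section refers to the sliced score matching objective (Song et al., 2019), which trains a score network without computing the intractable trace of the score's Jacobian by contracting it with random projection vectors instead. Below is a minimal PyTorch sketch, not the talk's implementation; the function name `sliced_score_matching_loss` and the network `score_net` (assumed to map a batch `x` to score estimates of the same shape) are hypothetical:

```python
import torch

def sliced_score_matching_loss(score_net, x, n_projections=1):
    # Monte Carlo estimate of E_v[ v^T J_s(x) v + 0.5 * (v^T s(x))^2 ],
    # where s = score_net and J_s is its Jacobian with respect to x.
    x = x.detach().requires_grad_(True)
    total = 0.0
    for _ in range(n_projections):
        v = torch.randn_like(x)                    # random projection direction
        s = score_net(x)                           # s_theta(x), same shape as x
        sv = (s * v).flatten(1).sum(dim=1)         # v^T s(x), one scalar per example
        # Gradient of v^T s(x) wrt x, contracted with v, gives v^T J_s v.
        grad_sv = torch.autograd.grad(sv.sum(), x, create_graph=True)[0]
        vjv = (grad_sv * v).flatten(1).sum(dim=1)
        total = total + (vjv + 0.5 * sv ** 2).mean()
    return total / n_projections
```

Each random projection turns the Jacobian trace into a cheap vector-Jacobian product, which is what makes the objective scale to deep networks and high-dimensional data.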
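Likewise, the "Gaussian Perturbation" and "Annealed Langevin Dynamics" sections describe the sampling procedure: perturb the data with Gaussian noise at several scales, estimate the score at each scale, and sample by running Langevin dynamics while annealing the noise from large to small. A minimal sketch under the same assumptions, with a hypothetical noise-conditional network `score_net(x, sigma)` and the step-size schedule alpha_i = eps * sigma_i^2 / sigma_L^2 from Song & Ermon (2019):

```python
import torch

@torch.no_grad()
def annealed_langevin_sampling(score_net, shape, sigmas, eps=2e-5, steps_per_level=100):
    # sigmas: decreasing noise levels, e.g. a geometric sequence from 1.0 to 0.01.
    x = torch.rand(shape)                          # start from uniform noise
    for sigma in sigmas:
        alpha = eps * (sigma / sigmas[-1]) ** 2    # smaller steps at smaller noise
        for _ in range(steps_per_level):
            z = torch.randn_like(x)
            # Langevin update: x <- x + (alpha/2) * score + sqrt(alpha) * noise
            x = x + 0.5 * alpha * score_net(x, sigma) + alpha ** 0.5 * z
    return x
```

Warm-starting each noise level from the previous one lets the chain first locate the data modes under heavy noise, where scores are well estimated everywhere, and then refine the samples as the noise anneals away.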