Gibbs sampling (Geman & Geman, 1984) • De facto inference method for graphical models
Inference on Graphical Models
Poisson-Minibatching for Gibbs Sampling
Theoretically-Guaranteed Inference
Question 1
Why Deep Learning Needs Reliable Inference
Stochastic gradient MCMC
Improvements for SG-MCMC
Question 2: How do you efficiently explore the city? By car or on foot?
Problem Analysis
Our Solution: Cyclical Stepsize Schedule (sketched after this outline)
CSG-MCMC Details: Introduce a system temperature to control the sampler's behavior
Convergence Guarantees
Mixture of 25 Gaussians
Bayesian Neural Networks
ImageNet
Efficient Inference for Reliable Deep Learning
Push the Frontier
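The cyclical stepsize item above is the heart of the talk. As a rough illustration, here is a minimal sketch of a cosine cyclical schedule of the kind described in the cyclical SG-MCMC paper (Zhang et al., ICLR 2020); the function and parameter names are my own assumptions, not the speaker's implementation.

```python
import math

def cyclical_stepsize(k, total_iters, num_cycles, alpha0):
    """Cosine cyclical stepsize: alpha_k = (alpha0 / 2) * (cos(pi * t) + 1),
    where t in [0, 1) is the position within the current cycle.
    A schedule of this form appears in Zhang et al., ICLR 2020;
    names and signature here are illustrative assumptions.
    """
    cycle_len = math.ceil(total_iters / num_cycles)
    t = (k % cycle_len) / cycle_len      # position within the current cycle
    return 0.5 * alpha0 * (math.cos(math.pi * t) + 1.0)

# Usage: the stepsize restarts at alpha0 at the top of each of the 3 cycles
for k in range(0, 150, 25):
    print(k, round(cyclical_stepsize(k, total_iters=150, num_cycles=3, alpha0=0.1), 4))
```

The design intuition matches the talk's "car or on foot" question: the large stepsize at the start of each cycle drives exploration toward new modes, while the cosine decay toward zero lets the sampler settle and collect samples near the current mode before the next warm restart.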
Description:
Explore scalable and reliable inference techniques for probabilistic modeling in this 42-minute lecture by Ruqi Zhang of UT Austin. Delve into the Metropolis-Hastings algorithm and learn how minibatching can make it scalable. Examine Poisson-Minibatching's guaranteed exactness and scalability, verified empirically on Gaussian mixtures and logistic regression. Investigate Gibbs sampling for graphical models and its Poisson-Minibatching adaptation. Discover why deep learning requires reliable inference and explore improvements to stochastic gradient MCMC. Analyze the cyclical stepsize schedule as a solution for efficient exploration, along with its convergence guarantees. See these methods applied to mixture models, Bayesian neural networks, and ImageNet. Gain insight into pushing the frontier of efficient inference for reliable deep learning in this Joint IFML/CCSI Symposium talk from the Simons Institute.
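Since the description assumes familiarity with Gibbs sampling, a minimal generic sketch may help: sample each variable in turn from its conditional distribution given the others. The bivariate-Gaussian target and all names below are illustrative assumptions, not anything taken from the talk.

```python
import numpy as np

def gibbs_bivariate_gaussian(num_samples, rho, seed=0):
    """Gibbs sampler for a standard bivariate Gaussian with correlation rho.

    Each full conditional is itself Gaussian:
        x | y ~ N(rho * y, 1 - rho**2)
        y | x ~ N(rho * x, 1 - rho**2)
    so the sampler alternates exact draws from the two conditionals.
    """
    rng = np.random.default_rng(seed)
    x, y = 0.0, 0.0                       # arbitrary initialization
    cond_std = np.sqrt(1.0 - rho**2)
    samples = np.empty((num_samples, 2))
    for i in range(num_samples):
        x = rng.normal(rho * y, cond_std)  # draw x given y
        y = rng.normal(rho * x, cond_std)  # draw y given x
        samples[i] = (x, y)
    return samples

# Usage: the empirical correlation of the draws should approach rho
draws = gibbs_bivariate_gaussian(num_samples=10_000, rho=0.8)
print(np.corrcoef(draws.T)[0, 1])
```

The Poisson-Minibatching variants discussed in the talk replace the full-data conditional computations with subsampled ones while, per the description, preserving exactness guarantees; the details of that construction are beyond this sketch.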
Scalable and Reliable Inference for Probabilistic Modeling