1. Intro
2. Machine Learning vs Deep Learning
3. Mixing Rate Problems
4. MCMC Games
5. Given Time
6. Big N Problems
7. Subsets
8. Stochastic Approximation
9. Sparse Linear Program
10. Theoretical Guarantees
11. Biometrika
12. MCMC
13. Logistic Regression
14. Gaussian Process Models
15. Popular Algorithms
16. Why Algorithms Fail
17. Are You Changing the Way
Description:
Explore advanced techniques for scaling Bayesian inference in the context of big and complex data in this 47-minute lecture by David Dunson from Duke University. Delve into the computational challenges in machine learning, comparing traditional approaches with deep learning methods. Examine mixing rate problems, MCMC games, and strategies for handling large datasets. Learn about stochastic approximation, sparse linear programming, and theoretical guarantees in Bayesian inference. Investigate popular algorithms for logistic regression and Gaussian process models, understanding their limitations and potential improvements. Gain insights into why certain algorithms fail and consider innovative approaches to overcome these challenges in the field of machine learning and Bayesian statistics.
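
As a rough illustration of the kind of scalable Bayesian computation the lecture surveys (not code from the talk), the sketch below runs a plain random-walk Metropolis sampler for Bayesian logistic regression on a random subset of a large synthetic dataset. The data sizes, prior, and step size are arbitrary assumptions; note that naively subsampling the data targets only an approximation of the full posterior, a caveat in the spirit of the "Why Algorithms Fail" topic.

# Illustrative sketch only: random-walk Metropolis for Bayesian logistic
# regression on a random subset of a "big N" dataset. All settings below
# (N, d, subset size, prior, step size) are made-up assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic large dataset: N observations, d features, known true coefficients.
N, d = 100_000, 3
X = rng.normal(size=(N, d))
beta_true = np.array([1.0, -2.0, 0.5])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))

# Use a random subset so each likelihood evaluation stays cheap.
idx = rng.choice(N, size=2_000, replace=False)
Xs, ys = X[idx], y[idx]

def log_post(beta, X, y, prior_sd=10.0):
    """Log posterior: Bernoulli likelihood plus independent N(0, prior_sd^2) prior."""
    z = X @ beta
    # log p(y | beta) = sum_i [ y_i * z_i - log(1 + exp(z_i)) ], computed stably.
    loglik = np.sum(y * z - np.logaddexp(0.0, z))
    logprior = -0.5 * np.sum(beta**2) / prior_sd**2
    return loglik + logprior

# Random-walk Metropolis over the regression coefficients.
n_iter, step = 5_000, 0.05
beta = np.zeros(d)
lp = log_post(beta, Xs, ys)
samples = np.empty((n_iter, d))
for t in range(n_iter):
    prop = beta + step * rng.normal(size=d)
    lp_prop = log_post(prop, Xs, ys)
    # Accept with probability min(1, posterior ratio).
    if np.log(rng.uniform()) < lp_prop - lp:
        beta, lp = prop, lp_prop
    samples[t] = beta

print("posterior mean (subset):", samples[2_000:].mean(axis=0))
print("true beta:", beta_true)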

Scaling Up Bayesian Inference for Big and Complex Data

Simons Institute