1. Intro
2. Probabilistic models for graphs
3. Sequence of graphs
4. The Old Way: Nodes
5. The Old Way: Exchangeability
6. The Old Way: Node exchangeability
7. Aldous-Hoover
8. A New Way: Edges
9. Edge exchangeability
10. Exchangeable probability functions
11. Feature allocation is exchangeable if it has a feature paintbox representation
12. Edge-exchangeable graph
13. Cor (CCB). A graph sequence is edge-exchangeable iff it has a graph paintbox
14. How to prove sparsity?
15. What we know so far
16. Nonparametric Bayes
Description:
Explore nonparametric Bayesian methods in this lecture from the Foundations of Machine Learning Boot Camp. Delve into probabilistic models for graphs, examining both traditional node-based approaches and newer edge-based techniques. Learn about exchangeability in various contexts, including node exchangeability, edge exchangeability, and the Aldous-Hoover theorem. Discover the concept of graph paintboxes and their role in edge-exchangeable graphs. Investigate feature allocation and its connection to exchangeability through feature paintbox representations. Gain insights into proving sparsity in graph sequences and understand the current state of knowledge in nonparametric Bayesian methods for graph modeling.
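The edge exchangeability discussed in the lecture can be illustrated with a minimal sketch: a graph sequence is edge-exchangeable when the distribution over edge sequences is invariant to reordering the edges, which holds, for example, if each edge's endpoints are drawn i.i.d. from a fixed weight distribution over vertex labels (a toy stand-in for a graph paintbox). The function name and the `weights` argument below are illustrative, not from the lecture.

```python
import random

def sample_edge_sequence(n_edges, weights, seed=0):
    """Toy edge-exchangeable graph sequence: each edge's two endpoints
    are drawn i.i.d. from a fixed distribution over vertex labels, so
    the edges are exchangeable (any permutation of the edge list has
    the same probability).

    `weights` is a hypothetical dict {vertex_label: probability_mass}.
    """
    rng = random.Random(seed)
    vertices = list(weights)
    probs = [weights[v] for v in vertices]
    edges = []
    for _ in range(n_edges):
        # Draw both endpoints independently from the same fixed weights.
        u = rng.choices(vertices, weights=probs)[0]
        v = rng.choices(vertices, weights=probs)[0]
        edges.append((u, v))
    return edges

# Usage: skewed weights concentrate edges on a few "heavy" vertices,
# the regime where sparsity questions (chapter 14) become interesting.
edges = sample_edge_sequence(5, {"a": 0.6, "b": 0.3, "c": 0.1})
```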

Nonparametric Bayesian Methods - Models, Algorithms, and Applications IV

Simons Institute