1. Introduction
2. Background
3. Tenets
4. Overview
5. Representation
6. Sufficiency
7. Information bottleneck
8. The task
9. Mutual information
10. Cost functional
11. Representation of past data
12. Two information bottlenecks
13. Results
14. Disentangling
15. Bias
16. Kullback-Leibler divergence
17. Flat minima
18. Bias-variance tradeoff
19. Notation
20. Fokker-Planck equation
21. Limit cycles
22. Eigenvalues
23. Local entropy
24. Local entropy solution
25. Standard Gaussian relaxation
26. Where do we take this
27. What does the theory not cover
Description:
Explore the intricacies of representation learning in artificial intelligence through this 55-minute seminar by Stefano Soatto at New York University, titled "The Information Knot - Tying Sensing and Action; Emergence Theory of Representation Learning." Examine key concepts such as representation sufficiency, the information bottleneck, mutual information, and cost functionals. Investigate the representation of past data, disentangling, the bias-variance tradeoff, and local entropy solutions. Gain insights into flat minima, limit cycles, eigenvalues, and the Fokker-Planck equation. Analyze the implications of the standard Gaussian relaxation and consider future directions in this field of study.

The Information Knot - Tying Sensing and Action; Emergence Theory of Representation Learning

New York University (NYU)