1. Intro
2. Statistical Learning Theory
3. Neural Network Applications
4. Information Theory
5. Soft Partitioning
6. Information Plane
7. Stochastic Gradient Descent
8. Average Per Layer
9. Classical Theory
10. Dimensionality
11. Confidence
12. Factorization
13. Cardinality
14. The Ultimate Bound
Description:
Explore the Information Bottleneck Theory of Deep Neural Networks in this lecture by Naftali Tishby of the Hebrew University of Jerusalem. Delve into statistical learning theory, neural network applications, and information theory. Examine concepts such as soft partitioning, the information plane, and stochastic gradient descent. Analyze the average information per layer, classical theory, dimensionality, confidence, factorization, cardinality, and the ultimate bound. Gain insights into targeted discovery in brain data and expand your understanding of deep neural networks through this comprehensive presentation from the Simons Institute.

The Information Bottleneck Theory of Deep Neural Networks

Simons Institute