Chapters:
1. Introduction
2. Sponsor: NordVPN
3. What is probability? Bayesian vs. Frequentist
4. Probability Distributions
5. Entropy as average surprisal
6. Cross-Entropy and Internal models
7. Kullback–Leibler (KL) divergence
8. Objective functions and Cross-Entropy minimization
9. Conclusion & Outro
Description:
Explore the fundamental concepts of probability theory and its applications in neuroscience and machine learning in this 26-minute video. Delve into the intuitive idea of surprise and its relation to probability through real-world examples. Examine advanced topics such as entropy, cross-entropy, and Kullback-Leibler (KL) divergence. Learn how to measure the average surprise in a probability distribution, understand the loss of information when approximating distributions, and quantify differences between probability distributions. Gain insights into Bayesian and Frequentist approaches to probability, probability distributions, and the role of objective functions in cross-entropy minimization.
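The three quantities the video centers on are tied together by one identity: cross-entropy equals entropy plus KL divergence. A minimal numerical sketch of that relationship is below; the distributions `p` and `q` are illustrative placeholders, not examples taken from the video.

```python
import numpy as np

# Two discrete distributions over the same four outcomes
# (illustrative values, not from the video).
p = np.array([0.5, 0.25, 0.125, 0.125])  # "true" distribution
q = np.array([0.25, 0.25, 0.25, 0.25])   # model's approximation

# Entropy H(p): average surprisal -log2 p(x), averaged under p itself.
entropy = -np.sum(p * np.log2(p))

# Cross-entropy H(p, q): average surprisal of the model q, measured under p.
cross_entropy = -np.sum(p * np.log2(q))

# KL divergence D_KL(p||q): extra surprisal paid for using q instead of p.
kl = np.sum(p * np.log2(p / q))

print(f"H(p)        = {entropy:.3f} bits")        # 1.750
print(f"H(p, q)     = {cross_entropy:.3f} bits")  # 2.000
print(f"D_KL(p||q)  = {kl:.3f} bits")             # 0.250

# The key identity: H(p, q) = H(p) + D_KL(p||q)
assert np.isclose(cross_entropy, entropy + kl)
```

Since H(p) is fixed by the data, minimizing cross-entropy with respect to q is the same as minimizing D_KL(p||q), which is why cross-entropy serves as an objective function for fitting models.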

The Key Equation Behind Probability - Entropy, Cross-Entropy, and KL Divergence

Artem Kirsanov