1. Introduction
2. Probability is subtle
3. Overview of takeaways
4. Probability is like mass
5. Surprises show up more often in ML
6. Surprises give rise to loss functions
7. Surprises are better than densities
8. Gaussians unite probability and linear algebra
9. Summary of the Math4ML ideas
10. Additional resources on Math4ML
Description:
Explore the fundamental concepts of probability essential for machine learning in this 45-minute video lecture. Delve into the challenges of mathematically rigorous probability theory and discover why negative logarithms of probabilities, known as "surprises," are prevalent in machine learning. Learn how probability behaves like mass, how surprises relate to loss functions, and why they are preferable to densities. Examine the connection between Gaussians, probability, and linear algebra. Access accompanying slides and exercise notebooks for hands-on practice. Gain valuable insights into the Math for Machine Learning series, with timestamps provided for easy navigation through key topics.
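The "surprise" idea the description mentions — the negative logarithm of a probability — can be sketched in a few lines of Python. The function name, the example probabilities, and the averaging below are illustrative assumptions, not taken from the lecture itself:

```python
import math

# "Surprise" of an outcome: the negative log of its probability.
# Rare events (low p) carry high surprise; a certain event (p = 1) carries none.
def surprise(p: float) -> float:
    return -math.log(p)

# A fair coin flip: each outcome has probability 0.5, so log(2) nats of surprise.
fair = surprise(0.5)

# The negative log-likelihood loss used to train classifiers is the surprise
# of the probability the model assigned to the correct label, averaged over
# examples -- one way to see how "surprises give rise to loss functions".
predicted_probs_for_true_labels = [0.9, 0.6, 0.99]  # hypothetical model outputs
nll_loss = sum(surprise(p) for p in predicted_probs_for_true_labels) / len(
    predicted_probs_for_true_labels
)
```

A model that assigns higher probability to the correct labels produces lower average surprise, which is exactly what minimizing this loss rewards.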

Probability - Math for Machine Learning

Weights & Biases