Chapters:
1. Teaser
2. Intro
3. Implementing entropies
4. Exercise: surprise
5. Why do models output probabilities?
6. Exercise: entropy
7. Exercise: crossentropy
8. Exercise: divergence
9. Loss functions and surprises
10. Exercise: softmax_crossentropy
11. Putting it all together with Gaussians
12. Exercise: gaussian_surprise
13. Gaussian surprise and squared error
14. Exercise: Gradient descent on a surprise
15. Why are these exercises useful?
16. How programmers can learn more math
Description:
Dive into a 44-minute video tutorial on probability exercises for machine learning, led by Weights & Biases experts Charles Frye and Scott Condron. Work through practical implementations of entropy, surprise, cross-entropy, and divergence concepts. Explore the relationship between loss functions and surprises, and apply these principles to Gaussian distributions. Gain hands-on experience with gradient descent on surprise functions. Understand the importance of probability in machine learning models and discover how programmers can enhance their mathematical skills for ML applications. Access additional resources, including GitHub exercises and related lectures, to further deepen your understanding of probability in the context of machine learning.
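The quantities covered in the exercises fit together in a few lines of code. As a rough sketch of the kind of implementations the tutorial works through (function names and signatures here are illustrative, not the tutorial's actual exercise stubs), surprise is the negative log-probability of an event, entropy is the expected surprise under a distribution, cross-entropy is the expected surprise when the wrong distribution is used for scoring, and divergence is the gap between the two. The Gaussian case shows how negative log-likelihood reduces to a squared-error term:

```python
import math

def surprise(p):
    # Surprise (information content) of an event with probability p, in nats.
    return -math.log(p)

def entropy(probs):
    # Entropy: expected surprise of a distribution under itself.
    return sum(p * surprise(p) for p in probs if p > 0)

def cross_entropy(p, q):
    # Cross-entropy: expected surprise when outcomes drawn from p
    # are scored with the probabilities assigned by q.
    return sum(pi * surprise(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    # KL divergence: the extra surprise paid for using q in place of p.
    return cross_entropy(p, q) - entropy(p)

def gaussian_surprise(x, mu=0.0, sigma=1.0):
    # Negative log-density of a Gaussian: a constant plus a scaled
    # squared error (x - mu)^2 -- the link to squared-error loss.
    return 0.5 * math.log(2 * math.pi * sigma**2) + (x - mu)**2 / (2 * sigma**2)
```

For example, a fair coin has entropy `math.log(2)` nats, and the KL divergence of any distribution from itself is zero; minimizing `gaussian_surprise` in `mu` is the same as minimizing squared error.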

Math for Machine Learning - Exercises: Probability

Weights & Biases