Chapters:
1. Teaser
2. Intro
3. Little-o notation
4. Checking is_little_o in Python
5. Doing math in Python with SymPy
6. What does little-o mean?
7. Exercise: is_little_o_x
8. The gradient is a linear approximation
9. Different meanings of "the" gradient
10. Gradient of a constant function
11. Exercise: Making a linear_approximation
12. Gradients and optimization
13. Exercise: Gradient descent
14. Outro
Description:
Dive into a 43-minute video tutorial on calculus exercises for machine learning, led by Weights & Biases experts Charles Frye and Scott Condron. Explore key concepts like little-o notation, gradients as linear approximations, and gradient descent through hands-on Python exercises using SymPy. Learn to implement and understand crucial mathematical foundations for ML, including checking little-o conditions, creating linear approximations, and applying gradients in optimization. Follow along with the provided GitHub resources and complementary materials to deepen your understanding of calculus in the context of machine learning.
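
To give a flavor of the ideas the exercises cover, here is a minimal SymPy sketch of checking a little-o condition, verifying that a gradient gives a linear approximation up to an o(h) remainder, and running a few gradient descent steps. The helper name is_little_o, the example functions, and the step size are illustrative assumptions, not the course notebooks' exact API.

```python
import sympy as sp

x, h = sp.symbols("x h")

def is_little_o(f, g, var, point=0):
    """f = o(g) as var -> point exactly when lim f/g = 0."""
    return sp.limit(f / g, var, point) == 0

print(is_little_o(x**2, x, x))   # True: x**2 shrinks faster than x near 0
print(is_little_o(x, x, x))      # False: the ratio tends to 1, not 0

# The gradient as a linear approximation: for f(x) = x**2 around x0 = 1,
# the remainder f(x0 + h) - (f(x0) + f'(x0)*h) should be o(h).
f = x**2
x0 = 1
grad = sp.diff(f, x).subs(x, x0)   # derivative of x**2 at x0 = 1, i.e. 2
remainder = sp.expand(f.subs(x, x0 + h) - (f.subs(x, x0) + grad * h))
print(is_little_o(remainder, h, h))  # True: the remainder is h**2 = o(h)

# Gradient descent on f(x) = x**2 using the symbolic derivative.
step, xk = 0.1, 3.0
dfdx = sp.lambdify(x, sp.diff(f, x))
for _ in range(50):
    xk = xk - step * dfdx(xk)        # gradient descent step
print(round(xk, 4))                  # approaches the minimizer x = 0
```

The course's own notebooks implement these checks as graded exercises, so treat this only as a rough preview under the assumptions noted above.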

Math4ML Exercises - Calculus

Weights & Biases