Introduction to Optimization
MITCBMM

Syllabus:
1. Intro
2. What you will learn
3. Before we start
4. What is the likelihood?
5. Example: Balls in urns
6. Maximum likelihood estimator
7. Example: Coin flips
8. Likelihood - Cost
9. Back to the urn problem...
10. Grid search (brute force)
11. Local vs. global minima
12. Convex vs. non-convex functions
13. Implementation
14. Lecture attendance problem
15. Multi-dimensional gradients
16. Multi-dimensional gradient descent
17. Differentiable functions
18. Optimization for machine learning
19. Stochastic gradient descent
20. Regularization
21. Sparse coding
Description:
Dive into the fundamentals of optimization in this 1-hour 12-minute tutorial led by Kevin Smith from MIT. Explore key concepts such as maximum likelihood estimation, cost functions, and gradient descent methods. Learn about convex and non-convex functions, local and global minima, and their implications in optimization problems. Discover practical applications through examples like balls in urns and coin flips. Advance to multi-dimensional gradients and their role in machine learning. Gain insights into stochastic gradient descent, regularization techniques, and sparse coding. Perfect for those looking to enhance their understanding of optimization principles and their applications in various fields, including machine learning.
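
To give a rough flavor of how the lecture's central ideas fit together, here is a minimal Python sketch (not code from the lecture) that estimates the bias of a coin by running gradient descent on the negative log-likelihood of observed flips; the data, step size, and iteration count are illustrative choices, and for this simple problem the closed-form maximum likelihood estimate (the sample mean) is printed for comparison.

import numpy as np

# Observed coin flips (1 = heads, 0 = tails); illustrative data only.
flips = np.array([1, 0, 1, 1, 0, 1, 1, 1, 0, 1])

def neg_log_likelihood(p, data):
    # Cost: negative log-likelihood of Bernoulli data with heads-probability p.
    return -np.sum(data * np.log(p) + (1 - data) * np.log(1 - p))

def gradient(p, data):
    # Derivative of the cost with respect to p.
    return -np.sum(data / p - (1 - data) / (1 - p))

p = 0.5    # initial guess
lr = 0.01  # step size (arbitrary)
for _ in range(500):
    p -= lr * gradient(p, flips)
    p = np.clip(p, 1e-6, 1 - 1e-6)  # keep p strictly inside (0, 1)

print(f"gradient-descent estimate: {p:.3f}")
print(f"closed-form MLE (mean):    {flips.mean():.3f}")

Because this cost is convex in p, gradient descent finds the global minimum; the lecture's discussion of local vs. global minima and non-convex functions explains why that guarantee disappears for harder problems.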
