1. Stochastic Gradient Descent and Machine Learning Lecture 1
2. 5 different facets of optimization
3. Optimization
4. 1. Iterative methods
5. Blackbox oracles
6. 2. Gradient descent
7. 3. Newton's method
8. Cheap gradient principle
9. Fixed points of GD
10. Proposition
11. Proof
12. Convexity
13. Examples of convex functions
14. Theorem
15. Proof
16. g_x is a subgradient of a convex function f at x
17. Example
18. Theorem
19. Claim
20. Wrap Up
Description:
Dive into the fundamentals of optimization and machine learning in this comprehensive lecture on Stochastic Gradient Descent. Explore five different facets of optimization, including iterative methods, gradient descent, and Newton's method. Gain insights into the cheap gradient principle, fixed points of gradient descent, and the concept of convexity. Examine various examples of convex functions and delve into important theorems and proofs. Learn about subgradients of convex functions and their applications. This in-depth session, part of the Bangalore School on Statistical Physics XIII, provides a solid foundation for understanding the core principles of optimization techniques used in machine learning algorithms.
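As a quick illustration only (not code from the lecture; the step size, iteration count, and random test problem are all chosen here for demonstration), the following minimal Python sketch runs the fixed-step gradient-descent update x_{k+1} = x_k - eta * grad f(x_k) on a convex least-squares objective, the simplest setting in which the convergence results mentioned above apply:

    import numpy as np

    # Illustrative sketch: fixed-step gradient descent on the convex objective
    # f(x) = ||A x - b||^2, whose gradient is 2 A^T (A x - b).
    def gradient_descent(A, b, eta=0.005, iters=1000):
        x = np.zeros(A.shape[1])
        for _ in range(iters):
            grad = 2.0 * A.T @ (A @ x - b)  # exact gradient of the objective
            x = x - eta * grad              # gradient-descent update
        return x

    # Hypothetical usage: a small random least-squares problem with a known minimizer.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(20, 5))
    x_true = rng.normal(size=5)
    b = A @ x_true
    x_hat = gradient_descent(A, b)
    print("max error:", np.abs(x_hat - x_true).max())

On a convex quadratic like this, any step size below 2/L, where L is the Lipschitz constant of the gradient (here 2 lambda_max(A^T A)), keeps the iteration stable, and the unique minimizer is the only fixed point of the update, which connects to the "Fixed points of GD" and "Convexity" chapters listed above.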

Stochastic Gradient Descent and Machine Learning - Lecture 1

International Centre for Theoretical Sciences