Outline:
1. Intro
2. Non-convex optimization
3. Two major observations
4. State of the art
5. Summary of results
6. Setting
7. Perturbed gradient descent
8. Key question
9. Two-dimensional quadratic case
10. Three-dimensional quadratic case
11. General case
12. Two key ingredients of the proof
13. Proof idea
14. Putting everything together
15. Open questions
Description:
Explore the intricacies of escaping saddle points efficiently in this 51-minute conference talk by Praneeth Netrapalli at the International Centre for Theoretical Sciences. Delve into non-convex optimization, examining two major observations and the current state of the art. Gain insights into perturbed gradient descent, analyzing its behavior in the two-dimensional and three-dimensional quadratic cases and in the general case. Understand the key ingredients and proof ideas behind efficient saddle-point escape. Conclude with a discussion of open questions in the field, for a comprehensive overview of this crucial topic in algorithms and optimization.
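To make the perturbed gradient descent idea concrete, here is a minimal NumPy sketch in the spirit of the scheme discussed in the talk (and in the paper of the same title by Jin, Ge, Netrapalli, Kakade, and Jordan). All parameter names and threshold values (eta, r, g_thresh, t_thresh) are illustrative assumptions, not the talk's actual settings, and the full algorithm includes a termination test that this sketch omits.

```python
import numpy as np

def perturbed_gradient_descent(grad, x0, eta=0.01, r=0.1,
                               g_thresh=1e-3, t_thresh=50,
                               n_iters=10_000, rng=None):
    """Plain gradient descent, plus a small random perturbation whenever
    the gradient is small and no perturbation was added recently."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    last_perturb = -t_thresh  # allow a perturbation on the first iteration
    for t in range(n_iters):
        if (np.linalg.norm(grad(x)) <= g_thresh
                and t - last_perturb >= t_thresh):
            # Near a first-order stationary point: add noise drawn
            # uniformly from a ball of radius r to escape a saddle.
            xi = rng.normal(size=x.shape)
            xi *= r * rng.uniform() ** (1 / x.size) / np.linalg.norm(xi)
            x = x + xi
            last_perturb = t
        x = x - eta * grad(x)
    return x

# Toy non-convex example: f(x, y) = x**2/2 - y**2/2 + y**4/4.
# The origin is a saddle point; the minima are at (0, +1) and (0, -1).
grad_f = lambda v: np.array([v[0], -v[1] + v[1] ** 3])

# Started exactly at the saddle, plain gradient descent never moves;
# the perturbation lets the iterate slide down the negative-curvature
# direction toward one of the two minima.
print(perturbed_gradient_descent(grad_f, x0=[0.0, 0.0]))
```

The noise is drawn uniformly from a ball (a uniform direction scaled by a radius with the right distribution) so that, with high probability, it has a component along the Hessian's negative-curvature direction, which gradient descent then amplifies.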

How to Escape Saddle Points Efficiently? by Praneeth Netrapalli

International Centre for Theoretical Sciences