1. Intro
2. Why is non-convex optimization "easy"?
3. Locally optimizable functions
4. What happens when assumptions fail?
5. Robust non-convex optimization with perturbed objective
6. Motivation: Empirical Risk vs. Population Risk
7. Idea: Smoothing
8. Properties of Smoothing
9. Ideas of the Lower Bound
10. Matrix Completion
11. Semi-Random Adversary
12. Counter Examples
13. Preprocessing
14. Summary
15. Open Problems
Description:
Explore the challenges and possibilities of robust non-convex optimization in this 46-minute lecture by Rong Ge of Duke University. Delve into the reasons non-convex optimization is often considered "easy," examine locally optimizable functions, and investigate what happens when the usual assumptions fail. Learn about robust non-convex optimization with a perturbed objective, motivated by the gap between empirical risk and population risk. Discover the idea of smoothing and its properties, along with the ideas behind the lower bound. Examine matrix completion, semi-random adversaries, and counter-examples. Gain insights into preprocessing techniques, and conclude with a summary of key points and open problems in robust and high-dimensional statistics.

Can Non-Convex Optimization Be Robust?

Simons Institute