Outline:
1. Intro
2. Optimizing over nonnegative polynomials
3. Shape-constrained regression
4. Difference of convex (DC) programming: problems of the form min f₀(x)
5. Monotone regression: problem definition
6. NP-hardness and SOS relaxation
7. Approximation theorem
8. Numerical experiments (1/2): low-noise environment
9. Difference of convex (DC) decomposition
10. Existence of DC decomposition (2/3)
11. Convex-Concave Procedure (CCP)
12. Picking the "best" decomposition for CCP
13. Undominated decompositions (1/2)
14. Comparing different decompositions (1/2)
15. Main messages: optimization over nonnegative polynomials has many applications; powerful SDP/SOS-based relaxations are available
16. Uniqueness of DC decomposition
Description:
Explore a comprehensive lecture on nonnegative polynomials, nonconvex polynomial optimization, and their applications in learning. Delve into shape-constrained regression, difference of convex (DC) programming, and monotone regression, including problem definitions and NP-hardness. Examine SOS relaxation techniques, approximation theorems, and numerical experiments in low-noise environments. Investigate DC decomposition, the Convex-Concave Procedure (CCP), and strategies for selecting optimal decompositions. Gain insights into undominated decompositions and learn to compare different approaches. Understand the wide-ranging applications of optimization over nonnegative polynomials and the power of SDP/SOS-based relaxations in this field.
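For readers new to the DC programming and SOS topics listed above: a DC program minimizes f₀(x) = g₀(x) − h₀(x), a difference of two convex functions, possibly subject to DC constraints, and the Convex-Concave Procedure (CCP) attacks it heuristically by linearizing the concave part −h at the current iterate and minimizing the resulting convex upper bound. Relatedly, the SOS relaxation mentioned in the description replaces the intractable constraint "p(x) ≥ 0 for all x" with the sufficient condition that p be a sum of squares, i.e. p(x) = z(x)ᵀQz(x) for a positive semidefinite matrix Q and a monomial vector z(x), which is checkable by semidefinite programming. The sketch below is not from the lecture; it is a minimal illustration of CCP on a toy one-dimensional DC objective f(x) = x⁴ − 2x², with the hypothetical function name ccp_toy, chosen so that each convex subproblem has a closed-form minimizer.

def ccp_toy(x0, iters=50, tol=1e-10):
    """Convex-Concave Procedure on f(x) = g(x) - h(x), with
    g(x) = x**4 (convex) and h(x) = 2*x**2 (convex), so that
    f(x) = x**4 - 2*x**2 has local minima at x = +1 and x = -1."""
    x = x0
    for _ in range(iters):
        # Linearize the concave part -h at the iterate x_k and minimize the
        # convex surrogate g(x) - h(x_k) - h'(x_k)*(x - x_k), which upper-bounds
        # f because h is convex. Stationarity gives g'(x) = h'(x_k), i.e.
        # 4*x**3 = 4*x_k, so the subproblem minimizer is the real cube root of x_k.
        x_next = abs(x) ** (1.0 / 3.0) * (1 if x >= 0 else -1)
        if abs(x_next - x) < tol:  # iterates have reached a stationary point
            break
        x = x_next
    return x

print(ccp_toy(0.5))   # -> 1.0 (descends to the local minimum at x = 1)
print(ccp_toy(-0.3))  # -> -1.0 by symmetry

Each CCP step here is a descent step: the linearized surrogate majorizes f, so the objective value never increases, which is the basic guarantee behind the procedure discussed in the lecture.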

Nonnegative Polynomials, Nonconvex Polynomial Optimization, and Applications to Learning

Simons Institute