1. Introductory lectures on first-order convex optimization, Lecture 1
2. Gradient-based optimization
3. Complexity of implementing an oracle and complexity of optimization given access to an oracle
4. Gradient Descent (see the sketch after this outline)
5. Theorem
6. Remark
7. Proof
8. Rearranging and telescoping the sum gives
9. Lower bounds: Theorem
10. Smoothness
11. Theorem
12. Proof
13. Nesterov's accelerated gradient algorithm
14. Estimate Sequences
15. Lemma
16. Proof
17. Observation
18. Compute
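
The two algorithms at the heart of this outline can be compared in a few lines of code. Below is a minimal sketch, not taken from the lecture: it runs plain gradient descent and Nesterov's accelerated gradient on a randomly generated smooth convex quadratic, where the objective, the 1/L step size, and the momentum schedule are standard textbook choices assumed here for illustration.

```python
# Minimal sketch (not from the lecture): gradient descent vs. Nesterov's
# accelerated gradient on an L-smooth convex quadratic
# f(x) = 0.5 x^T A x - b^T x with A positive definite.
import numpy as np

rng = np.random.default_rng(0)
n = 50
M = rng.standard_normal((n, n))
A = M.T @ M + np.eye(n)          # positive definite => f is smooth and convex
b = rng.standard_normal(n)
L = np.linalg.eigvalsh(A)[-1]    # smoothness constant = largest eigenvalue

f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

def gradient_descent(x0, steps):
    x = x0.copy()
    for _ in range(steps):
        x = x - grad(x) / L      # step size 1/L, the standard smooth-case choice
    return x

def nesterov(x0, steps):
    x, y = x0.copy(), x0.copy()
    t = 1.0
    for _ in range(steps):
        x_next = y - grad(y) / L                       # gradient step at the extrapolated point
        t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2      # standard momentum schedule
        y = x_next + (t - 1) / t_next * (x_next - x)   # momentum extrapolation
        x, t = x_next, t_next
    return x

x0 = np.zeros(n)
f_star = f(np.linalg.solve(A, b))                      # exact minimizer of the quadratic
print("GD gap:      ", f(gradient_descent(x0, 200)) - f_star)
print("Nesterov gap:", f(nesterov(x0, 200)) - f_star)
```

On a problem like this the accelerated iterates typically reach a given accuracy in far fewer steps, which is exactly the O(1/T) versus O(1/T^2) separation that the outlined theorems and lower bounds quantify.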
Description:
Dive into the fundamentals of first-order convex optimization in this comprehensive lecture by Praneeth Netrapalli. Explore gradient-based optimization techniques, including gradient descent and Nesterov's accelerated gradient algorithm. Examine the complexity of implementing an oracle versus the complexity of optimization given access to one, and work through key theorems, proofs, and lower bounds. Gain insight into smoothness and estimate sequences, and follow the rearrangement and telescoping-sum arguments that drive the convergence proofs. Suited to advanced graduate students, postdocs, and researchers in theoretical physics and computer science interested in applications to machine learning and statistical physics.
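
For orientation, the rates behind the theorems and lower bounds mentioned above are the classical ones for L-smooth convex objectives; the following summary is a standard statement from the optimization literature, not a transcription of the lecture's own theorems.

```latex
% Classical rates for minimizing an L-smooth convex f, with R = \|x_0 - x^*\|
% (standard results; constants and exact statements vary by source).
\text{Gradient descent:} \quad f(x_T) - f(x^*) = O\!\left(\tfrac{L R^2}{T}\right), \\
\text{Nesterov's method:} \quad f(x_T) - f(x^*) = O\!\left(\tfrac{L R^2}{T^2}\right), \\
\text{First-order lower bound:} \quad f(x_T) - f(x^*) = \Omega\!\left(\tfrac{L R^2}{T^2}\right).
```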

Introductory Lectures on First-Order Convex Optimization - Lecture 1

International Centre for Theoretical Sciences