Chapters:
1. Intro
2. One-sentence Summary
3. Differential Privacy
4. Noisy SGD
5. Regularized Exponential Mechanism (RegEM)
6. Isoperimetric Inequality for Strongly Log-Concave Measures
7. Concentration Bounds for Lipschitz Functions
8. Proof Sketch
9. Utility Analysis
10. A Question from the Duck
11. DP-Stochastic Convex Optimization (SCO)
12. Intuition
13. Open Problems
14. RegEM Revisited
15. Bounding Generalization Error (the chain of bounds in chapters 15-18 is sketched after this outline)
16. Bounding Wasserstein Distance
17. Bounding KL Divergence
18. Bounding Population Loss
19. Summary of Contributions
20. A New Sampling Algorithm
21. Algorithms for DP-ERM and DP-SCO
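Chapters 15-18 trace the utility analysis as a chain of bounds. The following is a hedged reconstruction of the standard chain behind those chapter titles, with notation of my own choosing (f, G, mu, nu, pi) rather than anything taken from the slides: Lipschitzness converts a Wasserstein bound into a bound on the expected-loss gap, and Talagrand's transportation inequality converts a KL bound into a Wasserstein bound when the target is strongly log-concave.

    % Hedged sketch; standard inequalities, notation assumed rather than from the talk.
    % Lipschitz step: for a G-Lipschitz loss f and distributions nu, pi,
    \[
      \bigl| \mathbb{E}_{\theta \sim \nu} f(\theta) - \mathbb{E}_{\theta \sim \pi} f(\theta) \bigr|
        \;\le\; G \, W_1(\nu, \pi) \;\le\; G \, W_2(\nu, \pi).
    \]
    % Transport step: if pi is mu-strongly log-concave, Talagrand's T2 inequality gives
    \[
      W_2(\nu, \pi)^2 \;\le\; \frac{2}{\mu} \, \mathrm{KL}(\nu \,\|\, \pi).
    \]
    % Chaining the two: a KL bound yields a Wasserstein bound,
    % which yields a bound on the generalization / population-loss gap.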
Description:
Explore private convex optimization through the exponential mechanism in this Google TechTalk presented by Daogao Liu. Delve into differential privacy for machine learning, covering topics such as noisy stochastic gradient descent and the regularized exponential mechanism. Examine the isoperimetric inequality for strongly log-concave measures and concentration bounds for Lipschitz functions. Learn about DP-Stochastic Convex Optimization (DP-SCO) and the intuition behind it. Discover a new sampling algorithm and its applications to DP-ERM and DP-SCO. Gain insight into bounding the generalization error, Wasserstein distance, KL divergence, and population loss. Understand the contributions and open problems in private convex optimization.
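The description's central object, the regularized exponential mechanism, samples a model from a strongly log-concave density rather than optimizing directly. Below is a minimal Python sketch of that idea, approximated here with unadjusted Langevin dynamics; the name regem_sample and the parameters k, mu, eta, T are illustrative assumptions, and the sampler choice is a generic stand-in rather than the talk's own, more efficient sampler (chapter 20).

    # Hedged sketch, not the speaker's algorithm: draw theta with density
    # proportional to exp(-k * (F(theta; D) + (mu/2) * ||theta||^2)) using
    # unadjusted Langevin dynamics. All names and defaults are assumptions.
    import numpy as np

    def regem_sample(grad_F, d, k=1.0, mu=1.0, eta=1e-3, T=10_000, seed=0):
        """Approximately sample from pi(theta) ~ exp(-k*(F(theta) + mu/2*||theta||^2))."""
        rng = np.random.default_rng(seed)
        theta = np.zeros(d)
        for _ in range(T):
            g = k * (grad_F(theta) + mu * theta)  # gradient of the log-density's potential
            theta = theta - eta * g + np.sqrt(2.0 * eta) * rng.normal(size=d)
        return theta

    # Illustrative use: least-squares loss F(theta) = ||X theta - y||^2 / (2n),
    # whose gradient is X^T (X theta - y) / n.
    rng = np.random.default_rng(1)
    X, y = rng.normal(size=(100, 5)), rng.normal(size=100)
    theta_hat = regem_sample(lambda t: X.T @ (X @ t - y) / len(y), d=5)

The regularizer is what makes the target mu-strongly log-concave, so Langevin-type samplers mix quickly; the isoperimetry and concentration chapters of the talk supply exactly the properties such an analysis leans on.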

Private Convex Optimization via Exponential Mechanism - Differential Privacy for Machine Learning

Google TechTalks