Explore a Google TechTalk presented by Ashok Cutkosky on differentially private online-to-batch conversion in stochastic optimization. Delve into the challenges of privacy-preserving algorithms and learn about a novel approach that bridges the gap between simple but suboptimal methods and complex but theoretically optimal techniques. Discover how this new variation on the classical online-to-batch conversion can transform any online optimization algorithm into a private stochastic optimization algorithm, potentially achieving optimal convergence rates. Throughout the 50-minute talk, examine key concepts such as DP-SGD, bespoke analysis, online linear optimization, and tree aggregation. Gain insights into the applications of this method, including adaptivity and parameter-free comparator adaptation, while considering the implications for privacy in machine learning and optimization practices.
Differentially Private Online-to-Batch Conversion for Stochastic Optimization
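For context, the classical (non-private) online-to-batch conversion that the talk builds on can be sketched as follows: run an online learner on i.i.d. stochastic gradients and return the average iterate, whose suboptimality is bounded by the learner's average regret. This is a minimal illustrative sketch, not the talk's private variant; the online gradient descent learner, the toy objective, and all names here are assumptions made for this example.

```python
# Classical online-to-batch conversion (illustrative sketch):
# feed an online learner -- here, online gradient descent --
# stochastic gradients and output the running average of its iterates.
import random

def online_to_batch(stochastic_grad, dim, T, lr=0.1):
    """Average the iterates of online gradient descent over T rounds."""
    w = [0.0] * dim
    avg = [0.0] * dim
    for t in range(1, T + 1):
        g = stochastic_grad(w)                              # noisy gradient at w_t
        w = [wi - lr * gi for wi, gi in zip(w, g)]          # online GD step
        # Running average: avg_t = avg_{t-1} + (w_t - avg_{t-1}) / t
        avg = [a + (wi - a) / t for a, wi in zip(avg, w)]
    return avg

# Toy problem (assumed for illustration): minimize E[(w - 1)^2]
# given noisy gradients 2(w - 1) + Gaussian noise.
random.seed(0)
grad = lambda w: [2 * (w[0] - 1) + random.gauss(0, 0.5)]
w_bar = online_to_batch(grad, dim=1, T=2000)
print(w_bar)  # close to [1.0]
```

The talk's private variant modifies this recipe so that the gradients fed to the online learner are privatized (e.g., via tree aggregation), preserving the regret-to-convergence argument while adding differential privacy.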