Training with DP gradients for general convex fitness functions
Slower decaying learning rate
Value of collaboration
Experiment with loan data
Convergence of learning algorithm
Prediction vs reality
Conclusions and future work
Description:
Explore the value of collaboration in convex machine learning with differential privacy in this IEEE conference talk. Delve into the application of machine learning to distributed private data owned by multiple entities, using noisy, differentially private gradients to minimize the fitness cost through stochastic gradient descent. Examine the trade-off between privacy and utility by quantifying model quality as a function of privacy budget and dataset size. Discover how to predict collaboration outcomes among privacy-aware data owners before executing computationally expensive algorithms. Learn how the difference in model fitness scales inversely with dataset size and privacy budget. Validate performance predictions on practical financial datasets, including determining loan interest rates with regression and detecting credit card fraud with support vector machines. Gain insights into training with differentially private gradients, the convergence of the learning algorithm, and the future of privacy-aware machine learning.
The Value of Collaboration in Convex Machine Learning with Differential Privacy
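The "inverse relationship" highlighted in the description can be written schematically (notation mine): if $\mathbf{w}_{\mathrm{DP}}$ is the model trained from noisy, differentially private gradients and $\mathbf{w}^{*}$ the non-private optimum of the fitness function $f$, the fitness gap behaves like

$$ \mathbb{E}\big[f(\mathbf{w}_{\mathrm{DP}})\big] - f(\mathbf{w}^{*}) \;\propto\; \frac{1}{n^{2}\,\varepsilon^{2}}, $$

shrinking as the dataset size $n$ and the privacy budget $\varepsilon$ grow; the exact constants and conditions under which this holds are given in the talk.

The training loop itself, with the slower decaying learning rate mentioned in the outline, can be sketched as below. This is a minimal illustration for a least-squares fitness function, not the talk's implementation: the function name dp_sgd and the parameters clip_norm, sigma, and c are illustrative, and sigma is left uncalibrated rather than tied to a concrete (ε, δ) budget.

```python
import numpy as np

def dp_sgd(X, y, epochs=5, clip_norm=1.0, sigma=0.5, c=0.1, seed=0):
    """Sketch of DP-SGD for a convex least-squares fitness function.

    Per-example gradients are clipped to `clip_norm` and perturbed with
    Gaussian noise of scale `sigma * clip_norm`; calibrating `sigma` to a
    concrete (epsilon, delta) budget is outside this sketch.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            # Gradient of the squared-error loss on a single example.
            g = (X[i] @ w - y[i]) * X[i]
            # Clip to bound the sensitivity of the gradient query.
            g *= min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12))
            # Gaussian perturbation makes the gradient query private.
            g = g + rng.normal(0.0, sigma * clip_norm, size=d)
            # Slower-decaying step size: O(1/sqrt(t)) rather than the
            # classical O(1/t), which is more robust to the injected noise.
            w -= (c / np.sqrt(t)) * g
    return w

# Tiny synthetic check: recover a linear model from noisy private gradients.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=500)
print(dp_sgd(X, y, epochs=10))
```

The O(1/sqrt(t)) schedule decays more slowly than the O(1/t) schedule typical of non-private SGD, so later updates are not drowned out by the noise; this is presumably the adjustment the outline's "Slower decaying learning rate" item refers to.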