1. Intro
2. Federated Learning: Bringing Training to the Edge
3. Federated Optimization: Objectives and Notation
4. Federated Optimization: The FedAvg Algorithm
5. Distributed Mean Estimation
6. Leveraging Spatial Similarity
7. Rand-k Sparsification
8. How do we measure spatial correlation?
9. Family of Spatial Estimators for different p
10. Talk Outline
11. Proposed Rand-k-Temporal Estimator
12. Quadratic Objective Case Study
13. Distributed Power Iteration
14. Distributed K-Means
15. Distributed Logistic Regression
16. Effect of Intermittent Client Participation on the Convergence of FedAvg
17. Conclusion
Description:
Watch a 56-minute FLOW seminar presentation in which Carnegie Mellon University's Gauri Joshi explores how distributed learning systems can be optimized by leveraging spatial and temporal correlations. The talk introduces federated learning concepts, starting with training at the edge and progressing through federated optimization objectives and the FedAvg algorithm. It then covers distributed mean estimation, leveraging spatial similarity, and Rand-k sparsification, along with how to measure spatial correlation and a family of spatial estimators. Practical applications are explored through case studies on distributed power iteration, K-means clustering, and logistic regression, and the talk examines the impact of intermittent client participation on the convergence of FedAvg. Originally presented on January 18th, 2023, as part of the Federated Learning One World Seminar series, this technical talk provides a comprehensive exploration of advanced distributed learning optimization strategies.
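For readers unfamiliar with the Rand-k sparsification mentioned in the description, here is a minimal NumPy sketch of the generic idea: each client sends only k randomly chosen coordinates of its vector, scaled by d/k so that the server's average stays an unbiased estimate of the true mean. This is an illustration of the standard technique, not code from the talk; function names like rand_k_sparsify are placeholders.

```python
import numpy as np

def rand_k_sparsify(x, k, rng):
    """Keep k uniformly random coordinates of x, scaled by d/k so the
    sparsified vector is an unbiased estimate of x (Rand-k)."""
    d = x.shape[0]
    idx = rng.choice(d, size=k, replace=False)
    sparse = np.zeros_like(x)
    sparse[idx] = x[idx] * (d / k)
    return sparse

def estimate_mean(client_vectors, k, seed=0):
    """Server-side estimate of the mean of the clients' vectors when each
    client communicates only k coordinates."""
    rng = np.random.default_rng(seed)
    sparsified = [rand_k_sparsify(x, k, rng) for x in client_vectors]
    return np.mean(sparsified, axis=0)

# Toy usage: 10 clients, 100-dimensional vectors, each sends 10 coordinates.
rng = np.random.default_rng(1)
clients = [rng.normal(size=100) for _ in range(10)]
true_mean = np.mean(clients, axis=0)
est = estimate_mean(clients, k=10)
print("estimation error:", np.linalg.norm(est - true_mean))
```

The scaling factor d/k is what makes the estimator unbiased; the talk's spatial and temporal estimators refine this basic scheme by exploiting correlations across clients and across rounds.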

Leveraging Spatial and Temporal Correlations in Distributed Learning - Seminar 91

Federated Learning One World Seminar