Robust training: from federated towards decentralized (9)
Byzantine workers in the graph topology (10)
Collaborative Learning (11)
Special Case: Mean Estimation (12)
Personalized learning/optimization (13)
References
Description:
Watch a 24-minute conference talk from INSAIT 2022 in which Prof. Martin Jaggi of EPFL explores the evolution and challenges of decentralized learning systems. Dive into key concepts including gradient descent, federated learning, and Byzantine-robust training. Learn about the progression from federated to decentralized approaches, examining how momentum and historical data can enhance training robustness. Explore collaborative learning frameworks, with special attention to mean estimation and personalized optimization. The presentation moves from theoretical foundations to practical applications, including a detailed analysis of Byzantine workers in the graph topology and the implications for system resilience.
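To make the Byzantine-robustness theme concrete, here is a minimal Python sketch, not taken from the talk: all function names, the toy objective, and the parameters are illustrative assumptions. Honest workers smooth their stochastic gradients with local momentum, and the server aggregates with a coordinate-wise trimmed mean instead of a plain average, so a small number of arbitrary (Byzantine) updates cannot derail training.

```python
# Illustrative sketch only: worker-side momentum plus a trimmed-mean
# aggregator, in the spirit of Byzantine-robust training. All names,
# the toy quadratic objective, and parameters are assumptions.
import numpy as np

def worker_update(x, momentum, grad_fn, beta=0.9):
    """Each honest worker smooths its stochastic gradient with momentum,
    so a single noisy batch has less influence on the sent update."""
    g = grad_fn(x)
    return beta * momentum + (1.0 - beta) * g

def trimmed_mean(updates, trim=1):
    """Robust aggregation: drop the `trim` largest and smallest values
    per coordinate, then average the rest (a simple robust mean estimator)."""
    u = np.sort(np.stack(updates), axis=0)          # shape: (n_workers, dim)
    return u[trim:len(updates) - trim].mean(axis=0)

def byzantine_grad(x):
    """A malicious worker can send an arbitrary vector; here, a huge one."""
    return 1e3 * np.ones_like(x)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim, n_honest, n_byz, lr = 5, 8, 2, 0.1
    target = rng.normal(size=dim)                   # toy objective: min ||x - target||^2

    x = np.zeros(dim)
    momenta = [np.zeros(dim) for _ in range(n_honest)]
    for step in range(200):
        noisy_grad = lambda z: 2 * (z - target) + rng.normal(scale=0.5, size=dim)
        updates = []
        for i in range(n_honest):
            momenta[i] = worker_update(x, momenta[i], noisy_grad)
            updates.append(momenta[i])
        updates += [byzantine_grad(x) for _ in range(n_byz)]
        x -= lr * trimmed_mean(updates, trim=n_byz)  # robust server step

    print("distance to optimum:", np.linalg.norm(x - target))
```

Setting `trim` equal to the assumed number of Byzantine workers is the usual choice for a trimmed mean, and the worker-side momentum reduces the variance of the honest updates; that is the intuition behind using history to strengthen robustness, though the exact algorithms discussed in the talk may differ from this sketch.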