Cost-Efficient Distributed Learning | Dr. Rawad Bitar | TUM
Description:
Watch a technical lecture exploring distributed gradient descent in machine learning systems, focusing on strategies for handling stragglers in distributed computing environments. Learn about adaptive approaches to optimizing worker participation, including a novel cost-efficient strategy based on multi-armed bandit theory that selectively assigns tasks while learning worker speeds. Discover how these methods balance computation time, model quality, and resource utilization through theoretical analysis and numerical simulations. Gain insights into applications of information theory and coding to distributed systems, particularly in machine learning and DNA-based data storage, presented by Dr. Rawad Bitar of the Technical University of Munich.
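To give a rough feel for the bandit-based worker selection mentioned above, here is a minimal toy sketch in Python. It is illustrative only and not the speaker's algorithm: the UCB selection rule, the number of workers, the per-round budget, and the exponential response-time model are all assumptions made for this example.

```python
import numpy as np

# Toy sketch (assumptions, not the lecture's method): a UCB-style bandit that
# learns worker speeds and, each round, assigns gradient tasks only to the
# workers it currently estimates to be fastest.

rng = np.random.default_rng(0)

n_workers = 10                                   # total workers available
k = 4                                            # workers queried per round (cost budget)
true_rates = rng.uniform(0.5, 3.0, n_workers)    # hidden per-worker speed parameters

reward_sum = np.zeros(n_workers)                 # cumulative observed speed per worker
pull_count = np.zeros(n_workers)                 # how often each worker was queried


def select_workers(t):
    """Pick the k workers with the highest UCB index on estimated speed."""
    with np.errstate(divide="ignore", invalid="ignore"):
        mean = reward_sum / pull_count           # NaN for workers never queried
    bonus = np.sqrt(2.0 * np.log(t + 2) / np.maximum(pull_count, 1))
    ucb = np.where(pull_count > 0, mean + bonus, np.inf)  # try unqueried workers first
    return np.argsort(-ucb)[:k]


for t in range(200):
    chosen = select_workers(t)
    # Simulated responses: completion time ~ Exp(rate); observed speed = 1 / time.
    times = rng.exponential(1.0 / true_rates[chosen])
    speeds = 1.0 / times
    reward_sum[chosen] += speeds
    pull_count[chosen] += 1
    # In a real system, the gradients returned by `chosen` would be aggregated
    # here before the next model update.

est = np.divide(reward_sum, pull_count, out=np.zeros(n_workers), where=pull_count > 0)
print("estimated speeds:", np.round(est, 2))
print("true rates      :", np.round(true_rates, 2))
```

Over the rounds, the bandit concentrates its queries on the faster workers, which captures the basic cost-versus-speed trade-off discussed in the lecture without modeling its specific guarantees.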