1. Intro
2. Model Development
3. Software Engineering
4. Transparent Code
5. Building Blocks
6. Code
7. Demonstration
8. Summary
Description:
Explore distributed deep learning with Maggy, an open-source framework that bridges the gap between single-host Python and cluster-scale PySpark programs. Learn how to create reusable training functions that transition seamlessly from laptop development to cluster environments. Discover best practices for writing TensorFlow programs, factoring out dependencies, and implementing popular programming idioms. Experience iterative model development in a single Jupyter notebook, mixing vanilla Python and PySpark-specific code. Gain insights into the benefits of distributed deep learning, including faster training with multiple GPUs, parallel hyperparameter tuning, and ablation studies. Understand how Maggy enables DRY (Don't Repeat Yourself) principles in training functions, allowing efficient development across different computing environments. Finally, see a practical demonstration of Maggy's capabilities and learn how to leverage the framework to streamline your deep learning workflows.
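
To make the "reusable training function" idea concrete, here is a minimal sketch of the pattern the talk describes: a self-contained TensorFlow training function that runs unchanged on a laptop and can be handed to Maggy for a cluster-scale hyperparameter search. The model, dataset, and hyperparameter names are illustrative, not taken from the talk; the commented-out Maggy calls (Searchspace, experiment.lagom) follow Maggy's documented 0.x API, but exact signatures may differ across versions.

import tensorflow as tf


def train(kernel_size, pool_size, dropout):
    """Self-contained training function: data loading, model definition,
    and evaluation are all factored inside, so the same function can be
    called locally or shipped to Spark executors (the DRY principle the
    talk refers to)."""
    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    x_train = x_train[..., None] / 255.0
    x_test = x_test[..., None] / 255.0

    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, kernel_size, activation="relu",
                               input_shape=(28, 28, 1)),
        tf.keras.layers.MaxPooling2D(pool_size),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dropout(dropout),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=1, batch_size=128, verbose=0)

    # Return the metric to optimize; called directly, it is simply
    # the test accuracy.
    return model.evaluate(x_test, y_test, verbose=0)[1]


# Single-host development: call it like any Python function.
print(train(kernel_size=3, pool_size=2, dropout=0.2))

# Cluster-scale hyperparameter search from a PySpark notebook: hand the
# *same* function to Maggy. Left commented out because it requires a
# Spark cluster; API per Maggy's 0.x documentation, an assumption here.
# from maggy import experiment, Searchspace
# sp = Searchspace(kernel_size=('INTEGER', [2, 8]),
#                  pool_size=('INTEGER', [2, 8]),
#                  dropout=('DOUBLE', [0.01, 0.5]))
# experiment.lagom(train, searchspace=sp, optimizer='randomsearch',
#                  direction='max', num_trials=15, name='mnist-demo')

The key design point is that nothing inside train() refers to Spark, so the function written during laptop development is the same one the cluster runs in parallel trials.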

From Python to PySpark and Back Again - Unifying Single-Host and Distributed Deep Learning with Maggy

Databricks