Chapters:
1. Introduction
2. Creating the Spark Cluster and Airflow on Docker
3. Creating Spark Job with Python
4. Creating Spark Job with Scala
5. Building and Compiling Scala Jobs
6. Creating Spark Job with Java
7. Building and Compiling Java Jobs
8. Cluster computation results
Description:
Learn to set up and use Apache Airflow and Spark clusters on Docker in this comprehensive video tutorial. Build an end-to-end data engineering project that combines Apache Airflow, Docker, Spark clusters, Scala, Python, and Java. Develop basic jobs in multiple programming languages, submit them to the Spark cluster for processing, and observe live results. Follow along as the instructor walks through creating Spark jobs in Python, Scala, and Java, as well as building and compiling the Scala and Java jobs. Gain hands-on experience with cluster computation and workflow automation, essential skills for big data analytics and data engineering projects.

End-to-End Data Engineering with Apache Airflow, Docker, and Spark Clusters - Using Python, Scala, and Java

CodeWithYu