1. Introduction
2. Creating the Spark Cluster and Airflow on Docker
3. Creating Spark Job with Python (a minimal sketch follows this outline)
4. Creating Spark Job with Scala
5. Building and Compiling Scala Jobs
6. Creating Spark Job with Java
7. Building and Compiling Java Jobs
8. Cluster Computation Results
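The course page itself carries no code, but as a taste of section 3, here is a minimal sketch of the kind of basic PySpark job the course submits to the cluster. The file name, app name, and in-memory word list are illustrative assumptions, not material taken from the course.

    # basic_job.py - hypothetical job; name and contents are illustrative only
    from pyspark.sql import SparkSession

    if __name__ == "__main__":
        spark = SparkSession.builder.appName("basic_python_job").getOrCreate()

        # Count words in a small in-memory dataset so the job needs no input files.
        words = spark.sparkContext.parallelize(["spark", "airflow", "spark", "docker"])
        counts = words.map(lambda w: (w, 1)).reduceByKey(lambda a, b: a + b)

        for word, count in counts.collect():
            print(word, count)

        spark.stop()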
Description:
Embark on a comprehensive data engineering journey with this full course combining Apache Airflow, Docker, Spark clusters, Scala, Python, and Java. Build an end-to-end project: develop basic jobs in multiple programming languages, submit them to a Spark cluster for processing, and observe live results. Learn to set up a Spark cluster and Airflow on Docker; create Spark jobs in Python, Scala, and Java; build and compile the Scala and Java jobs; and analyze cluster computation results. Gain hands-on experience in workflow automation, big data analytics, and data processing techniques essential for aspiring data engineers.
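The description mentions submitting jobs to the Spark cluster from Airflow; one common way to wire that up is the SparkSubmitOperator from the apache-spark provider package. The DAG id, connection id, file path, and schedule below are assumptions for illustration, not values from the course.

    # spark_jobs_dag.py - a hypothetical DAG; ids, paths, and schedule are assumed
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

    with DAG(
        dag_id="spark_jobs",
        start_date=datetime(2024, 1, 1),
        schedule=None,  # trigger manually while experimenting (Airflow 2.4+)
        catchup=False,
    ) as dag:
        # "spark_default" is an Airflow connection assumed to point at the
        # Dockerized master, e.g. spark://spark-master:7077.
        SparkSubmitOperator(
            task_id="submit_basic_python_job",
            conn_id="spark_default",
            application="/opt/airflow/jobs/basic_job.py",
        )

Compiled Scala and Java jobs could presumably be submitted the same way by pointing application at the built jar and setting the operator's java_class parameter to the job's main class.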

Apache Airflow with Spark for Data Engineers - Full Course

CodeWithYu