1. Introduction
2. Data Engineering vs ETL
3. How to become successful with ETL
4. It's bad for Spark
5. This is 2020
6. What does Butdo look like
7. Engineering Tools
8. Visual ETL
9. Standardized Components
10. Metadata
11. Continuous Deployment
12. Compilers
13. Demo
14. Prophecy
Description:
Explore a 25-minute conference talk that challenges traditional ETL tools and proposes a new approach to Apache Spark development. It traces the evolution of data engineering practices, from ETL tools to code-based solutions, and examines why current methods fall short. The talk covers tools designed to improve Spark development in five areas: productivity, code standardization, metadata management, lineage tracking, and agile CI/CD processes. It argues for a new generation of development tools that combine the flexibility of code-based approaches with the standardization and productivity features of traditional ETL tools, and closes with a demonstration of Prophecy, a tool built on these principles, showing how it aims to modernize Apache Spark development for today's data engineering needs.

Re-Imagining Apache Spark Development - Tools for Productivity and Standardization

Databricks