1. Intro
2. What is Quby
3. Important facts
4. Waste checker service
5. Machine learning
6. Data curation
7. ML infrastructure
8. Data Monitoring
9. Importance of Monitoring
10. Spark Jobs
11. Silver Layer
12. Monitoring Jobs
13. Monitoring Dashboard
14. Monitoring
15. Bronze Layer Table
16. Daily Data Ingestion
17. Job Configuration
18. Slack Integration
19. Data Quality Check
20. External Monitoring System
21. Summary
22. Outro
Description:
Explore a comprehensive 30-minute presentation on monitoring large-scale machine learning models, IoT streaming data, and automated quality checks using Delta Lake. Dive into Quby's innovative approach to managing Europe's largest energy dataset, consisting of petabytes of IoT data. Learn how Delta Lake ensures data quality through schema enforcement and evolution, and discover the crucial role of Data Engineers in verifying timely data ingestion with expected metrics. Examine the challenges of training and serving over half a million models daily, and understand the importance of balancing quality data with well-performing models. Gain insights into monitoring raw and processed data quality metrics using Databricks dashboards, tracking model performance with MLflow, and implementing Slack alerts for failure notifications. Explore real-world examples of managing large-scale data processing and machine learning pipelines in production environments.
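As a purely illustrative sketch (not code from the talk), a daily ingestion check of the kind described — verifying that data arrived on time and in the expected volume, and producing alert messages that could be forwarded to a Slack channel — might look like this. All names and thresholds here are hypothetical:

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical thresholds; in practice these would come from job configuration.
EXPECTED_MIN_ROWS = 1_000_000   # expected daily row volume for the bronze table
MAX_STALENESS_DAYS = 1          # partitions older than this trigger an alert

@dataclass
class PartitionStats:
    """Summary metrics for one daily partition of an ingested table."""
    partition_date: date
    row_count: int

def ingestion_alerts(stats: PartitionStats, today: date) -> list[str]:
    """Return human-readable alert messages; an empty list means the check passed.

    In a real pipeline, non-empty results would be posted to Slack
    (e.g. via an incoming webhook) to notify the on-call Data Engineer.
    """
    alerts = []
    if (today - stats.partition_date) > timedelta(days=MAX_STALENESS_DAYS):
        alerts.append(
            f"Stale data: latest partition is {stats.partition_date}, "
            f"expected within {MAX_STALENESS_DAYS} day(s) of {today}."
        )
    if stats.row_count < EXPECTED_MIN_ROWS:
        alerts.append(
            f"Low volume: {stats.row_count:,} rows ingested, "
            f"expected at least {EXPECTED_MIN_ROWS:,}."
        )
    return alerts
```

At petabyte scale the row counts themselves would come from Spark aggregations over the Delta tables; the point of the sketch is only the shape of the check: compare observed metrics against expected ones, and alert on any deviation.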

Monitoring ML Models, IoT Data, and Quality Checks on Delta Lake - Quby's Approach

Databricks