1. Intro
2. An Example: Digits Recognition with MNIST
3. What is Hyperparameter Tuning?
4. Why is Hyperparameter Tuning Hard?
5. How does Kubernetes Help?
6. Introducing Kubeflow
7. Katib: Hyperparameter Tuning in Kubeflow
8. Concepts: Experiment
9. Concepts: Suggestion
10. Concepts: Trial
11. Workflow for Hyperparameter Tuning
12. System Architecture
13. Demo: Setting Up an Experiment
14. Demo: Configuring Search Space
15. Demo: Viewing Experiment Results
16. Demo: Viewing Trial Metrics
17. Classical vs Automated Machine Learning
18. Landscape of Automated Machine Learning
19. Workflow for Neural Architecture Search
20. What's Coming?
21. How to Contribute?
Description:
Explore hyperparameter tuning and neural architecture search using Katib, a Kubernetes-native automated machine learning platform within Kubeflow. Learn how to optimize model performance by finding the best hyperparameter settings for training, and discover how networks generated by NAS algorithms can outperform handcrafted neural networks. Dive into Katib's rich set of management APIs, configure and run experiments, and compare performance using the UI dashboard. Follow along with demonstrations on setting up experiments, configuring search spaces, and viewing results and trial metrics. Gain insights into the landscape of automated machine learning, understand the workflow for neural architecture search, and learn about future developments and opportunities to contribute to this technology.
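
The demos on setting up an experiment and configuring a search space revolve around Katib's Experiment custom resource (the v1beta1 API). As a rough sketch of what such a manifest can look like (the container image, script path, metric name, and parameter ranges below are illustrative assumptions, not the exact configuration shown in the course), a random-search experiment tuning a learning rate and layer count might be declared as follows:

```yaml
# Minimal sketch of a Katib Experiment manifest.
# Image, script path, metric name, and ranges are illustrative
# assumptions, not the exact configuration from the course demo.
apiVersion: kubeflow.org/v1beta1
kind: Experiment
metadata:
  name: random-search-example
  namespace: kubeflow
spec:
  objective:
    type: maximize
    goal: 0.99
    objectiveMetricName: accuracy   # metric the training code must report
  algorithm:
    algorithmName: random           # search strategy used by the Suggestion service
  parallelTrialCount: 3             # trials running concurrently
  maxTrialCount: 12
  maxFailedTrialCount: 3
  parameters:                       # the search space
    - name: lr
      parameterType: double
      feasibleSpace:
        min: "0.01"
        max: "0.1"
    - name: num-layers
      parameterType: int
      feasibleSpace:
        min: "2"
        max: "5"
  trialTemplate:                    # template each Trial is stamped from
    primaryContainerName: training-container
    trialParameters:
      - name: learningRate
        description: Learning rate for the optimizer
        reference: lr
      - name: numberLayers
        description: Number of hidden layers
        reference: num-layers
    trialSpec:
      apiVersion: batch/v1
      kind: Job
      spec:
        template:
          spec:
            restartPolicy: Never
            containers:
              - name: training-container
                # hypothetical MNIST trainer image and flags
                image: docker.io/example/mnist-trainer:latest
                command:
                  - "python3"
                  - "/opt/mnist/train.py"
                  - "--lr=${trialParameters.learningRate}"
                  - "--num-layers=${trialParameters.numberLayers}"
```

Once applied with kubectl apply -f, the Katib controller asks the configured algorithm for Suggestions and runs each Trial as a Kubernetes Job; progress can be checked with kubectl get experiments,trials -n kubeflow or compared visually in the Katib UI dashboard.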

Hyperparameter Tuning Using Kubeflow

Linux Foundation