1. Introduction
2. Features
3. Predictor
4. Control Plane
5. Replicas
6. CoopTree
7. Data Plane
8. Inference Graph
9. Standard Inference Protocol
10. Data Plane Plugins
11. Monitor Logger
12. Serving Runtime
13. MultiModel Serving
14. Use Cases
15. Inference Service
16. Conclusion
17. Questions
Description:
Explore the world of machine learning model serving with KServe in this informative conference talk featuring fun drawings. Dive into the fundamentals of KServe, an easy-to-use platform built on Kubernetes for deploying ML models. Learn about its high abstraction interfaces, performant solutions for common infrastructure issues, and features like GPU scaling and ModelMesh serving. Discover how KServe simplifies model deployment for data scientists and engineers, allowing them to focus on building new models. Examine key components such as the Predictor, Control Plane, Data Plane, and Inference Graph. Understand KServe's standard inference protocol, data plane plugins, and monitoring capabilities. Explore use cases, multi-model serving, and the project's roadmap towards its v1.0 release. Gain insights into KServe's evolution since 2019 and its exciting new functionalities that address the needs of ML practitioners.
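
To give a concrete flavor of the custom-predictor pattern the talk touches on, here is a minimal sketch assuming the kserve Python SDK's Model/ModelServer interface; the SumModel class and its trivial "prediction" logic are illustrative placeholders, not code from the talk.

```python
# Minimal sketch of a KServe custom predictor (illustrative only).
from typing import Dict

from kserve import Model, ModelServer


class SumModel(Model):  # hypothetical example model
    def __init__(self, name: str):
        super().__init__(name)
        self.load()

    def load(self):
        # A real predictor would load model weights here (e.g. from storage).
        self.ready = True

    def predict(self, payload: Dict, headers: Dict[str, str] = None) -> Dict:
        # V1-style request body: {"instances": [[1, 2, 3], ...]}
        instances = payload["instances"]
        return {"predictions": [sum(row) for row in instances]}


if __name__ == "__main__":
    # Serves HTTP on port 8080 by default; in KServe this container would be
    # deployed behind an InferenceService managed by the control plane.
    ModelServer().start([SumModel("sum-model")])
```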

Exploring ML Model Serving with KServe - Features and Use Cases

CNCF [Cloud Native Computing Foundation]