1. Intro
2. Problem and Challenges
3. Background
4. KubeEdge Architecture
5. Akraino KubeEdge Edge Service Blueprint
6. KubeEdge ML Offloading Functional Block Diagram
7. Use Case: Device App ML Model Inference Offloading Workflow
8. Edge AI Challenges
9. KubeEdge-AI
10. Service Architecture
11. Edge-cloud Collaborative JOINT INFERENCE: Improve the inference performance when edge resources are limited
12. Edge-cloud Collaborative INCREMENTAL LEARNING: The more the models are used, the smarter they become
13. Edge-cloud Collaborative FEDERATED LEARNING: Raw data is not transmitted out of the edge, and the model is generated by …
14. Developer Perspective: JOINT INFERENCE
15. Developer Perspective: FEDERATED LEARNING
16. Resource Information
Description:
Explore the architecture and implementation of an edge AI stack and AI-as-a-Service in a cloud-native environment. Delve into the KubeEdge architecture, the Akraino KubeEdge Edge Service Blueprint, and the ML offloading functional block diagram. Examine a use case for the device-app ML model inference offloading workflow and address edge AI challenges. Learn about the KubeEdge-AI service architecture and edge-cloud collaborative techniques such as joint inference, incremental learning, and federated learning. Gain insights into developer perspectives on joint inference and federated learning, as well as resource information for building robust edge AI solutions.
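The joint inference idea summarized above can be sketched in a few lines: run the lightweight edge model first, and offload only low-confidence ("hard") samples to a larger cloud model. This is a minimal illustrative sketch, not KubeEdge code; the function names, the `Prediction` type, and the 0.85 threshold are all assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Callable, Tuple

@dataclass
class Prediction:
    label: str
    confidence: float

def joint_inference(
    sample: bytes,
    edge_model: Callable[[bytes], Prediction],
    cloud_model: Callable[[bytes], Prediction],
    threshold: float = 0.85,  # illustrative confidence cutoff
) -> Tuple[Prediction, str]:
    """Return a prediction and where it was produced ('edge' or 'cloud')."""
    pred = edge_model(sample)
    if pred.confidence >= threshold:
        # Confident edge result: answer locally, saving bandwidth and latency.
        return pred, "edge"
    # Hard example: offload to the larger cloud model for better accuracy.
    return cloud_model(sample), "cloud"

# Toy usage with stub models standing in for real edge/cloud inference:
edge = lambda s: Prediction("cat", 0.60)
cloud = lambda s: Prediction("cat", 0.97)
pred, where = joint_inference(b"image-bytes", edge, cloud)
```

The same split is what makes the technique attractive when edge resources are limited: most traffic is served locally, and only the hard examples pay the round-trip cost to the cloud.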

Building Edge AI Stack and AI-as-a-Service in Cloud Native Way

Linux Foundation