Chapters:
1. Introduction
2. Alexander Samuelsson Introduction
3. What is AI at Imagimob
4. What can you build with Imagimob
5. Working proof of concept
6. CES 2020 demo
7. CES 2021 demo
8. Challenges
9. Data collection
10. Preprocessing and AI models
11. How are we different
12. The future
13. Audience questions
14. Sponsors
Description:
Learn how to develop advanced hand gesture controls using radar sensors and tinyML in this 33-minute tinyML Talks webcast. Explore a case study from Imagimob, pioneers in Edge AI, as they demonstrate the creation of gesture-controlled headphones showcased at CES 2020. Discover the process of developing Edge AI applications using Imagimob AI (SaaS), including data collection, preprocessing, and AI model creation. Gain insights into the challenges faced, future possibilities, and how this technology differs from other approaches. The presentation covers working proofs of concept and the CES demos, addresses audience questions, and provides a comprehensive look at the intersection of radar technology and tinyML for innovative gesture control applications.
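The webcast describes this workflow only at a high level, and Imagimob AI itself is a proprietary SaaS tool, so the snippet below is purely an illustrative sketch of the same kind of pipeline: training a small radar gesture classifier in Keras and exporting it with TensorFlow Lite for an embedded target. The window length, feature count, number of gesture classes, layer sizes, and file names are assumptions, not details taken from the talk.

```python
# Generic sketch only (not Imagimob's actual pipeline): train a tiny radar
# gesture classifier and convert it for a microcontroller with TensorFlow Lite.
import numpy as np
import tensorflow as tf

WINDOW = 50    # assumed: radar frames per gesture window
FEATURES = 32  # assumed: preprocessed features per frame (e.g. range-Doppler bins)
GESTURES = 5   # assumed: number of gesture classes

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, FEATURES)),
    # 1D convolutions pick up short-term motion patterns in the radar stream
    tf.keras.layers.Conv1D(16, 5, activation="relu"),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(32, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(GESTURES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Placeholder arrays standing in for collected, labelled radar recordings.
x_train = np.random.rand(200, WINDOW, FEATURES).astype("float32")
y_train = np.random.randint(0, GESTURES, size=(200,))
model.fit(x_train, y_train, epochs=2, batch_size=16)

# Convert to TensorFlow Lite so the model can run on an embedded target.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
with open("gesture_model.tflite", "wb") as f:
    f.write(converter.convert())
```

In a real project the random arrays would be replaced by windowed, labelled radar recordings, and the exported .tflite file would be deployed with a runtime such as TensorFlow Lite Micro.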

How to Build Advanced Hand-Gestures Using Radar and TinyML

tinyML