Contents:
1. Intro
2. Collaborators
3. Self-Driving Must Be Robust
4. Situational Driving
5. Data Aggregation
6. Adversarial Attacks on Image Classification
7. Adversarial Attacks on Semantic Segmentation
8. Physical Adversarial Attacks
9. Robust Adversarial Attacks
10. Adversarial Patch Attacks
11. Low-Level Perception
12. Motion Estimation
13. Variational Optical Flow
14. Encoder-Decoder Networks
15. Spatial Pyramid Networks
16. Motivation
17. Attacking Optical Flow
18. White-Box Attacks
19. Black-Box Attacks
20. Real-World Attack
21. Zero-Flow Test
22. Summary
Description:
Explore a keynote presentation on attacking optical flow in deep neural networks for automated driving safety. Delve into the vulnerability of state-of-the-art optical flow estimation techniques to adversarial attacks, with a focus on patch attacks that can significantly degrade performance. Examine the differences in susceptibility between encoder-decoder and spatial pyramid network architectures. Learn about white-box, black-box, and real-world attacks, and their implications for self-driving technology. Gain insights into the importance of robust artificial intelligence in safety-critical applications and the challenges of situational driving and data aggregation.
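To make the idea of an adversarial patch attack concrete, here is a minimal NumPy sketch. It does not reproduce the talk's method (which attacks real optical-flow networks): the linear "flow model", the patch location, and the step size are all illustrative assumptions. The core idea carries over: the attacker may modify only a small image region, and optimizes those pixels by gradient ascent to push the model's output away from its clean prediction.

```python
import numpy as np

# Toy stand-in for a flow network: a fixed linear map from a flattened
# 8x8 "image" to an 8-dimensional "flow" output. Purely illustrative.
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 64)) * 0.1

def flow(img):
    return W @ img

img = rng.random(64)        # clean input, pixels in [0, 1]
clean = flow(img)           # clean prediction to deviate from

# Patch attack: only a small region (here, pixels 0..15) may change.
patch_idx = np.arange(16)
adv = img.copy()
adv[patch_idx] = rng.random(16)   # random start inside the patch

for _ in range(100):
    # For this linear model, the gradient of ||flow(adv) - clean||^2
    # w.r.t. the input is 2 * W^T (flow(adv) - clean).
    grad = 2 * W.T @ (flow(adv) - clean)
    # FGSM-style sign step, restricted to the patch, clipped to valid pixels.
    adv[patch_idx] = np.clip(adv[patch_idx] + 0.05 * np.sign(grad[patch_idx]),
                             0.0, 1.0)

# Output deviates from the clean prediction although only a quarter
# of the pixels were modified.
deviation = np.linalg.norm(flow(adv) - clean)
```

In the talk's setting, the same loop runs against a deep flow network with the gradient obtained by backpropagation (white-box) or approximated by queries (black-box), and the optimized patch can then be printed and placed in the scene for a physical attack.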

Attacking Optical Flow

Andreas Geiger