Interpretable Representations for Objects and Scenes
Evaluating Units for Semantic Segmentation
ImageNet-Pretrained Networks
Class Activation Mapping: Explaining Predictions of Deep Neural Networks
Evaluation on Weakly-Supervised Localization
Explaining Failure Cases in Video
Interpreting Medical Models
Summary of Contributions
Temporal Relational Networks for Event Recognition
Acknowledgements
Description:
A thesis defense presentation on "Interpretable Representation Learning for Visual Intelligence." It covers deep neural networks for object classification, network visualization techniques, and interpretable representations for objects and scenes; class activation mapping for explaining the predictions of deep neural networks, together with its evaluation on weakly-supervised localization; and temporal relational networks for event recognition. It also discusses the interpretability of medical models and summarizes the contributions made to the field of visual intelligence.
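The class activation mapping mentioned in the description weights the last convolutional feature maps of a network that ends in global average pooling by the classifier weights of a target class, M_c(x, y) = sum_k w_k^c f_k(x, y), which highlights the image regions that drive that class's score. Below is a minimal sketch under the assumption of a torchvision ResNet-18 pretrained on ImageNet; the function name class_activation_map and the choice of ResNet-18 are illustrative, not taken from the presentation.

# Minimal CAM sketch, assuming PyTorch and torchvision are available.
import torch
import torch.nn.functional as F
from torchvision import models

# ResNet-18 ends in global average pooling followed by a single linear
# classifier, which is the architecture CAM requires.
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
model.eval()

def class_activation_map(image, class_idx):
    # image: (1, 3, H, W) tensor, already normalized for ImageNet.
    # Drop the average pool and classifier to expose the last conv features.
    backbone = torch.nn.Sequential(*list(model.children())[:-2])
    with torch.no_grad():
        feats = backbone(image)                      # (1, 512, h, w)
    # CAM_c(x, y) = sum_k w_k^c * f_k(x, y): weight each feature map by the
    # classifier weight for the target class and sum over channels.
    weights = model.fc.weight[class_idx]             # (512,)
    cam = torch.einsum("k,bkhw->bhw", weights, feats)
    # Upsample to the input resolution and rescale to [0, 1] for display.
    cam = F.interpolate(cam.unsqueeze(1), size=image.shape[-2:],
                        mode="bilinear", align_corners=False)[0, 0]
    return (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)

The resulting heatmap can be overlaid on the input image to obtain the weakly-supervised localization evaluated in the presentation's corresponding section.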