1. Paper Details
2. Overview
3. Definition of Sensitivity Maps
4. Previous Work
5. SmoothGrad Proposal
6. Experiments - Models
7. Experiments - Visualization (Value of Gradients)
8. Experiments - Visualization (Capping Values)
9. Experiments - Visualization (Multiplying with Input)
10. Experiments - Parameters
11. Experiments - Comparison to Baseline Methods
12. Experiments - Combining SmoothGrad
13. Experiments - Adding Noise During Training
14. Conclusion
15. For Paper
16. Against Paper
Description:
Explore a comprehensive lecture on SmoothGrad, a technique for improving the quality of sensitivity maps in deep neural networks. Delve into the paper's details, starting with an overview and definition of sensitivity maps. Examine previous work in the field before focusing on the SmoothGrad proposal. Analyze various experiments, including models used, visualization techniques, parameter adjustments, and comparisons to baseline methods. Investigate the combination of SmoothGrad with other techniques and the effects of adding noise during training. Conclude with a critical evaluation of the paper's strengths and weaknesses, providing a well-rounded understanding of this innovative approach to reducing noise in neural network visualizations.
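
The core idea behind SmoothGrad is straightforward: average vanilla gradient sensitivity maps over several noisy copies of the input. The sketch below illustrates that idea in PyTorch; it is a minimal illustration under assumed names (`model`, `image`, `target_class`, `n_samples`, `noise_sigma`), not the authors' reference implementation.

```python
import torch

def smooth_grad(model, image, target_class, n_samples=50, noise_sigma=0.15):
    """Average vanilla gradient maps over noisy copies of `image`.

    Assumes `model` is a PyTorch classifier and `image` has shape (1, C, H, W).
    """
    model.eval()
    accumulated = torch.zeros_like(image)
    # Scale the noise relative to the input's value range (a common convention).
    sigma = noise_sigma * (image.max() - image.min())
    for _ in range(n_samples):
        noisy = (image + sigma * torch.randn_like(image)).requires_grad_(True)
        score = model(noisy)[0, target_class]
        grad = torch.autograd.grad(score, noisy)[0]
        accumulated += grad
    # The smoothed sensitivity map is the mean gradient; take abs for visualization.
    return (accumulated / n_samples).abs()
```

The two knobs examined in the lecture's parameter experiments correspond to `n_samples` and `noise_sigma`; the exact values used in the paper's figures are not reproduced here.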

CAP6412 - SmoothGrad: Removing Noise by Adding Noise - Lecture

University of Central Florida