Background: Learning to Classify Images with Convolutional Neural Networks
Background: XAI
XAI tools have proved useful for identifying when a decision is based on irrelevant information.
XAI vs Expert Explanations
What's the role of an explanation?
Pneumonia Detection
Deep Learners find one sufficient means of distinguishing classes
Center of Mass for explanations
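The "center of mass" of an explanation presumably refers to the intensity-weighted centroid of a saliency heatmap, which gives a single point summarizing where an explanation is focused. A minimal numpy sketch (the function name and the uniform-map fallback are assumptions, not the talk's exact method):

```python
import numpy as np

def heatmap_center_of_mass(heatmap):
    """Intensity-weighted mean (row, col) coordinate of a saliency heatmap."""
    h = np.asarray(heatmap, dtype=float)
    h = h - h.min()          # shift so all weights are non-negative
    total = h.sum()
    if total == 0:
        # Uniform or empty map: fall back to the geometric center.
        return ((h.shape[0] - 1) / 2, (h.shape[1] - 1) / 2)
    rows, cols = np.indices(h.shape)
    return (float((rows * h).sum() / total), float((cols * h).sum() / total))

# A map with all its mass in one pixel has its center at that pixel.
m = np.zeros((5, 5))
m[1, 3] = 1.0
print(heatmap_center_of_mass(m))  # (1.0, 3.0)
```

Comparing this centroid across models, or against the region an expert points to, is one cheap way to quantify where explanations agree or diverge.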
Variability in explanations can be reduced by ensembles of classifiers and averaging heatmaps
Averaging Multiple Explanations
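The averaging idea above can be sketched in a few lines: train several classifiers, compute a heatmap per model for the same image, normalize each map, and take the pixelwise mean. The `models`/`explain` interface below is a placeholder assumption, not the talk's actual API:

```python
import numpy as np

def averaged_explanation(models, explain, image):
    """Average the saliency heatmaps produced for one image by an ensemble.

    `models` is any sequence of classifiers and `explain(model, image)`
    is assumed to return a 2-D heatmap for that model.
    """
    maps = [np.asarray(explain(m, image), dtype=float) for m in models]
    # Normalize each map to [0, 1] so no single model dominates the mean.
    maps = [(h - h.min()) / (np.ptp(h) or 1.0) for h in maps]
    return np.mean(maps, axis=0)
```

Pixels highlighted by only one model are attenuated in the mean, while regions that many models agree on stay bright, which is the sense in which averaging reduces explanation variability.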
Data Sets
Experimental setup
Evaluation on Experienced Birdwatchers
Would you trust a robot to identify and remove cancerous moles?
Comments on XAI
Diagnostic Features: Melanoma
Do experienced birders prefer arrows labeled with diagnostic features?
Do Novices Learn faster when given diagnostic features?
Learning to label with diagnostic/explanatory features
Multi-task learning for classification and diagnostic features
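Multi-task learning here means one network is trained to predict both the class and the presence of each diagnostic feature, with a combined objective. A minimal sketch of such a joint loss, assuming a softmax class head and a sigmoid per-feature head with an illustrative weighting `alpha` (the talk's exact formulation is not given):

```python
import numpy as np

def multitask_loss(class_logits, class_label, feat_logits, feat_labels, alpha=0.5):
    """Joint objective: cross-entropy for the class label plus binary
    cross-entropy over present/absent diagnostic features."""
    # Softmax cross-entropy for the single class label (stable log-sum-exp).
    z = class_logits - class_logits.max()
    log_probs = z - np.log(np.exp(z).sum())
    cls_loss = -log_probs[class_label]

    # Sigmoid binary cross-entropy, one term per diagnostic feature.
    p = 1.0 / (1.0 + np.exp(-feat_logits))
    eps = 1e-12
    feat_loss = -np.mean(feat_labels * np.log(p + eps)
                         + (1 - feat_labels) * np.log(1 - p + eps))

    return cls_loss + alpha * feat_loss
```

The feature-labeling term acts as an auxiliary supervision signal, which is one way to push the classifier toward the same evidence an expert would cite, even when some diagnostic features are rare in the training data.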
Putting it all together
Papers
Description:
Explore expert-informed, user-centric explanations for image classification using deep learning in this 51-minute talk by Michael Pazzani from USC Information Sciences Institute. Delve into a novel approach that labels image regions with diagnostic features, addressing the limitations of heatmap annotations for users unfamiliar with deep learning. Examine the challenge of rare features in learning, the use of multiple models to improve region identification accuracy, and applications in radiology, ophthalmology, dermatology, and bird classification. Learn about the speaker's extensive background in research, academia, and government roles, including his current position as Principal Scientist at USC's Information Sciences Institute and his appointment to the Defense Science Board.
Expert-Informed, User-Centric Explanations for Image Classification with Deep Learning