1. Intro
2. Overview
3. Differential privacy's theoretical guarantees
4. Example: abstracting away implementation details
5. Why does this matter?
6. Notation and Problem Setup
7. Adversary Public Information
8. DP and public information
9. Privacy given plausible data generating processes
10. Units of analysis for ε-DPZ
11. Misspecification in z
12. Designing ε-DPZ mechanisms
13. Utility and Uncertainty Quantification
14. Mechanism choices and operationalization
15. What does it mean to correct for measurement error?
16. Methodological Transparency
17. Design vs. adjustment
18. Inferential Adjustment vs. Post-Processing
19. Ideas for the workshop
Description:
Explore a 32-minute lecture on misspecification and uncertainty quantification in differential privacy, delivered by Jeremy Seeman from The Pennsylvania State University at the Fields Institute. Delve into the theoretical guarantees of differential privacy, examining how implementation details are abstracted and why this matters. Learn about adversary public information, privacy in relation to plausible data generating processes, and units of analysis for ε-DPZ. Investigate misspecification in z, the design of ε-DPZ mechanisms, and utility and uncertainty quantification. Discuss mechanism choices, operationalization, and what it means to correct for measurement error. Consider methodological transparency, design versus adjustment, and inferential adjustment versus post-processing. Gain insights into these complex topics as part of the "Workshop on Differential Privacy and Statistical Data Analysis."

Misspecification and Uncertainty Quantification in Differential Privacy

Fields Institute