Chapters:
1. Classifying cows versus penguins
2. A common issue
3. Example: cows v. penguins
4. Idea 1: nuisance randomization
5. Doesn't work by itself
6. Idea 2: Uncorrelating representations
7. What's the best uncorrelating representation?
8. A simulation
9. Waterbirds versus landbirds
10. Pneumonia classification
11. Related work
12. What did I mean by local optima?
13. Did we lose anything?
14. Natural language inference
15. Generic setup not solvable
16. One assumption is test = train
17. Invariant coupling
18. Subgroup coupling
19. Common factor coupling
20. What does good mean?
21. Simultaneous optimality?
22. Why does causality appear?
23. The added flexibility comes with an estimation cost
24. Some questions
25. Research References
Description:
Explore a comprehensive lecture on out-of-distribution generalization in machine learning, delivered by Rajesh Ranganath at the Computational Genomics Summer Institute. Delve into key concepts such as nuisance-induced spurious correlations, invariant risk minimization, and distributionally robust neural networks. Examine real-world examples including cow vs. penguin classification, waterbirds vs. landbirds, and pneumonia detection. Investigate techniques like nuisance randomization and uncorrelating representations to address common issues in machine learning models. Analyze the challenges of natural language inference and explore various coupling assumptions. Gain insights into the role of causality in machine learning and the trade-offs between flexibility and estimation costs. Access related research papers for further study on out-of-distribution generalization techniques and their applications in computational genomics.
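The nuisance-randomization idea mentioned in the description can be sketched as importance reweighting: give each training example the weight p(y) / p(y | z), so that under the reweighted distribution the label y becomes independent of the nuisance z (the background, in the cow-vs-penguin example), and a classifier can no longer profit from the spurious correlation. The following is a minimal illustrative sketch, not the lecture's implementation; the binary setup, the probabilities, and all variable names are assumptions chosen for clarity:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Nuisance z (background: 0 = grass, 1 = snow) is spuriously
# correlated with the label y (0 = cow, 1 = penguin).
z = rng.binomial(1, 0.5, n)
y = rng.binomial(1, np.where(z == 1, 0.9, 0.1))  # p(y=1|z=1)=0.9, p(y=1|z=0)=0.1

# Nuisance randomization by reweighting: w = p(y) / p(y|z).
# Under the reweighted distribution, p_w(y, z) ∝ p(z) p(y), so y ⟂ z.
p_y = np.array([np.mean(y == 0), np.mean(y == 1)])
p_y_given_z = np.array([[np.mean(y[z == zi] == yi) for yi in (0, 1)]
                        for zi in (0, 1)])
w = p_y[y] / p_y_given_z[z, y]

def weighted_corr(a, b, w):
    """Pearson correlation of a and b under example weights w."""
    w = w / w.sum()
    ma, mb = (w * a).sum(), (w * b).sum()
    cov = (w * (a - ma) * (b - mb)).sum()
    return cov / np.sqrt((w * (a - ma) ** 2).sum() * (w * (b - mb) ** 2).sum())

print(weighted_corr(y, z, np.ones_like(w)))  # raw correlation: large
print(weighted_corr(y, z, w))                # reweighted: near zero
```

Reweighting alone is not the whole story — as the outline's "Doesn't work by itself" chapter suggests, the lecture pairs it with learning a representation that stays uncorrelated with the nuisance.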

Out of Distribution Generalization

Computational Genomics Summer Institute CGSI