Chapters:
1. Introduction
2. Motivation
3. Turbulence
4. Global modeling
5. The challenge
6. Multiskill modeling
7. Global storm resolving models
8. A silly first attempt
9. Aerosol cloud indirect effects
10. Regionalization
11. GPU Computing
12. Creative Complexity
13. Short Simulations
14. Coarse Graining
15. Super Crude Architecture
16. Lessons emerging
17. Feature engineering
18. Separate processes
19. Microphysical rates
20. Example
21. Constraints
22. Tradeoffs
23. Generalization
24. Strategy
25. Preprint
26. Results
27. Physical Credibility
28. Hyperparameter Tuning
29. Missing Information
30. Neural Network Tuning
31. Summary
32. Cognitive dissonance
33. Excitement
34. Thank you
35. Maria
36. Reporting failures
37. Retraining neural networks
38. Sampling
39. Failures
Description:
Explore lessons and future prospects for machine learning parameterization of sub-grid atmospheric physics from the perspective of emulating cloud superparameterization in this 42-minute conference talk. Delve into the challenges of global modeling, multiskill modeling, and GPU computing in climate science. Examine creative approaches to short simulations, coarse graining, and feature engineering. Analyze the tradeoffs, generalization strategies, and physical credibility of neural network models in atmospheric physics. Gain insights into hyperparameter tuning, missing information, and the importance of reporting failures in ML-based climate modeling. Conclude with a discussion of cognitive dissonance, excitement, and the future of machine learning in atmospheric science.

Lessons and Outlook for ML Parameterization of Sub-Grid Atmospheric Physics from the Vantage of Emulating Cloud Superparameterization - Mike Pritchard

Kavli Institute for Theoretical Physics