1. Intro
2. Overview
3. Why is symmetry useful
4. Invariant vs equivariant models
5. Equivariant methods
6. Global symmetry equivalent
7. Questions
8. Applications
9. Molecular force fields
10. Scaling
11. Long-range interactions
12. Predicting charge density
13. Using a neural network
14. First paper
15. Competition
16. Efficiency
17. Advanced properties
18. Neural networks
19. Representation theory
20. Reducible representation
21. Feature spaces
22. Spherical harmonic projections
23. Invariance
24. General questions
25. Emergent behavior
26. Curie's principle
27. Structural phase transitions
28. First case
29. Order parameters
30. Why not just train a model
31. Another example
Description:
Explore a comprehensive lecture on symmetry-preserving neural networks and their use in learning to break symmetry, presented by Tess Smidt of MIT at IPAM's Learning and Emergence in Molecular Systems Workshop. Delve into the data efficiency and generalization capabilities of equivariant neural networks across domains ranging from computer vision to atomic systems. Discover how these networks can learn symmetry-breaking information in order to fit datasets with potentially missing information, while their mathematical guarantees keep the symmetry breaking minimal. Examine network architectures designed to learn symmetry-breaking parameters in two distinct settings: global asymmetries and per-example predictions. Gain insights into practical applications, including predicting structural distortions of crystalline materials, and explore topics such as molecular force fields, charge density prediction, representation theory, and structural phase transitions.
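
The invariant-versus-equivariant distinction and the symmetry obstruction that motivates learned symmetry-breaking parameters can be illustrated with a short NumPy sketch. This is a toy written for exposition, not code from the lecture or from any particular library: the names invariant_energy and equivariant_forces are hypothetical stand-ins for the scalar (energy-like) and vector (force-like) outputs the description refers to.

```python
# Illustrative sketch only (not from the lecture): contrast a rotation-invariant
# scalar with a rotation-equivariant vector field on a 3D point cloud, then show
# that an equivariant map applied to a symmetric input yields an equally
# symmetric output. Function names here are hypothetical.
import numpy as np

def random_rotation(rng):
    """Draw a random proper rotation matrix in 3D via QR decomposition."""
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    q *= np.sign(np.diag(r))              # fix the column signs
    if np.linalg.det(q) < 0:
        q[:, 0] *= -1                     # ensure det(R) = +1
    return q

def invariant_energy(points):
    """Scalar built only from pairwise distances -> rotation invariant."""
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    return np.sum(np.exp(-dists))

def equivariant_forces(points):
    """Per-point vectors: relative positions weighted by distance-dependent
    scalars -> rotation equivariant (and permutation equivariant)."""
    diffs = points[:, None, :] - points[None, :, :]
    weights = np.exp(-np.linalg.norm(diffs, axis=-1))
    np.fill_diagonal(weights, 0.0)        # drop self-interaction
    return (weights[..., None] * diffs).sum(axis=1)

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))               # a random 5-point cloud in 3D
R = random_rotation(rng)

# Invariance: f(Rx) == f(x).  Equivariance: f(Rx) == R f(x).
assert np.isclose(invariant_energy(x @ R.T), invariant_energy(x))
assert np.allclose(equivariant_forces(x @ R.T), equivariant_forces(x) @ R.T)

# Symmetry obstruction: feed a 4-fold-symmetric square to the equivariant map.
# Rotating the resulting "forces" by 90 degrees merely permutes them among the
# four points, i.e. the output inherits the input's full symmetry, so a purely
# equivariant model cannot map this square to, say, a rectangle without an
# additional symmetry-breaking input.
square = np.array([[1., 0., 0.], [0., 1., 0.], [-1., 0., 0.], [0., -1., 0.]])
R90 = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])  # 90 deg about z
forces = equivariant_forces(square)
assert np.allclose(np.roll(forces, -1, axis=0), forces @ R90.T)
print("invariance, equivariance and the symmetry obstruction all verified")
```

The final assertion is the lecture's Curie-principle point in miniature: an equivariant map cannot produce an output less symmetric than its input, so predicting a lower-symmetry distortion (a square relaxing into a rectangle, for instance) requires an extra symmetry-breaking input, which is the kind of learned parameter the talk describes.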

Learning How to Break Symmetry With Symmetry-Preserving Neural Networks - IPAM at UCLA

Institute for Pure & Applied Mathematics (IPAM)