1. Introduction
2. Motivation
3. Perceptron
4. Neural nets
5. Convolutional nets
6. Convolutional neural nets
7. Geometries
8. Generalizations
9. Training
10. Machine learning for mathematicians
11. Bruhat graph
12. Analytic polynomials
13. Examples
14. Visualizing combinatorial invariance
15. Predicting KL polynomials
16. Saliency
Description:
Explore a lecture on the interaction between pure mathematics and machine learning, centered on the combinatorial invariance conjecture in representation theory. Geordie Williamson describes a collaboration with DeepMind in which modern machine learning techniques were applied to a problem in pure mathematics: neural networks, convolutional nets, and related models were trained to shed light on Kazhdan-Lusztig polynomials and their relationship to directed graphs. Learn about the challenges of extracting genuinely new mathematical insight from such models, and about the resulting formula that offers a fresh perspective on the combinatorial invariance conjecture. Along the way the talk introduces perceptrons, neural nets, geometries, generalization, training, Bruhat graphs, and analytic polynomials, and shows how visualizing combinatorial invariance, predicting Kazhdan-Lusztig polynomials, and saliency analysis were used in the project.
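For readers unfamiliar with the term, "saliency" here refers to input-gradient attribution: asking which input features a trained model's prediction is most sensitive to. The sketch below is a minimal, hypothetical illustration of that idea only; the toy MLP, the feature count, and the random data are assumptions for demonstration and are not the model, encoding, or pipeline used in the lecture.

```python
# Minimal sketch of input-gradient saliency (illustrative only).
# Assumptions: a toy two-layer MLP scores a feature vector (think of it as
# some encoding of graph edges) and we inspect which entries the scalar
# prediction depends on most. None of this reproduces the actual project.
import jax
import jax.numpy as jnp

def mlp(params, x):
    # Two-layer perceptron: hidden ReLU layer, then a scalar output.
    w1, b1, w2, b2 = params
    h = jax.nn.relu(x @ w1 + b1)
    return (h @ w2 + b2).squeeze()

key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
n_features = 16  # hypothetical number of input features
params = (
    jax.random.normal(k1, (n_features, 32)) * 0.1,
    jnp.zeros(32),
    jax.random.normal(k2, (32, 1)) * 0.1,
    jnp.zeros(1),
)

x = jax.random.normal(k3, (n_features,))  # one hypothetical input example

# Saliency = gradient of the scalar prediction with respect to the input;
# large-magnitude entries mark the features the model relies on most.
saliency = jax.grad(lambda inp: mlp(params, inp))(x)
print(jnp.argsort(-jnp.abs(saliency))[:5])  # indices of the five most salient features
```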

Combinatorial Invariance: A Case Study of Pure Math / Machine Learning Interaction - Geordie Williamson

Institute for Advanced Study