1. Introduction
2. Quantization
3. Models
4. What's Appealing
5. Benefits
6. Notation
7. Tensor Train
8. Quantum Physics
9. General Power Tools
10. Machine Learning
11. Infinite Matrix Product States
12. Locally Purified States
13. Projected Entangled Pair States
14. Fixed Mirror Layers
15. Why Should Tensor Networks Work
16. Mutual Information of Image Data
17. Algorithms
18. Local Update
19. Density Matrix
20. Applications
21. Downsides
Description:
Explore tensor networks for machine learning applications in this 31-minute conference talk by Miles Stoudenmire from the Flatiron Institute. Delve into the power and flexibility of tensor networks as factorizations of high-order tensors, offering exponential gains in memory and computing time. Discover how these networks define a class of model functions with benefits similar to kernel methods and neural networks. Examine optimization algorithms, theoretical underpinnings, and opportunities for matching model architectures to data classes. Learn about exciting recent applications and future research prospects in the field. Cover topics including quantization models, tensor train notation, quantum physics connections, infinite matrix product states, projected entangled pair states, mutual information in image data, local update algorithms, and potential downsides of tensor network approaches.
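The description's central claim, that a tensor network factors a high-order tensor into small pieces with exponential savings in memory, can be illustrated with a tensor-train (matrix product state) decomposition built from sequential SVDs. Below is a minimal NumPy sketch; the function names `tensor_train` and `contract` are illustrative choices, not code from the talk.

```python
import numpy as np

def tensor_train(T, max_rank):
    """Factor an order-d tensor T into a list of order-3 TT cores
    via repeated SVDs, truncating each bond to at most max_rank."""
    dims = T.shape
    cores = []
    rank = 1
    M = T.reshape(rank * dims[0], -1)
    for k in range(len(dims) - 1):
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        r = min(max_rank, len(s))
        # Core k has shape (left bond, physical index, right bond).
        cores.append(U[:, :r].reshape(rank, dims[k], r))
        rank = r
        # Absorb singular values into the remainder and move one site right.
        M = (s[:r, None] * Vt[:r]).reshape(rank * dims[k + 1], -1)
    cores.append(M.reshape(rank, dims[-1], 1))
    return cores

def contract(cores):
    """Rebuild the full tensor by contracting the TT cores in order."""
    out = cores[0]
    for c in cores[1:]:
        out = np.tensordot(out, c, axes=([-1], [0]))
    # Drop the trivial boundary bonds of size 1.
    return np.squeeze(out, axis=(0, -1))
```

With no truncation the contraction reproduces the original tensor exactly; with a small `max_rank` the cores store far fewer entries than the full tensor, which is the memory gain the talk refers to.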

Tensor Networks for Machine Learning and Applications

Institute for Pure & Applied Mathematics (IPAM)