1. Introduction
2. Tensor network
3. Example
4. Contraction tree
5. Hyperindices
6. Partition
7. Partition function
8. Hypergraph partitioning
9. Tensor network simplification
10. Rank simplification
11. Detailed simplifications
12. Low-rank decompositions
13. Diagonal hyperindices
14. Gauge freedom
15. Hybrid reduction
16. QAOA
17. Weighted model counting
18. Approximate contraction
19. Conclusions
Description:
Explore tensor network contraction optimization techniques in this 33-minute conference talk from the Tensor Methods and Emerging Applications to the Physical and Data Sciences 2021 workshop. Delve into hyper-optimized methods based on hypergraph partitioning for building efficient contraction trees. Discover a set of powerful tensor network simplifications designed to facilitate easier contraction. Examine applications in quantum circuit simulation and weighted model counting. Gain insights into extending these concepts to approximate contraction. Learn from Johnnie Gray of the California Institute of Technology as he presents advanced strategies for tackling complex tensor network geometries and improving computational efficiency.
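To give a flavor of what "building an efficient contraction tree" means, here is a minimal, illustrative sketch using NumPy's built-in contraction-path optimizer on a toy matrix-chain network. This is not the speaker's code or method, just a small assumed example: the order in which tensors are pairwise contracted (the contraction tree) determines the cost, and a path optimizer searches for a cheap order.

```python
import numpy as np

# Toy tensor network: a chain of four matrices sharing bond indices.
# Contracting pairwise (a good contraction tree) costs O(n^3) per step,
# whereas a naive simultaneous loop over 'ab,bc,cd,de->ae' costs O(n^5).
n = 8
A, B, C, D = (np.random.rand(n, n) for _ in range(4))

# np.einsum_path searches for a good pairwise contraction order.
path, report = np.einsum_path('ab,bc,cd,de->ae', A, B, C, D,
                              optimize='greedy')

# Reuse the found path to perform the actual contraction.
out = np.einsum('ab,bc,cd,de->ae', A, B, C, D, optimize=path)
```

Hyper-optimized contractors apply the same idea to large, irregularly structured networks, where the search over contraction trees is carried out with hypergraph partitioning rather than a simple greedy heuristic.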

Hyper-Optimized Tensor Network Contraction - Simplifications, Applications and Approximations

Institute for Pure & Applied Mathematics (IPAM)