1. Introduction
2. Background
3. Traditional methods
4. Full configuration interaction
5. Convergence
6. Projective estimator
7. Random sparsification
8. Bias
9. Sparsification
10. FRI algorithm
11. Population mixing
12. Random matrix multiplication
13. Spectral gap
14. Step 2: Random sparsification
15. Orthogonalization
16. Summary
17. Conclusion
Description:
Explore a conference talk on approximating matrix eigenvalues using subspace iteration with repeated random sparsification. Delve into Robert Webber's presentation from the California Institute of Technology at IPAM's Monte Carlo and Machine Learning Approaches in Quantum Mechanics Workshop. Discover how iterative random sparsification methods can estimate multiple eigenvalues at reduced computational cost, particularly beneficial for high-dimensional problems in quantum chemistry. Follow the progression from traditional numerical methods to innovative approaches leveraging random sampling and averaging. Gain insights into full configuration interaction, convergence, projective estimators, and the intricacies of random sparsification techniques. Understand the impact of bias, population mixing, and random matrix multiplication on eigenvalue approximation. Examine the role of spectral gaps, orthogonalization, and the FRI algorithm in enhancing computational efficiency for quantum chemistry benchmark problems.
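As a rough illustration of the idea behind the talk, the sketch below applies unbiased random sparsification inside plain power iteration for a single dominant eigenvalue; the talk itself addresses the harder multi-eigenvalue case via subspace iteration with orthogonalization. The keep-probabilities min(1, m|v_i|/||v||_1), the fixed test vector used for the projective estimate, and all function names are assumptions chosen for this sketch, not details taken from the presentation.

```python
import numpy as np

def sparsify(v, m, rng):
    # Unbiased random sparsification: keep entry i with probability
    # p_i = min(1, m * |v_i| / ||v||_1) and rescale survivors by 1/p_i,
    # so E[sparsify(v)] = v while only about m entries remain nonzero.
    p = np.minimum(1.0, m * np.abs(v) / np.sum(np.abs(v)))
    keep = rng.random(v.shape) < p
    out = np.zeros_like(v)
    out[keep] = v[keep] / p[keep]
    return out

def sparsified_power_iteration(A, m, n_iter=2000, burn_in=500, seed=0):
    # Power iteration in which every iterate is randomly sparsified;
    # the dominant eigenvalue is recovered by averaging a projective
    # estimate (u @ A v_t) / (u @ v_t) over the later iterations.
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    u = np.ones(n)                    # fixed test vector (an assumption)
    v = rng.random(n)
    v /= np.abs(v).sum()
    estimates = []
    for _ in range(n_iter):
        w = A @ sparsify(v, m, rng)   # sparse surrogate keeps the matvec cheap
        estimates.append((u @ w) / (u @ v))
        v = w / np.abs(w).sum()       # renormalize to control the scale
    return np.mean(estimates[burn_in:])

# Example: a nonnegative symmetric matrix with a well-separated top eigenvalue.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    B = rng.random((200, 200))
    A = (B + B.T) / 2
    print(sparsified_power_iteration(A, m=50))
    print(np.linalg.eigvalsh(A)[-1])
```

Averaging the noisy projective estimate over many iterations is what lets each step cost roughly the sparsity m rather than the full dimension, at the price of the bias and variance issues the talk discusses.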

Approximate Matrix Eigenvalues: Subspace Iteration With Repeated Random Sparsification

Institute for Pure & Applied Mathematics (IPAM)