Chapters:
1. Welcome!
2. Introduction by the speaker
3. Acknowledgments
4. Algebraic solvers are fundamental tools
5. Mathematical libraries whose development we contributed to
6. Two main themes of the talk
7. Kernel methods in ML
8. Kernel Ridge Regression (KRR)
9. Solving large dense linear systems
10. Low-rank compression
11. Classes of low-rank structured matrices
12. Cluster tree of a matrix
13. Fast algebraic algorithm: sketching
14. Problem: the target rank is not known in advance
15. Stochastic norm estimation
16. Example: compression of an HSS matrix
17. Fast geometric algorithm: approximate nearest neighbors
18. Approximate nearest neighbors with iterative merging
19. Comparison of the algebraic and geometric algorithms
20. STRUMPACK (STRUctured Matrix PACKage)
21. Linear algebra and machine learning
22. Bayesian optimization
23. Modeling phase
24. Search phase
25. Parallelization of code execution
26. Examples of ML-improved linear algebra computations
27. Summary
28. Q&A: What do we need more: linear algebra code for new architectures or for new applications?
29. Q&A: How can we give users the ability to use ML to improve performance?
30. Q&A: What developments would you like to see in the Julia ecosystem?
31. Q&A: What high-performance algorithms can make use of specific code generation?
32. Q&A: Do you think Julia can replace C++ as the language for linear algebra?
33. Q&A: Do you search for a rank-revealing LU?
34. Announcements
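To make the "fast algebraic algorithm: sketching" chapter concrete, here is a minimal sketch of the randomized range finder, the standard sketching approach to low-rank compression. This is an illustrative example in Python/NumPy, not code from the talk or from STRUMPACK; the function name and parameters are my own.

```python
import numpy as np

def randomized_low_rank(A, k, oversample=10, seed=None):
    """Sketch-based rank-k approximation of A.

    Multiply A by a random Gaussian test matrix to sample its range,
    orthonormalize the samples to get a basis Q, then project:
    A is approximated by Q @ (Q.T @ A).
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    Omega = rng.standard_normal((n, k + oversample))  # random sketch matrix
    Y = A @ Omega                                     # sample the range of A
    Q, _ = np.linalg.qr(Y)                            # orthonormal range basis
    B = Q.T @ A                                       # small (k+p) x n factor
    return Q, B

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # A numerically rank-5 test matrix.
    A = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 100))
    Q, B = randomized_low_rank(A, k=5, seed=0)
    err = np.linalg.norm(A - Q @ B) / np.linalg.norm(A)
    print(f"relative error: {err:.2e}")
```

When, as in the talk, the target rank is unknown, the sketch size is grown adaptively while a stochastic estimate of the residual norm remains too large.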
Description:
Explore the interplay between linear algebra, machine learning, and high-performance computing in this keynote address from JuliaCon 2021. Delve into the use of hierarchical matrix algebra for constructing low-complexity linear solvers and preconditioners, and learn how these fast solvers can accelerate large-scale PDE-based simulations and AI algorithms. Discover how statistical and machine learning methods can optimize solver selection and configuration. Examine recent developments in fast algebraic and geometric algorithms, including sketching and approximate nearest neighbor techniques. Gain insights into the STRUMPACK library and its applications. Investigate the use of Bayesian optimization in improving linear algebra computations. Engage with a Q&A session covering topics such as linear algebra code development, performance optimization using machine learning, and the potential of Julia in high-performance computing.
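The description's link between kernel methods and large dense linear systems can be sketched in a few lines: Kernel Ridge Regression reduces to solving the dense n-by-n system (K + λI)α = y, which is exactly where hierarchical low-rank solvers pay off. The following is an illustrative Python/NumPy sketch using a direct solve (the talk's ecosystem is Julia, and the function names here are my own), not the speaker's implementation.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise squared distances, then the Gaussian (RBF) kernel.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2 * sigma**2))

def krr_fit(X, y, lam=1e-3, sigma=1.0):
    """Solve the dense n x n system (K + lam*I) alpha = y."""
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(X_train, alpha, X_new, sigma=1.0):
    return gaussian_kernel(X_new, X_train, sigma) @ alpha

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-1.0, 1.0, size=(200, 1))
    y = np.sin(3 * X[:, 0])
    alpha = krr_fit(X, y, lam=1e-6, sigma=0.3)
    pred = krr_predict(X, alpha, X, sigma=0.3)
    print(f"training RMSE: {np.sqrt(np.mean((pred - y) ** 2)):.2e}")
```

The direct solve costs O(n^3); replacing the dense kernel matrix with an HSS or other hierarchical low-rank representation, as discussed in the keynote, brings this down to near-linear complexity for large n.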

Interplay of Linear Algebra, Machine Learning, and HPC - JuliaCon 2021 Keynote

The Julia Programming Language