1. Intro
2. Outline Background and Overview
3. RandNLA: Randomized Numerical Linear Algebra
4. Basic RandNLA Principles
5. Element-wise Sampling
6. Row/column Sampling
7. Random Projections as Preconditioners
8. Approximating Matrix Multiplication
9. Subspace Embeddings
10. Two important notions: leverage and condition
11. Meta-algorithm for ℓ2-norm regression (2 of 3)
12. Meta-algorithm for ℓ2-norm regression (3 of 3)
13. Least-squares approximation: the basic structural result
14. Least-squares approximation: RAM implementations
15. Extensions to Low-rank Approximation (Projections)
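One of the outlined topics, approximating matrix multiplication via row/column sampling, can be sketched in a few lines. This is an illustrative example of the classic norm-proportional sampling estimator, not code from the lecture; the matrix sizes and variable names are our own:

```python
import numpy as np

# Approximate A @ B by sampling c column/row index pairs with probability
# proportional to ||A[:, i]|| * ||B[i, :]||, rescaling for unbiasedness,
# and summing the resulting rank-one terms.
rng = np.random.default_rng(1)
m, n, k, c = 100, 500, 5, 200   # rank-k factors so the product has structure

A = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))
B = A.T

# Sampling probabilities proportional to the column/row norm products.
probs = np.linalg.norm(A, axis=0) * np.linalg.norm(B, axis=1)
probs /= probs.sum()

idx = rng.choice(n, size=c, p=probs)
scale = 1.0 / (c * probs[idx])          # rescaling makes the estimator unbiased
approx = (A[:, idx] * scale) @ B[idx, :]

rel_err = np.linalg.norm(approx - A @ B) / np.linalg.norm(A @ B)
print(rel_err)
```

With these probabilities the expected squared Frobenius error is minimized over all importance-sampling distributions, which is why norm-proportional sampling is the standard choice in this setting.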
Description:
Explore the foundations of randomized numerical linear algebra (RandNLA) in this lecture from the Foundations of Data Science Boot Camp. Delve into key concepts such as element-wise sampling, row/column sampling, and random projections as preconditioners. Learn about approximating matrix multiplication, subspace embeddings, and the roles of leverage and condition in algorithm design. Examine meta-algorithms for ℓ2-norm regression, and discover structural results for least-squares approximation. Gain insights into RAM implementations and extensions to low-rank approximation using projections. Presented by Michael Mahoney of the International Computer Science Institute and UC Berkeley, this talk provides a deep dive into sampling techniques for linear algebra, statistics, and optimization.
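To give a flavor of the least-squares approximation ideas the description mentions, here is a minimal sketch-and-solve example. It is an illustration under our own assumptions (a Gaussian sketching matrix and arbitrary problem sizes), not the lecture's implementation:

```python
import numpy as np

# Sketch-and-solve least squares: compress a tall n x d problem with a
# random projection S, then solve the small sketched problem as a proxy
# for min_x ||A x - b||_2.
rng = np.random.default_rng(0)
n, d, s = 2000, 10, 200

A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)

# Gaussian sketching matrix with entry variance 1/s, so sketched norms of
# vectors in the span of [A, b] are preserved up to small distortion.
S = rng.standard_normal((s, n)) / np.sqrt(s)

x_exact = np.linalg.lstsq(A, b, rcond=None)[0]
x_sketch = np.linalg.lstsq(S @ A, S @ b, rcond=None)[0]

r_exact = np.linalg.norm(A @ x_exact - b)
r_sketch = np.linalg.norm(A @ x_sketch - b)
print(r_exact, r_sketch)
```

When the sketch is a subspace embedding for the span of [A, b], the sketched solution's residual is within a (1 + ε) factor of optimal; in practice, as the lecture's title on preconditioners suggests, such sketches are often used to precondition an iterative solver rather than solved directly.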

Sampling for Linear Algebra, Statistics, and Optimization I

Simons Institute