Least-squares approximation: the basic structural result
Least-squares approximation: RAM implementations
Extensions to low-rank approximation (projections)
Description:
Explore the foundations of randomized numerical linear algebra in this lecture from the Foundations of Data Science Boot Camp. Delve into key concepts like element-wise sampling, row/column sampling, and random projections as preconditioners. Learn about approximating matrix multiplication, subspace embeddings, and the roles of leverage scores and condition numbers in these algorithms. Examine meta-algorithms for ℓ2-norm and ℓ1-norm regression, and discover structural results for least-squares approximation. Gain insights into RAM implementations and extensions to low-rank approximation using projections. Presented by Michael Mahoney from the International Computer Science Institute and UC Berkeley, this comprehensive talk provides a deep dive into sampling techniques for linear algebra, statistics, and optimization.
Sampling for Linear Algebra, Statistics, and Optimization I
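As a rough illustration of the sketch-and-solve idea behind the least-squares topics listed above, here is a minimal NumPy sketch. It is not taken from the lecture: the problem sizes, the sketch dimension s, and the use of a dense Gaussian sketching matrix are illustrative assumptions, chosen only to show how a random projection yields an approximate ℓ2 regression solution.

```python
import numpy as np

# Minimal sketch-and-solve illustration (parameters are assumptions, not from the lecture):
# approximate min_x ||A x - b||_2 by solving the smaller problem min_x ||S A x - S b||_2,
# where S is an (s x n) Gaussian random projection acting as a subspace embedding.
rng = np.random.default_rng(0)

n, d = 10_000, 20                       # tall-and-skinny least-squares problem
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)

# Exact solution, for reference.
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)

# Gaussian sketch: in theory s = O(d / eps^2) rows give a (1 + eps) guarantee;
# here s is just a convenient constant multiple of d.
s = 20 * d
S = rng.standard_normal((s, n)) / np.sqrt(s)
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

# Compare the residual norms achieved by the exact and sketched solutions.
print("exact residual:   ", np.linalg.norm(A @ x_exact - b))
print("sketched residual:", np.linalg.norm(A @ x_sketch - b))
```

In practice, the dense Gaussian projection used here for simplicity is typically replaced by fast structured transforms or by row sampling driven by (approximate) leverage scores, which is the kind of trade-off the RAM implementations discussed in the talk address.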