Chapters:
1. Recording starts
2. Announcements
3. Random projection motivation recap
4. Random projection algorithm recap
5. Choosing random unit vectors
6. Random projection vs. PCA/SVD
7. Dimensionality reduction so far
8. Frequent directions motivation
9. Frequent items / Misra-Gries reminder
10. Frequent directions algorithm
11. Linformer: an example of using SVD and random projection
12. Lecture ends
Description:
Learn about dimensionality reduction techniques in this university lecture that explores random projections, PCA/SVD, and frequent directions algorithms. Begin with a thorough recap of random projection motivation and algorithmic implementation, including methods for choosing random unit vectors. Compare and contrast random projection with PCA/SVD approaches, before diving into an in-depth discussion of frequent directions. Understand the Misra-Gries algorithm for frequent items and its relationship to dimensionality reduction. Conclude with a practical example examining Linformer, which demonstrates the real-world application of SVD and random projection techniques in modern machine learning architectures.
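
To make the two sketching ideas named in the outline concrete, here is a minimal NumPy sketch (not taken from the lecture) of a Johnson-Lindenstrauss-style random projection onto random unit vectors and of Liberty's Frequent Directions, which periodically shrinks its sketch with an SVD in the same spirit that Misra-Gries decrements all counters. Function names, parameters such as k and ell, and the toy data are illustrative assumptions, not the lecture's code.

import numpy as np

def random_projection(X, k, seed=None):
    # Project the rows of X (n points in d dimensions) onto k random unit
    # directions. Normalizing Gaussian draws gives directions uniform on the
    # sphere; the sqrt(d / k) rescaling preserves squared norms in expectation.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    R = rng.normal(size=(d, k))
    R /= np.linalg.norm(R, axis=0)          # columns become random unit vectors
    return X @ R * np.sqrt(d / k)

def frequent_directions(X, ell):
    # Stream the rows of X into an ell x d sketch B so that B^T B approximates
    # X^T X (Liberty's Frequent Directions). When the sketch fills up, an SVD
    # shrinks every direction by the median squared singular value, the matrix
    # analogue of Misra-Gries decrementing all item counters at once.
    # Assumes ell <= d, the usual sketching regime.
    n, d = X.shape
    B = np.zeros((ell, d))
    for row in X:
        empty = np.flatnonzero(~B.any(axis=1))
        if empty.size == 0:
            _, s, Vt = np.linalg.svd(B, full_matrices=False)
            delta = s[ell // 2] ** 2
            s = np.sqrt(np.maximum(s ** 2 - delta, 0.0))
            B = s[:, None] * Vt             # at least half the rows are now zero
            empty = np.flatnonzero(~B.any(axis=1))
        B[empty[0]] = row
    return B

# Toy comparison on random correlated data.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 100)) @ rng.normal(size=(100, 100))

Y = random_projection(X, 20, seed=1)        # 500 points, now 20-dimensional
B = frequent_directions(X, 20)              # 20-row sketch of the 100-dim rows

# Random projection approximately preserves pairwise distances between points.
print(np.linalg.norm(X[0] - X[1]), np.linalg.norm(Y[0] - Y[1]))

# Frequent Directions approximates the Gram matrix X^T X; the classic guarantee
# bounds this spectral error by 2 * ||X||_F^2 / ell.
print(np.linalg.norm(X.T @ X - B.T @ B, 2), 2 * np.linalg.norm(X) ** 2 / 20)

Note that the two sketches compress along different axes: the random projection shortens each data point from d to k coordinates, while Frequent Directions keeps every coordinate but retains only ell rows whose outer products capture most of X^T X, which is why the lecture pairs it with the Misra-Gries frequent-items reminder.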

Data Mining: Frequent Directions and Random Projections - Spring 2023

UofU Data Science