Chapters:
1. Intro
2. Euclidean k-means and k-medians
3. k-means under dimension reduction
4. k-medians under dimension reduction
5. Plan
6. Our result for k-means
7. Challenges
8. Warm-up
9. Problem & Notation
10. Distortion graph
11. Cost of a cluster
12. Everywhere-sparse edges
13. (1-0) non-distorted core
14. All clusters are large
15. Main combinatorial lemma
16. Edges incident on outliers
17. Summary
Description:
Explore the intricacies of k-means and k-medians clustering algorithms under dimension reduction in this 42-minute lecture by Yury Makarychev from Toyota Technological Institute at Chicago. Delve into the Euclidean k-means and k-medians concepts, examining their behavior under dimension reduction. Investigate the challenges, warm-up exercises, and problem notation associated with these clustering techniques. Analyze the distortion graph, cost of clusters, and the concept of everywhere-sparse edges. Understand the (1-0) non-distorted core and the importance of large clusters. Examine the main combinatorial lemma and edges incident on outliers. Gain valuable insights into robust and high-dimensional statistics through this comprehensive talk presented at the Simons Institute.
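
As a rough illustration of what "k-means under dimension reduction" refers to (this sketch is not from the lecture; the data, the clustering, and the parameters n, d, k, and target_dim are arbitrary choices for the example), the snippet below compares the k-means cost of a fixed clustering before and after a random Gaussian, Johnson-Lindenstrauss-style projection:

```python
# Illustrative sketch only: compare the k-means cost of a fixed clustering
# in the original space and after a random Gaussian projection.
import numpy as np

rng = np.random.default_rng(0)
n, d, k, target_dim = 500, 100, 5, 20  # arbitrary example parameters

# Synthetic data and an arbitrary assignment of points to k clusters.
X = rng.normal(size=(n, d))
labels = rng.integers(0, k, size=n)

def kmeans_cost(points, labels, k):
    """Sum of squared distances from each point to its cluster centroid.
    (The k-medians cost would instead sum non-squared distances to a center.)"""
    cost = 0.0
    for c in range(k):
        cluster = points[labels == c]
        if len(cluster) > 0:
            centroid = cluster.mean(axis=0)
            cost += np.sum((cluster - centroid) ** 2)
    return cost

# Random Gaussian projection to target_dim dimensions, scaled so that
# squared distances are preserved in expectation.
G = rng.normal(size=(d, target_dim)) / np.sqrt(target_dim)
X_proj = X @ G

print("k-means cost in original space :", kmeans_cost(X, labels, k))
print("k-means cost after projection  :", kmeans_cost(X_proj, labels, k))
```

The question studied in the talk is how well such projections preserve clustering costs; the sketch above only shows the setup being referred to, not the speaker's analysis.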

K-Means and K-Medians Under Dimension Reduction

Simons Institute