1. Intro
2. Nearest neighbor
3. A nonparametric estimator
4. The data space
5. Statistical learning theory setup
6. Questions of interest
7. Consistency results under continuity
8. Universal consistency in RP
9. A key geometric fact
10. Universal consistency in metric spaces
11. Smoothness and margin conditions
12. A better smoothness condition for NN
13. Accurate rates of convergence under smoothness
14. Under the hood
15. Tradeoffs in choosing k
16. An adaptive NN classifier
17. A nonparametric notion of margin
18. Open problems
Description:
Explore the convergence of nearest neighbor classification in this 49-minute Members' Seminar presented by Sanjoy Dasgupta from the University of California, San Diego. Delve into the nonparametric estimator, the statistical learning theory setup, and consistency results under continuity. Examine universal consistency in RP and in metric spaces, smoothness and margin conditions, and accurate rates of convergence. Investigate tradeoffs in choosing k, adaptive NN classifiers, and nonparametric notions of margin. Conclude with open problems in the field of nearest neighbor classification.
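As context for the method the seminar analyzes, here is a minimal sketch of k-nearest-neighbor classification: majority vote among the k training points closest to a query. The function name `knn_predict` and the toy data are illustrative assumptions, not material from the talk.

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training points under Euclidean distance.
    `train` is a list of (point, label) pairs."""
    # Sort training pairs by distance to the query, keep the k closest.
    neighbors = sorted(train, key=lambda pl: math.dist(pl[0], query))[:k]
    # Majority vote among the neighbors' labels.
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy sample: two well-separated clusters.
train = [((0, 0), 'a'), ((0, 1), 'a'), ((1, 0), 'a'),
         ((5, 5), 'b'), ((5, 6), 'b'), ((6, 5), 'b')]
print(knn_predict(train, (0.5, 0.5)))  # 'a'
print(knn_predict(train, (5.5, 5.5)))  # 'b'
```

The choice of k is exactly the bias-variance tradeoff the outline's "Tradeoffs in choosing k" section refers to: small k tracks the data closely but is noisy, large k smooths but can blur class boundaries.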

Convergence of Nearest Neighbor Classification - Sanjoy Dasgupta

Institute for Advanced Study