Chapters:
1. Intro
2. K-NN for classification (see the sketch after this list)
3. Choice of k matters
4. Feature scaling matters
5. Distance metric matters
6. K-NN with categorical data
7. K-NN for regression
8. Limitations: expensive to compute with large data sets; sensitive to feature scaling; sensitive to distance metric.
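The lecture itself walks through these steps; as a rough illustration of the ideas in the chapter list (my own sketch, not Brandon Rohrer's code), a minimal k-NN classifier might look like the following, assuming NumPy, Euclidean distance, and a hypothetical toy data set:

```python
import numpy as np

def knn_classify(X_train, y_train, x_query, k=3):
    """Classify one query point by majority vote among its k nearest neighbors.

    Rows of X_train are feature vectors, y_train holds integer class labels.
    Euclidean distance is used here; a different metric can change the result.
    """
    # Distance from the query point to every training point
    dists = np.linalg.norm(X_train - x_query, axis=1)
    # Indices of the k closest training points
    nearest = np.argsort(dists)[:k]
    # Majority vote among their labels
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Tiny hypothetical data set: two clusters with labels 0 and 1
X = np.array([[1.0, 2.0], [1.5, 1.8], [5.0, 8.0], [6.0, 9.0]])
y = np.array([0, 0, 1, 1])

# Feature scaling matters: standardize features before measuring distances
mean, std = X.mean(axis=0), X.std(axis=0)
X_scaled = (X - mean) / std
query = (np.array([1.2, 1.9]) - mean) / std

print(knn_classify(X_scaled, y, query, k=3))  # expected: 0
```

Changing k in this sketch shows why its choice matters: k=1 follows the single closest point (and its noise), while a large k can drown the local structure in the majority class.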
Description:
Explore the fundamentals of the k-nearest neighbors algorithm in this 26-minute video lecture from the End to End Machine Learning School. Discover how k-NN works for both classification and regression tasks, understand the importance of choosing the right k value, and learn about feature scaling and distance metrics. Delve into the application of k-NN with categorical data and examine its limitations, including computational expense with large datasets and sensitivity to feature scaling and distance metrics.
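The limitations mentioned above follow from the same mechanics: every prediction scans the whole training set, so cost grows with its size, and the distances driving the vote depend directly on feature scales and on the chosen metric. As a hedged sketch of the regression case (again my own illustration, not code from the lecture), the prediction is simply the average target of the k nearest neighbors:

```python
import numpy as np

def knn_regress(X_train, y_train, x_query, k=3):
    """Predict a continuous value as the mean target of the k nearest neighbors."""
    dists = np.linalg.norm(X_train - x_query, axis=1)  # Euclidean distances
    nearest = np.argsort(dists)[:k]                    # k closest training points
    return y_train[nearest].mean()                     # average their targets

# Hypothetical 1-D example: noisy samples of y = 2x
X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0]])
y = np.array([0.1, 2.2, 3.9, 6.1, 8.0])
print(knn_regress(X, y, np.array([2.5]), k=2))  # ~5.0, the mean of the two nearest targets
```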

How K-Nearest Neighbors Works

Brandon Rohrer