1. Introduction
2. Probability
3. Classification
4. Regression vs. Classification
5. Ensemble Learning
6. Benefits of Ensemble Learning
7. Independent Classifiers
8. Pros and Cons
9. Randomness
10. When Does Bagging Work?
11. Boosting
12. Strong vs. Weak Learners
13. Basic Algorithm Training
14. Weighted Vote
15. Normalizing Constant
16. AdaBoost
17. Strong Non-Linear Classifier
18. Decision Stumps
Description:
Dive into a comprehensive 44-minute lecture on Logistic Regression and Ensemble Learning techniques, focusing on Bagging, Boosting, and AdaBoost. Explore the fundamentals of probability, classification, and the differences between regression and classification. Gain insights into Ensemble Learning methods, understanding their benefits and applications. Examine independent classifiers, their pros and cons, and the role of randomness in bagging. Discover when bagging is most effective and delve into boosting techniques, comparing strong and weak learners. Learn about the basic algorithm training process, weighted voting, and normalizing constants. Conclude with an in-depth look at AdaBoost and its application as a strong non-linear classifier using Decision Stumps. This lecture is part of a broader Artificial Intelligence and Machine Learning course, suitable for those with programming knowledge or experience with AI and ML tools.
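The core pipeline the lecture covers — decision stumps as weak learners, per-round example re-weighting with a normalizing constant, and a final weighted vote — can be sketched in a few dozen lines. This is a minimal illustrative implementation, not the lecture's own code: the 1-D toy dataset, the function names, and the exhaustive threshold search are all my own assumptions.

```python
import math

def stump_predict(x, threshold, polarity):
    """Decision stump: a one-split weak learner returning +1 or -1."""
    return polarity if x > threshold else -polarity

def best_stump(X, y, w):
    """Pick the (threshold, polarity) pair with the lowest weighted error."""
    best = None
    for threshold in X:                 # candidate splits at the data points
        for polarity in (+1, -1):
            err = sum(wi for xi, yi, wi in zip(X, y, w)
                      if stump_predict(xi, threshold, polarity) != yi)
            if best is None or err < best[0]:
                best = (err, threshold, polarity)
    return best

def adaboost(X, y, rounds=3):
    n = len(X)
    w = [1.0 / n] * n                   # start with uniform example weights
    ensemble = []                       # list of (alpha, threshold, polarity)
    for _ in range(rounds):
        err, threshold, polarity = best_stump(X, y, w)
        err = max(err, 1e-10)           # guard against division by zero
        alpha = 0.5 * math.log((1 - err) / err)   # this learner's vote weight
        # Re-weight: up-weight misclassified examples, down-weight correct
        # ones, then divide by the normalizing constant Z so weights sum to 1.
        w = [wi * math.exp(-alpha * yi * stump_predict(xi, threshold, polarity))
             for xi, yi, wi in zip(X, y, w)]
        Z = sum(w)
        w = [wi / Z for wi in w]
        ensemble.append((alpha, threshold, polarity))
    return ensemble

def predict(ensemble, x):
    """Weighted vote of all stumps; the sign of the sum decides the class."""
    score = sum(a * stump_predict(x, t, p) for a, t, p in ensemble)
    return 1 if score > 0 else -1

# A 1-D toy set no single stump can separate: class +1 only in the middle.
X = [1, 2, 3, 4, 5, 6]
y = [-1, -1, 1, 1, -1, -1]
model = adaboost(X, y, rounds=3)
print([predict(model, x) for x in X])   # matches y after three rounds
```

Even though each individual stump misclassifies at least two points here, three rounds of boosting combine them into a strong non-linear classifier that fits the training set exactly, which is the effect the lecture's closing sections describe.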

Logistic Regression and Ensemble Learning - Bagging and Boosting - AdaBoost

Software Engineering Courses - SE Courses