Continual Learning and Catastrophic Forgetting
Paul Hand

Lecture outline:

1. Introduction
2. Context for continual learning
3. Training on new data
4. Catastrophic forgetting
5. Training from scratch
6. Replaying training data
7. Evaluating continual learning
8. Incremental class learning
9. Multimodal class learning
10. Strategies for continual learning
11. Regularization approaches
12. Learning without forgetting
13. Regularization
14. Elastic Weight Consolidation
15. Bayesian Learning Perspective
16. Progressive Neural Networks
17. Generative replay
18. Complementary Learning Systems
Description:
Explore continual learning and catastrophic forgetting in deep neural networks in this 42-minute lecture. Delve into the context of the problem, evaluation methods, and algorithms based on regularization, dynamic architectures, and Complementary Learning Systems. Examine data permutation tasks, incremental task learning, multimodal learning, the Learning without Forgetting algorithm, Elastic Weight Consolidation, Progressive Neural Networks, and generative replay. Gain insights from Northeastern University's CS 7150 Deep Learning course, with references to key research papers in the field, and access the accompanying lecture notes for a comprehensive understanding of this crucial topic in machine learning.
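As a companion to the regularization-based methods named above, here is a minimal sketch of an Elastic Weight Consolidation penalty in PyTorch. It is not code from the lecture: the function name ewc_penalty, the fisher dictionary (a per-parameter diagonal Fisher information estimate), old_params (a snapshot of the weights learned on the previous task), and the strength lam are all assumed for illustration.

    import torch

    def ewc_penalty(model, fisher, old_params, lam=0.4):
        """Quadratic EWC penalty: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2.

        fisher and old_params are hypothetical dicts keyed by parameter name,
        holding a diagonal Fisher estimate and a snapshot of the weights
        learned on the previous task, respectively.
        """
        penalty = 0.0
        for name, p in model.named_parameters():
            # Penalize movement away from the old solution, weighted by how
            # important each weight was for the previous task (its Fisher value).
            penalty = penalty + (fisher[name] * (p - old_params[name]) ** 2).sum()
        return (lam / 2.0) * penalty

    # Usage sketch: loss on the new task plus the consolidation term.
    # loss = criterion(model(x_new), y_new) + ewc_penalty(model, fisher, old_params)

The design point is that the quadratic term lets important weights (large Fisher values) stay anchored near the old task's solution while unimportant ones remain free to adapt, which is how EWC mitigates catastrophic forgetting without replaying old data.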
