Contents:
1. Intro
2. The Rise of Deep Learning
3. What is Deep Learning?
4. Lecture Schedule
5. Final Class Project
6. Class Support
7. Course Staff
8. Why Deep Learning
9. The Perceptron: Forward Propagation
10. Common Activation Functions
11. Importance of Activation Functions
12. The Perceptron: Example
13. The Perceptron: Simplified
14. Multi Output Perceptron
15. Single Layer Neural Network
16. Deep Neural Network
17. Quantifying Loss
18. Empirical Loss
19. Binary Cross Entropy Loss
20. Mean Squared Error Loss
21. Loss Optimization
22. Computing Gradients: Backpropagation
23. Training Neural Networks is Difficult
24. Setting the Learning Rate
25. Adaptive Learning Rates
26. Adaptive Learning Rate Algorithms
27. Stochastic Gradient Descent
28. Mini-batches while training
29. The Problem of Overfitting
30. Regularization 1: Dropout
31. Regularization 2: Early Stopping
32. Core Foundation Review
Description:
Explore the foundations of deep learning in this introductory lecture from MIT's 6.S191 course. Delve into the rise of deep learning, understand its significance, and learn about perceptrons, neural networks, and activation functions. Discover how to quantify loss, optimize algorithms, and tackle challenges in training neural networks. Gain insights into adaptive learning rates, stochastic gradient descent, and strategies to prevent overfitting. Master the core concepts of deep learning to build a strong foundation for advanced applications in artificial intelligence.
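The perceptron's forward propagation, a core topic of this lecture, can be sketched in a few lines: a weighted sum of the inputs plus a bias, passed through a nonlinear activation. This is a minimal illustration assuming a sigmoid activation; the variable names are illustrative, not from the course materials.

```python
import math

def perceptron(x, w, b):
    # Weighted sum of inputs plus bias (the pre-activation z) ...
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    # ... passed through a sigmoid nonlinearity, yielding an output in (0, 1)
    return 1 / (1 + math.exp(-z))

# Example: two inputs, hand-picked weights and bias
y = perceptron([1.0, 2.0], [0.5, -0.25], 0.1)
```

Without the nonlinearity, stacking such units would collapse into a single linear map, which is why the activation function matters.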

MIT: Introduction to Deep Learning

Alexander Amini