1 - Introduction
2 - Course information
3 - Why deep learning?
4 - The perceptron
5 - Perceptron example
6 - From perceptrons to neural networks
7 - Applying neural networks
8 - Loss functions
9 - Training and gradient descent
10 - Backpropagation
11 - Setting the learning rate
12 - Batched gradient descent
13 - Regularization: dropout and early stopping
14 - Summary
Description:
Dive into the foundations of deep learning with this comprehensive lecture from MIT's Introduction to Deep Learning course. Explore key concepts including perceptrons, neural networks, loss functions, gradient descent, backpropagation, and regularization techniques. Learn why deep learning is revolutionizing various fields and how to apply neural networks to real-world problems. Gain insights into crucial aspects of training neural networks, such as setting learning rates and implementing batched gradient descent. By the end of this 58-minute session, you will have a solid understanding of the fundamental principles driving modern deep learning applications.
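To give a flavor of the topics listed above (the perceptron, a loss function, and a gradient-descent update), here is a minimal sketch of a single perceptron trained with gradient descent on a toy dataset. This is not code from the lecture; the data, variable names, and hyperparameters are hypothetical, and the sketch assumes only NumPy.

```python
# Illustrative sketch only: one perceptron (sigmoid unit) trained with
# gradient descent on made-up data. Not taken from the course materials.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: 100 points with 2 features, binary labels (hypothetical).
X = rng.normal(size=(100, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Perceptron parameters: a weight vector and a bias.
w = np.zeros(2)
b = 0.0
lr = 0.1  # learning rate (cf. the "Setting the learning rate" chapter)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(100):
    # Forward pass: weighted sum of inputs followed by a sigmoid non-linearity.
    y_hat = sigmoid(X @ w + b)
    # Binary cross-entropy loss (one common choice of loss function).
    loss = -np.mean(y * np.log(y_hat + 1e-9) + (1 - y) * np.log(1 - y_hat + 1e-9))
    # Gradients via the chain rule (backpropagation for a single unit).
    grad_z = (y_hat - y) / len(y)
    grad_w = X.T @ grad_z
    grad_b = grad_z.sum()
    # Gradient-descent update of the parameters.
    w -= lr * grad_w
    b -= lr * grad_b

print("final loss:", loss, "weights:", w, "bias:", b)
```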

MIT Introduction to Deep Learning

Alexander Amini