1. Introduction
2. Supervised Learning
3. Multilayer Perceptron
4. Why Deep Networks
5. Backpropagation
6. Quadratic Loss
7. Historical Perspective
8. Convolution
9. Example
10. Nonlinearity
11. Pooling
12. Evolutionary Steps
13. Applications
14. Gradient Explosion
15. Recurrent Unit
16. Training Paradigm
Description:
Dive into the fundamentals of deep learning with this comprehensive tutorial from the MIT BMM Summer Course 2018. Led by Eugenio Piasini and Yen-Ling Kuo, it explores key concepts such as supervised learning, multilayer perceptrons, and the rationale behind deep networks. Gain insights into backpropagation, quadratic loss, and the historical development of neural networks. Delve into convolutional neural networks, covering nonlinearity, pooling, and their evolutionary steps. Discover various applications of deep learning, address challenges like gradient explosion, and learn about recurrent units and training paradigms. This 68-minute session provides a solid foundation for understanding and implementing deep learning techniques.

Deep Learning Tutorial

MITCBMM