1. Introduction
2. Data Representation
3. Derivatives
4. Reshape
5. Backpropagation
6. Creating an MLP
7. Implementing backpropagation
8. Testing backpropagation
9. Implementing gradient descent
10. Applying gradient descent
11. Printing weights
12. Testing
13. Gradient Descent
14. Train
15. Train MLP
Description:
Dive into a comprehensive video tutorial on implementing backpropagation and gradient descent from scratch in Python. Learn how to train a neural network to add pairs of numbers. Explore key concepts including data representation, derivatives, reshaping arrays, and building a Multilayer Perceptron (MLP). Follow along as the instructor implements backpropagation, tests it, and applies gradient descent to train the network. Gain hands-on experience training an MLP and understand the mechanics of neural network training. The accompanying code is available on GitHub for further practice and experimentation.
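The approach the tutorial describes can be sketched as follows. This is a minimal illustration, not the instructor's actual code (which is on GitHub): a small MLP with one sigmoid hidden layer, trained with hand-written backpropagation and gradient descent to learn the sum of two numbers. The layer sizes, learning rate, and input range are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MLP:
    def __init__(self, n_in=2, n_hidden=5, n_out=1):
        # Small random weights for each layer (sizes are illustrative).
        self.w1 = rng.standard_normal((n_in, n_hidden)) * 0.5
        self.w2 = rng.standard_normal((n_hidden, n_out)) * 0.5

    def forward(self, x):
        # Cache activations; backpropagation needs them later.
        self.x = x
        self.h = sigmoid(x @ self.w1)
        self.y = sigmoid(self.h @ self.w2)
        return self.y

    def backward(self, error):
        # error = target - prediction; propagate it backwards through
        # the sigmoid derivative y * (1 - y) of each layer.
        delta2 = error * self.y * (1 - self.y)
        self.dw2 = self.h.T @ delta2
        delta1 = (delta2 @ self.w2.T) * self.h * (1 - self.h)
        self.dw1 = self.x.T @ delta1

    def gradient_descent(self, lr=0.1):
        # With error defined as (target - prediction), adding the
        # gradients moves the weights downhill on the squared error.
        self.w1 += lr * self.dw1
        self.w2 += lr * self.dw2

# Inputs in [0, 0.5] so the targets (their sums) fit inside the
# sigmoid output's (0, 1) range.
inputs = rng.uniform(0, 0.5, size=(1000, 2))
targets = inputs.sum(axis=1, keepdims=True)

mlp = MLP()
for epoch in range(100):
    for x, t in zip(inputs, targets):
        x, t = x.reshape(1, -1), t.reshape(1, -1)
        pred = mlp.forward(x)
        mlp.backward(t - pred)
        mlp.gradient_descent(lr=0.1)

# After training, the prediction should be close to 0.4 + 0.4.
print(mlp.forward(np.array([[0.4, 0.4]])))
```

Note the role of reshaping (one of the chapter topics): each sample is reshaped to a 1×2 row vector so the matrix products in `forward` and `backward` line up.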

Training a Neural Network - Implementing Backpropagation and Gradient Descent from Scratch

Valerio Velardo - The Sound of AI