Chapters:
1. Welcome and Link to Colab Notebook
2. Encoder versus Decoder Architectures
3. What is the GPT-4o architecture?
4. Recap of transformer for weather prediction
5. Pre-layer norm versus post-layer norm
6. RoPE vs Sinusoidal Positional Embeddings
7. Dummy Data Generation
8. Transformer Architecture Initialisation
9. Forward pass test
10. Training loop setup and test on dummy data
11. Weather data import
12. Training and Results Visualisation
13. Can the model predict the weather?
14. Is volatility in the loss graph a problem?
15. How to improve the model further?
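The pre-layer-norm versus post-layer-norm chapter contrasts where normalisation sits relative to the residual connection. A minimal NumPy sketch of the two block layouts (function names and the toy sublayer are illustrative, not taken from the video's notebook):

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Normalise each token vector to zero mean and unit variance.
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def post_ln_block(x, sublayer):
    # Post-LN (original Transformer): norm is applied after the residual
    # addition, so the residual stream itself is repeatedly renormalised.
    return layer_norm(x + sublayer(x))

def pre_ln_block(x, sublayer):
    # Pre-LN (GPT-style): norm is applied only to the sublayer's input,
    # leaving an unnormalised residual path, which tends to train more stably.
    return x + sublayer(layer_norm(x))
```

Here `sublayer` stands in for attention or the feed-forward network; the only difference between the two layouts is where `layer_norm` sits.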
Description:
Dive into the second part of a comprehensive video series on building transformers from scratch. Explore the differences between encoder and decoder architectures, understand the GPT-4o architecture, and revisit the transformer model for weather prediction. Learn about pre-layer norm versus post-layer norm, and compare RoPE with sinusoidal positional embeddings. Follow along as dummy data is generated, the transformer architecture is initialized, and a forward pass test is conducted. Set up and test a training loop on dummy data before importing real weather data. Visualize training results and evaluate the model's weather prediction capabilities. Discuss the implications of loss graph volatility and explore strategies for further model improvement. Access additional resources and a Colab notebook to enhance your learning experience.
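The description contrasts RoPE with sinusoidal positional embeddings: sinusoidal embeddings are added to the token vectors, while RoPE rotates pairs of features by a position-dependent angle. A rough NumPy sketch of both schemes (a simplification of what the notebook likely implements; the function names are mine):

```python
import numpy as np

def sinusoidal_pe(seq_len, d_model):
    # Classic additive embedding: interleaved sines and cosines at
    # geometrically spaced frequencies. Requires even d_model.
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model // 2)[None, :]
    angles = pos / (10000 ** (2 * i / d_model))
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

def rope(x):
    # RoPE: instead of adding an embedding, rotate each (even, odd)
    # feature pair of token t by an angle proportional to its position.
    seq_len, d = x.shape
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d // 2)[None, :]
    theta = pos / (10000 ** (2 * i / d))
    x1, x2 = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x1 * np.cos(theta) - x2 * np.sin(theta)
    out[:, 1::2] = x1 * np.sin(theta) + x2 * np.cos(theta)
    return out
```

Because RoPE is a pure rotation it preserves vector norms, and in attention the dot product between rotated queries and keys ends up depending only on their relative positions.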

Transformers from Scratch - Part 2: Building and Training a Weather Prediction Model

Trelis Research