1 - Video Starts
2 - Experimentation Steps
3 - Bloom @ Hugging Face
4 - Testing GPT-2
5 - Testing Neo-GPT-2
6 - Setting Google Colab
7 - Downloading Bloom Model
8 - Setting Text Prompt
9 - Tokenize Text
10 - Applying Bloom Model
11 - GPU/CPU Mixup Error
12 - Correct Model Error
13 - GPU Memory Exception
14 - Encoded Result Output
15 - Decoding Results
16 - Result as Text
17 - Export Notebook to GitHub
Description:
Dive into a hands-on programming tutorial that guides you through the step-by-step implementation of the Bloom Large Language Text Generation Model. Build your own text generation application in Google Colab while learning to troubleshoot common issues such as GPU/CPU device conflicts, attribute errors, and GPU memory exceptions. Follow along as you set up the environment, download the Bloom model, tokenize text, and apply the model to generate results. Gain practical experience with transformers, machine learning, and deep learning concepts while working with cutting-edge natural language processing technology. By the end of this 19-minute video, you'll have created a functional text generation application and learned valuable problem-solving skills for working with large language models.
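The core workflow the video walks through (load the Bloom model from Hugging Face, tokenize a prompt, run generation on the GPU, and decode the result) can be sketched in a few lines of Python. This is a minimal sketch, assuming the Hugging Face transformers library and the bigscience/bloom-1b7 checkpoint; the video may use a different Bloom variant, prompt, or generation settings.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Assumed checkpoint: a smaller Bloom variant that fits in a Colab GPU.
model_name = "bigscience/bloom-1b7"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Keep the model and the inputs on the same device; a mismatch here is the
# kind of GPU/CPU mixup error the chapter list refers to.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

prompt = "Artificial intelligence will"  # example prompt, not the one from the video
inputs = tokenizer(prompt, return_tensors="pt").to(device)

# Generate encoded token IDs, then decode them back into readable text.
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_k=50)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

If the chosen checkpoint is too large for the available GPU, generation can raise an out-of-memory exception like the one shown in the video; switching to a smaller Bloom variant or a smaller max_new_tokens value is the usual workaround.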

Bloom - Text Generation Large Language Model - LLM: Step by Step Implementation

Prodramp