1. Introduction
2. XGen Model
3. Pre-training Data
4. Training Methods
5. Evaluation Results
6. HuggingFace Repository
7. Google Colab Setup
8. Prompting XGen
9. Writing Jokes
10. Investing Advice
11. Coding
12. QA over Text
13. Conclusion
Description:
Explore the capabilities of XGen-7B, an open-source large language model with 8K token input capacity, in this comprehensive video tutorial. Learn about the model's architecture, pre-training data, and training methods. Examine its performance on NLP benchmarks, long sequence modeling tasks, and code generation. Follow along as the instructor demonstrates how to load the instruction model in Google Colab and test its abilities through various prompts, including answering questions, generating code, and comprehending documents. Gain insights into the model's strengths and limitations in areas such as joke writing, investment advice, and question-answering over text.
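To give a feel for the Colab portion of the video, here is a minimal sketch of loading the instruction-tuned XGen-7B checkpoint from HuggingFace and running a chat-style prompt. The model ID (Salesforce/xgen-7b-8k-inst), the "### Human / ### Assistant" prompt template, and the generation settings are assumptions based on the public model card, not necessarily the exact code shown in the video.

```python
# Sketch: load XGen-7B-instruct in a GPU Colab runtime and generate from a prompt.
# Requires: pip install transformers accelerate (accelerate enables device_map="auto").
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

MODEL_ID = "Salesforce/xgen-7b-8k-inst"  # assumed instruction-tuned checkpoint

# XGen ships a custom tokenizer, so trust_remote_code must be enabled.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
)

# Chat-style prompt format assumed from the model card; the video may use a variant.
prompt = (
    "A chat between a curious human and an artificial intelligence assistant.\n\n"
    "### Human: Write a short joke about large language models.\n"
    "### Assistant:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, top_p=0.9)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The same pattern extends to the document-QA demo: paste a long passage (up to the 8K-token context) into the prompt before the question and let the model answer from it.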

Long Sequence Modeling with up to 8K Tokens - Overview, Dataset & Google Colab Code

Venelin Valkov