1. Fine-tuning GPT-3 for fiction
2. Generating fiction from a single story premise
3. GPT-3 copying a style verbatim
4. Writing a novel: process, what works, and what doesn't
5. Creating a fan-fiction generator
6. Generating fan fiction with a machine
7. Training a machine to write a story
8. Frankenstein and Alice in Wonderland outlines
9. Generating a story outline with Auto Muse
10. The difference between working memory and recall
11. The need for a task set in GPT-3
12. Generating a live summary of a story
13. Preprocessing text data for GPT-3
14. Generating a novel with Auto Muse
15. Writing a novel one paragraph at a time
16. The increasing length of the story
17. The process of creating a machine that can summarize a novel
18. Building a story one chunk at a time
19. The length of each summary chunk
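The later chapters describe building the story one chunk at a time while keeping a running summary within a bounded length. A minimal sketch of that loop is below; the function names, the paragraph-based chunking, and the truncating `summarize` placeholder (which would be a GPT-3 call in practice) are illustrative assumptions, not details taken from the video.

```python
# Sketch of chunk-by-chunk story processing with a rolling summary.
# summarize() is a stand-in for a real GPT-3 summarization call.

def split_into_chunks(text: str, max_chars: int = 2000) -> list[str]:
    """Split text on paragraph boundaries into chunks under max_chars."""
    chunks, current = [], ""
    for para in text.split("\n\n"):
        if current and len(current) + len(para) + 2 > max_chars:
            chunks.append(current)
            current = para
        else:
            current = f"{current}\n\n{para}" if current else para
    if current:
        chunks.append(current)
    return chunks

def summarize(text: str, max_chars: int = 400) -> str:
    """Placeholder summarizer: truncate; a real system would call GPT-3."""
    return text[:max_chars]

def rolling_summary(chunks: list[str]) -> str:
    """Fold each new chunk into the summary so far, keeping it bounded."""
    summary = ""
    for chunk in chunks:
        summary = summarize(summary + "\n" + chunk)
    return summary
```

The key design point from the outline is that the summary, not the full story, is what stays in the prompt, which keeps the context size flat as the novel grows.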
Description:
Explore the process of fine-tuning GPT-3 to write an entire coherent novel in this 41-minute video. Learn about generating fiction from a single story premise, copying writing styles, and creating fan fiction generators. Discover techniques for training AI to write stories, generate outlines, and summarize content in real-time. Delve into the intricacies of working memory, task sets, and preprocessing text data for GPT-3. Gain insights on writing novels paragraph by paragraph, managing increasing story lengths, and building narratives chunk by chunk. This comprehensive tutorial covers various aspects of AI-powered novel writing, providing valuable knowledge for those interested in leveraging machine learning for creative writing.
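The preprocessing step mentioned above can be illustrated with the JSONL prompt/completion format that GPT-3's legacy fine-tuning API expected. How the premise and paragraphs are paired here is an assumption for illustration, not the video's actual pipeline; the `\n\n###\n\n` separator and leading-space completion follow OpenAI's old data-formatting conventions.

```python
import json

def to_finetune_records(premise: str, paragraphs: list[str]) -> list[dict]:
    """Pair a story premise (then each paragraph) with the next paragraph."""
    records, context = [], premise
    for para in paragraphs:
        records.append({
            "prompt": context + "\n\n###\n\n",  # prompt/completion separator
            "completion": " " + para + "\n",    # leading space, newline stop
        })
        context = para  # the next prompt continues from the latest paragraph
    return records

def write_jsonl(records: list[dict], path: str) -> None:
    """Write one JSON object per line, as the fine-tuning API required."""
    with open(path, "w", encoding="utf-8") as f:
        for rec in records:
            f.write(json.dumps(rec) + "\n")
```

A file produced this way would then be uploaded and referenced when creating the fine-tune job.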

Fine-Tune GPT-3 to Write an Entire Coherent Novel - Part 1

David Shapiro ~ AI