1. Intro
2. Download and place the ControlNet 1.1 models in the proper directory (see the path check after this list)
3. Segment Anything extension
4. Install Visual Studio Build Tools if you hit errors regarding pycocotools
5. Generate a baseline reference using a traditional merged inpainting model
6. Using Grounding DINO to create a semi-supervised inpaint mask (see the masking sketch after this list)
7. Enable ControlNet 1.1 inpaint global harmonious
8. ControlNet 1.1 inpainting gotcha #1
9. ControlNet 1.1 gotcha #2
10. Tuning the inpainting parameters
11. Analyzing the new tuned outputs
12. Compositing ControlNet 1.1 inpaint output in Photoshop
13. ControlNet 1.1 inpaint without Grounding DINO
14. Exploring ControlNet 1.1 instruct pix2pix for targeted variations
15. Determining the limitations of ip2p
16. Using Segment Anything with ip2p
17. Applying ip2p + Grounding DINO to a PNGtuber
18. Analyzing the tuned PNGtuber results
19. ControlNet 1.1 Tile model overview
20. Applying the tile model to the shipbuilder illustration
21. Showing the thumbnail tile model generation
22. Introducing the image that will be used for tile model contextual upscaling
23. Checking a GitHub issue for more information regarding the tile model
24. Contextual upscaling with the ControlNet 1.1 tile model
25. Comparing upscaling methods: tile model, vanilla Ultimate SD Upscale, 4x-UltraSharp
26. Using the tile model upscale on the star-pupils chibi
27. Compositing the upscaled closed-mouth expression
28. Creating the closed-eyes expression
29. Closing thoughts
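Chapter 2 covers downloading the ControlNet 1.1 checkpoints and putting them where the webui can see them. As a rough aid, here is a minimal Python sketch that checks whether the release files from lllyasviel/ControlNet-v1-1 used in this tutorial are present; the directory below assumes the default AUTOMATIC1111 layout with the sd-webui-controlnet extension, so adjust the paths if your install differs.

```python
from pathlib import Path

# Assumed default model folder of the sd-webui-controlnet extension inside an
# AUTOMATIC1111 stable-diffusion-webui install; adjust if your layout differs.
WEBUI_ROOT = Path("stable-diffusion-webui")
CONTROLNET_MODELS = WEBUI_ROOT / "extensions" / "sd-webui-controlnet" / "models"

# ControlNet 1.1 checkpoints referenced in this tutorial (standard release names).
EXPECTED = [
    "control_v11p_sd15_inpaint.pth",
    "control_v11e_sd15_ip2p.pth",
    "control_v11f1e_sd15_tile.pth",
]

for name in EXPECTED:
    status = "found" if (CONTROLNET_MODELS / name).is_file() else "MISSING"
    print(f"{status:7s} {CONTROLNET_MODELS / name}")
```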
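Chapters 6 and 16 build inpaint masks by pairing a Grounding DINO text-prompted bounding box with Segment Anything. The extension does this from the UI, but a minimal standalone sketch with the segment_anything package looks roughly like this; the image path, SAM checkpoint, and the hard-coded box (which would normally come from a Grounding DINO prompt) are placeholders.

```python
import numpy as np
from PIL import Image
from segment_anything import sam_model_registry, SamPredictor

# Placeholder inputs: in the extension, the box comes from a Grounding DINO
# text prompt such as "face"; here it is hard-coded for illustration.
image = np.array(Image.open("character.png").convert("RGB"))
box = np.array([120, 80, 380, 360])  # x0, y0, x1, y1 from the detector

# Load a SAM checkpoint (variant and filename are assumptions; use the one you downloaded).
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
predictor = SamPredictor(sam)
predictor.set_image(image)

# Ask SAM for a single mask constrained to the detected box, then save it for inpainting.
masks, scores, _ = predictor.predict(box=box, multimask_output=False)
Image.fromarray((masks[0] * 255).astype(np.uint8)).save("inpaint_mask.png")
```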
Description:
Dive into an in-depth 57-minute tutorial on applying ControlNet 1.1 to character design, focusing on Stable Diffusion techniques. Learn how to set up ControlNet 1.1 models, use the Segment Anything extension, and troubleshoot common issues. Explore advanced inpainting techniques using Grounding DINO and global harmonious inpainting. Discover the potential of instruct pix2pix for targeted variations and its application to PNGtuber creation. Gain insights into the ControlNet 1.1 Tile model for contextual upscaling and compare various upscaling methods. Master the art of compositing and expression creation for character designs. Perfect for digital artists, concept artists, and AI art enthusiasts looking to enhance their character design workflow using cutting-edge machine learning tools.
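The global harmonious inpainting workflow described above can also be driven headlessly through the webui's API. The sketch below is an assumption-heavy illustration: the /sdapi/v1/img2img endpoint and the alwayson_scripts ControlNet block follow the shape the sd-webui-controlnet extension documents, but field names (for example "input_image" vs. "image") and the exact model string vary between versions, so verify them against your install.

```python
import base64
import requests

def b64(path):
    """Read a file and return it as a base64 string, as the webui API expects."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode()

payload = {
    "prompt": "character portrait, clean lineart",  # placeholder prompt
    "init_images": [b64("character.png")],
    "denoising_strength": 0.75,
    "alwayson_scripts": {
        "controlnet": {
            "args": [{
                "module": "inpaint_global_harmonious",
                "model": "control_v11p_sd15_inpaint [ebff9138]",  # hash may differ
                "input_image": b64("character.png"),
                "mask": b64("inpaint_mask.png"),
                "weight": 1.0,
            }]
        }
    },
}

r = requests.post("http://127.0.0.1:7860/sdapi/v1/img2img", json=payload, timeout=600)
r.raise_for_status()
print(len(r.json()["images"]), "image(s) returned")
```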

Stable Diffusion - Applying ControlNet to Character Design - Part 2

kasukanra