1 - Intro
2 - Why not use the CLI?
3 - Looking at the ollama-py library
4 - Setting up Python environment
5 - Reviewing Ollama functions
6 - list
7 - show
8 - chat
9 - Looking at Streamlit
10 - Start writing our app
11 - App: user input
12 - App: message history
13 - App: adding ollama response
14 - App: choosing a model
15 - Introducing generators
16 - App: streaming responses
17 - App: review
18 - Where to find the code
19 - Thank you for 2k
Description:
Learn how to build a chat application using Python, Ollama-py, and Streamlit in this 22-minute tutorial. Explore the Ollama Python library's key methods, including list(), show(), and chat(). Set up a Python environment and dive into Streamlit for creating the user interface. Follow along as the instructor guides you through constructing the chat app step-by-step, covering user input, message history, Ollama responses, model selection, and streaming responses. Gain insights into Python generators and their application in the project. By the end, you'll have created a functional LLM chat app with a user-friendly interface.
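For orientation only, here is a minimal sketch of the kind of app the video builds, not the instructor's actual code: it combines the three ollama-py calls named above (list, show, chat) with Streamlit's chat widgets. The stream_chat helper, the widget labels, and the model handling are illustrative assumptions, and it presumes a recent Streamlit release (with st.write_stream) plus a local Ollama server that already has at least one model pulled.

```python
# Rough sketch of the pieces the tutorial covers; not the instructor's exact code.
# Assumes `pip install ollama streamlit` and a running Ollama server with a
# model already pulled (e.g. `ollama pull llama3`; the model name is only an example).
import ollama
import streamlit as st

# --- The three ollama-py calls reviewed in the video ---
# list(): every model the local Ollama server knows about
# (older library versions use the key "name" instead of "model")
models = [m["model"] for m in ollama.list()["models"]]

# show(): metadata for a single model (parameters, template, etc.)
# details = ollama.show(models[0])

# chat(): send a message history and get a reply; stream=True makes the call
# return a generator of partial chunks instead of one finished response.
def stream_chat(model, messages):
    """Generator yielding the assistant's reply piece by piece."""
    for chunk in ollama.chat(model=model, messages=messages, stream=True):
        yield chunk["message"]["content"]

# --- Minimal Streamlit UI ---
st.title("Ollama chat")

# Model selection
model = st.selectbox("Model", models)

# Message history lives in session_state so it survives Streamlit reruns
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the conversation so far
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.write(msg["content"])

# User input
if prompt := st.chat_input("Say something"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)

    # Stream the model's response into the UI as it arrives
    with st.chat_message("assistant"):
        reply = st.write_stream(stream_chat(model, st.session_state.messages))
    st.session_state.messages.append({"role": "assistant", "content": reply})
```

Saved as app.py, a script like this would be launched with `streamlit run app.py`.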

Building a Chat App with Ollama-py and Streamlit in Python

Decoder