Chapters:
1. Intro
2. Preamble
3. Tim Spann
4. Rapid innovation in the LLM space
5. Enterprise-wide use cases for an LLM
6. Which model and when?
7. Ecosystem partnerships
8. Cloudera generative AI stack
9. Cloudera + LLM
10. Dataflow pipelines can help
11. Unstructured data with NiFi
12. Cloud ML/DL/AI/vector database services
13. NiFi 2.0.0 features
14. Flink SQL - Cloudera Machine Learning models
15. Flink SQL - NiFi - Hugging Face - Google Gemini
16. SSB UDF JS/Java + GenAI = real-time GenAI SQL
17. Gemma
18. Python processors
19. More articles
20. Demo
21. Thank you
Description:
Explore how to integrate generative AI into real-time streaming pipelines in this conference talk from Conf42 LLMs 2024. Dive into the rapid innovation occurring in the LLM space and discover enterprise-wide use cases for AI models. Learn about ecosystem partnerships and the Cloudera generative AI stack. Understand how dataflow pipelines can assist with unstructured data processing using NiFi. Examine cloud ML/DL/AI/vector database services and new features in NiFi 2.0.0. Investigate Flink SQL integration with Cloudera Machine Learning models, Hugging Face, and Google Gemini. Explore real-time generative AI SQL using SQL Stream Builder (SSB) UDFs in JavaScript/Java, along with the potential of Gemma. Gain insights into Python processors and watch a comprehensive demo showcasing the practical application of these concepts in streaming pipelines.
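
To make the "Python processors" and NiFi topics above more concrete, here is a minimal sketch of what a NiFi 2.x Python processor can look like. It is not code from the talk: the class, property, and attribute names (ExtractTextForLLM, Maximum Characters, llm.prompt.chars) are illustrative assumptions, and it presumes NiFi 2.x's Python extension API (nifiapi) is available on the processor's path.

```python
# Hypothetical sketch of a NiFi 2.x Python processor; names are illustrative,
# not taken from the talk. It trims unstructured text so a downstream
# processor can hand it to an LLM.
from nifiapi.flowfiletransform import FlowFileTransform, FlowFileTransformResult
from nifiapi.properties import PropertyDescriptor, StandardValidators


class ExtractTextForLLM(FlowFileTransform):
    class Java:
        implements = ['org.apache.nifi.python.processor.FlowFileTransform']

    class ProcessorDetails:
        version = '2.0.0'
        description = 'Prepares unstructured text for a downstream GenAI call.'
        tags = ['genai', 'llm', 'text']

    # Configurable upper bound on how much text is forwarded to the model.
    MAX_CHARS = PropertyDescriptor(
        name='Maximum Characters',
        description='Truncate the incoming text to at most this many characters.',
        default_value='4000',
        required=True,
        validators=[StandardValidators.POSITIVE_INTEGER_VALIDATOR]
    )

    def __init__(self, **kwargs):
        super().__init__()

    def getPropertyDescriptors(self):
        return [self.MAX_CHARS]

    def transform(self, context, flowfile):
        max_chars = int(context.getProperty(self.MAX_CHARS.name).getValue())
        text = flowfile.getContentsAsBytes().decode('utf-8', errors='replace')
        trimmed = text.strip()[:max_chars]
        # Forward the trimmed text; the attribute lets later processors see its size.
        return FlowFileTransformResult(
            relationship='success',
            contents=trimmed,
            attributes={'llm.prompt.chars': str(len(trimmed))}
        )
```

A processor like this would typically sit in a dataflow ahead of whatever step actually calls the model (Hugging Face, Google Gemini, or a Cloudera Machine Learning endpoint), in line with the pipeline topics listed above.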

Adding Generative AI to Real-Time Streaming Pipelines - Conf42 LLMs 2024

Conf42