1. Intro
2. Preamble
3. Agenda
4. What is operational excellence?
5. What are the design principles?
6. DevOps vs. MLOps vs. LLMOps
7. People, process, and technology
8. Model lifecycle
9. LLMOps can be different for each type of user
10. LLM selection criteria
11. Comparison of LLM customizations
12. Customizing model responses for your business
13. Amazon Bedrock
14. Knowledge Bases for Amazon Bedrock
15. Privately customize models with your data
16. Amazon Bedrock API with Amazon API Gateway
17. Amazon API Gateway models
18. AWS Lambda invoking the Amazon Bedrock API
19. Amazon Bedrock API from a generic application
20. Using the AWS SDK
21. Invocation logging
22. Metrics
23. Model evaluation
24. Building generative apps brings new challenges
25. Using NVIDIA NeMo Guardrails
26. Amazon Bedrock examples
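Agenda items 18–20 cover calling the Amazon Bedrock API from AWS Lambda or a generic application via the AWS SDK. A minimal boto3 sketch of that pattern is below; the model ID and the messages-style request body are assumptions (shown here for an Anthropic Claude model on Bedrock), not details taken from the talk:

```python
import json

# Assumed model ID for illustration; any Bedrock-hosted model with a
# messages-style request body follows the same invoke_model pattern.
MODEL_ID = "anthropic.claude-3-haiku-20240307-v1:0"

def build_request_body(prompt: str, max_tokens: int = 256) -> str:
    """Serialize a messages-API request body for a Bedrock invoke_model call."""
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })

def invoke(prompt: str) -> str:
    """Call Bedrock synchronously; requires AWS credentials and model access."""
    import boto3  # imported lazily so build_request_body stays dependency-free
    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(
        modelId=MODEL_ID,
        body=build_request_body(prompt),
        contentType="application/json",
        accept="application/json",
    )
    payload = json.loads(response["body"].read())
    return payload["content"][0]["text"]

if __name__ == "__main__":
    print(invoke("Summarize operational excellence in one sentence."))
```

The same `invoke` function can run unchanged inside an AWS Lambda handler or a standalone application, which is what makes the SDK route generic.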
Description:
Explore operational excellence for Large Language Models (LLMs) using Amazon Bedrock in this conference talk by Suraj Muraleedharan at Conf42 LLMs 2024. Delve into the design principles of operational excellence, compare DevOps, MLOps, and LLMOps, and understand the model lifecycle. Learn about LLM selection criteria, customization options, and architectural patterns for Amazon Bedrock. Discover how to implement knowledge bases, privately customize models, and integrate Amazon Bedrock API with various AWS services. Gain insights on invocation logging, metrics, model evaluation, and implementing guardrails for generative applications. Examine real-world examples and best practices for achieving operational excellence with Amazon Bedrock in LLM deployments.
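The description mentions invocation logging as one of the operational levers. A sketch of enabling Bedrock model-invocation logging to CloudWatch with boto3 follows; the log group name and IAM role ARN are placeholders you would supply for your own account:

```python
def build_logging_config(log_group: str, role_arn: str) -> dict:
    """Build the loggingConfig payload for Bedrock's
    put_model_invocation_logging_configuration API."""
    return {
        "cloudWatchConfig": {
            "logGroupName": log_group,
            "roleArn": role_arn,  # role Bedrock assumes to write logs
        },
        "textDataDeliveryEnabled": True,      # log prompts and completions
        "imageDataDeliveryEnabled": False,
        "embeddingDataDeliveryEnabled": False,
    }

def enable_invocation_logging(log_group: str, role_arn: str) -> None:
    """Apply the account-level logging configuration (requires AWS credentials)."""
    import boto3
    bedrock = boto3.client("bedrock")  # control-plane client, not bedrock-runtime
    bedrock.put_model_invocation_logging_configuration(
        loggingConfig=build_logging_config(log_group, role_arn)
    )

if __name__ == "__main__":
    enable_invocation_logging(
        "/bedrock/invocations",                        # placeholder log group
        "arn:aws:iam::123456789012:role/BedrockLogs",  # placeholder role ARN
    )
```

Once enabled, each `invoke_model` call is recorded in the log group, which feeds the metrics and evaluation steps the talk covers next.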

Operational Excellence for LLMs Using Amazon Bedrock - Conf42 LLMs 2024

Conf42