1 - Introduction
2 - Adding APIM into the mix
3 - LLM supported
4 - Authentication to the LLM
5 - Adding LLM to APIM
6 - Azure Portal onboarding experience
7 - Some key GenAI capabilities in APIM
8 - Token limits
9 - Emit token metrics
10 - Load balancing and switching between instances
11 - Logging prompts
12 - Semantic caching
13 - Requirements
14 - How it works
15 - Demo
16 - Chaining?
17 - Great example scenario repo
18 - Summary
19 - Close
Description:
Explore Azure API Management integration with generative AI models in this comprehensive 39-minute video tutorial. Learn how to incorporate APIM into your AI workflow, understand supported language models, and master authentication processes. Discover key generative AI capabilities in APIM, including token limit management, metric emission, and load balancing between instances. Dive into advanced topics such as prompt logging and semantic caching, with a focus on requirements and practical implementation. Watch a hands-on demonstration, explore the potential for chaining operations, and review an excellent example scenario repository. Gain valuable insights into optimizing your AI-powered API management strategy with Azure.
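To make the gateway idea concrete, here is a minimal Python sketch of what a client call to an LLM fronted by APIM might look like. It is not taken from the video: the gateway URL, deployment name, API version, and environment variable are illustrative assumptions, and the caller authenticates to APIM with a subscription key while APIM policies handle the backend credential to Azure OpenAI.

    # Hypothetical sketch: calling an Azure OpenAI deployment through an APIM gateway.
    # Gateway URL, deployment name, API version, and env var are assumptions for illustration.
    import os
    import requests

    APIM_GATEWAY = "https://contoso-apim.azure-api.net/openai"  # assumed APIM front-end URL
    DEPLOYMENT = "gpt-4o"                                        # assumed model deployment name
    API_VERSION = "2024-02-01"                                   # assumed API version

    url = f"{APIM_GATEWAY}/deployments/{DEPLOYMENT}/chat/completions?api-version={API_VERSION}"

    headers = {
        # The caller authenticates to APIM with a subscription key; the credential
        # to the backend LLM (key or managed identity) is applied by APIM policy.
        "Ocp-Apim-Subscription-Key": os.environ["APIM_SUBSCRIPTION_KEY"],
        "Content-Type": "application/json",
    }

    body = {
        "messages": [
            {"role": "user", "content": "Summarize what API Management adds in front of an LLM."}
        ],
        "max_tokens": 200,
    }

    resp = requests.post(url, headers=headers, json=body, timeout=30)
    resp.raise_for_status()
    print(resp.json()["choices"][0]["message"]["content"])

In this setup the client never sees the Azure OpenAI key or endpoint directly, which is what lets APIM layer on the capabilities covered in the video, such as token limits, token metrics, load balancing across instances, prompt logging, and semantic caching.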

Azure API Management with Generative AI - Integrating LLMs and Advanced Features

John Savill's Technical Training