Breakout session
2025-11-21 at 18:15
Fast and flexible inference on open-source AI models at scale
Event: Microsoft Ignite 2025
Description
Run open-source AI models of your choice with flexibility, from local environments to cloud deployments, using Azure Container Apps and serverless GPUs for fast, cost-efficient inference. You will also learn how AKS powers scalable, high-performance LLM operations with fine-grained control, giving you the confidence to deploy your models your way. You will leave with a clear path to running custom and open-source models with agility and cost clarity.
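As a concrete illustration of the AKS path mentioned above, the sketch below deploys an open-source LLM behind an OpenAI-compatible endpoint using the vLLM serving image on a GPU node pool. This is a minimal sketch, not material from the session: the model name, namespace, node-pool label, replica count, and Service wiring are illustrative assumptions.

```yaml
# Illustrative sketch only: a vLLM-based open-source model server on an AKS GPU node pool.
# Model, namespace, labels, and sizing are assumptions, not session content.
# Assumes the node pool has NVIDIA GPUs and the NVIDIA device plugin installed.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: oss-llm-inference
  namespace: llm-serving                      # assumed namespace
spec:
  replicas: 1
  selector:
    matchLabels:
      app: oss-llm-inference
  template:
    metadata:
      labels:
        app: oss-llm-inference
    spec:
      nodeSelector:
        agentpool: gpupool                    # assumed name of the AKS GPU node pool
      containers:
        - name: vllm
          image: vllm/vllm-openai:latest      # official vLLM OpenAI-compatible server image
          args:
            - "--model"
            - "mistralai/Mistral-7B-Instruct-v0.2"   # example open-source model
          ports:
            - containerPort: 8000             # vLLM's default API port
          resources:
            limits:
              nvidia.com/gpu: 1               # one GPU per replica
---
apiVersion: v1
kind: Service
metadata:
  name: oss-llm-inference
  namespace: llm-serving
spec:
  selector:
    app: oss-llm-inference
  ports:
    - port: 80
      targetPort: 8000
```

Deployed this way, the model exposes an OpenAI-compatible API inside the cluster; the Azure Container Apps path covered in the session instead runs the same kind of container on serverless GPU (Consumption GPU) workload profiles, so there is no node pool to manage.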