Running Ollama on Azure Container Apps

Part 2 of "Running LLMs & Agents on Azure Container Apps"

In Part 1, I made the case for why Azure Container Apps hits the sweet spot for self-hosted LLM inference. Now let's actually build it. By the end of this post, you'll have Ollama running in Azure, serving Llama 3, with persistent model storage and a secure endpoint. The basic deployment takes ab
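To set expectations, here is a minimal sketch of the kind of deployment we'll arrive at. The resource names (rg-ollama, ollama-env, ollama), region, and CPU/memory sizes are placeholders I've chosen for illustration; the walkthrough below layers persistent model storage and endpoint security on top of this bare-bones version.

```bash
# Create a resource group and a Container Apps environment to host the app
az group create --name rg-ollama --location eastus
az containerapp env create \
  --name ollama-env \
  --resource-group rg-ollama \
  --location eastus

# Deploy the official Ollama image, exposing its default port 11434
az containerapp create \
  --name ollama \
  --resource-group rg-ollama \
  --environment ollama-env \
  --image ollama/ollama:latest \
  --target-port 11434 \
  --ingress external \
  --cpu 4 --memory 8Gi
```

This alone gives you a reachable Ollama endpoint, but models are lost on every restart and the ingress is wide open, which is exactly what the rest of the post fixes.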