Build a Chatbot with Kubernetes & Akamai Cloud
Author: Akamai Developer
Uploaded: 2025-10-21
Views: 546
Curious how to deploy your own AI chatbot with real-time inference, Kubernetes, and GPUs? Read the Docs to learn more.
You don’t need to be a cloud expert to follow along. In this video, Mike Ellison, Senior Developer Advocate at Akamai, walks through how he built and deployed his first LLM-powered chatbot using Akamai Cloud and Linode Kubernetes Engine (LKE). From managing GPUs to configuring serverless workloads with KServe, Hugging Face, and Knative, Mike shares his process and the tools that made the biggest impact.
This Video Discusses:
Using App Platform to deploy full-stack AI apps
Leveraging LKE with NVIDIA GPUs for inference
Integrating with Hugging Face and Llama 3
Simplifying DNS, SSL, and CI/CD with one-click infrastructure
Real-world experience getting an AI chatbot running in just over an hour
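To give a sense of the KServe piece mentioned above: serving a Hugging Face model on GPU-backed LKE nodes typically comes down to a single `InferenceService` manifest. The sketch below is an illustrative example, not the exact configuration from the video — the service name, model ID, and resource sizing are assumptions.

```yaml
# Hypothetical KServe InferenceService for a Llama 3 chat model.
# Assumes KServe's Hugging Face serving runtime is installed on the
# cluster and the node pool exposes NVIDIA GPUs via the device plugin.
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: llama3-chat        # assumed name
spec:
  predictor:
    model:
      modelFormat:
        name: huggingface   # use KServe's Hugging Face runtime
      args:
        - --model_name=llama3
        - --model_id=meta-llama/Meta-Llama-3-8B-Instruct  # assumed model
      resources:
        limits:
          nvidia.com/gpu: "1"   # schedule onto a GPU node
```

Applied with `kubectl apply -f`, a manifest like this lets Knative scale the inference pods (including to zero when idle), which is what makes the "serverless" part of the workflow possible.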
Read the Docs: https://ow.ly/JlfL50VORn8