Kong Gateway Operator & AI Gateway Workshop: From Traditional Workloads to GenAI
This two-hour workshop series gives you hands-on experience with the Kong Gateway Operator and Kong AI Gateway. Begin by mastering Kubernetes ingress automation with the Kong Gateway Operator, then advance to building secure AI infrastructure with Kong AI Gateway. This comprehensive session equips you with practical skills for both traditional application deployment and cutting-edge GenAI implementation—all within the Kong ecosystem.
This workshop is for DevOps engineers, platform architects, and AI developers looking to streamline operations and enhance security across both traditional and AI workloads.
Workshop Schedule:
Workshop 1: Automating Kubernetes Ingress Management with Kong Gateway Operator
Learn how to simplify and automate Kubernetes ingress management using the Kong Gateway Operator for efficient application deployment and scaling. From gateway deployment essentials to advanced lifecycle management techniques, you'll discover how to optimize your infrastructure, implement automated rollouts, and adapt to changing traffic patterns.
This workshop covers:
Kubernetes Gateway API fundamentals and seamless deployment of multiple ingress gateways
Advanced gateway management, including canary deployments and blue/green rollouts
Auto-scaling, monitoring, and flexible deployment strategies for gateway resources
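As a taste of the Gateway API fundamentals covered in this workshop, here is a minimal sketch of deploying a gateway with the Kong Gateway Operator: a GatewayClass pointing at the operator's controller, plus a Gateway requesting an HTTP listener. Resource names and the namespace are illustrative; check the Kong Gateway Operator documentation for the current schema and controller name.

```yaml
# Illustrative sketch — assumes the Kong Gateway Operator is installed in the cluster.
apiVersion: gateway.networking.k8s.io/v1
kind: GatewayClass
metadata:
  name: kong                              # illustrative name
spec:
  controllerName: konghq.com/gateway-operator
---
apiVersion: gateway.networking.k8s.io/v1
kind: Gateway
metadata:
  name: kong
  namespace: default                      # illustrative namespace
spec:
  gatewayClassName: kong
  listeners:
    - name: http
      protocol: HTTP
      port: 80
```

Once the Gateway is programmed, HTTPRoute resources attached to it handle application routing — the workshop builds on this foundation for canary and blue/green rollouts.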
Workshop 2: Run and Secure LLM Traffic with Kong AI Gateway
Learn how to build and manage robust AI infrastructure using Kong AI Gateway for efficient GenAI application development and deployment. From AI Gateway essentials to advanced management techniques, you'll discover how to optimize your applications, implement governance and security measures, and adapt to various deployment environments.
This workshop covers:
AI infrastructure fundamentals and seamless integration of GenAI applications
Advanced AI management, including multi-LLM routing and prompt engineering
Governance, security, and flexible deployment strategies for AI resources
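To preview the kind of configuration covered here, below is a minimal sketch of Kong's AI Proxy plugin declared as a Kubernetes KongPlugin resource, routing chat traffic to an upstream LLM provider. The provider, model name, and credential placeholder are illustrative; consult the Kong AI Gateway plugin reference for the current configuration schema.

```yaml
# Illustrative sketch — assumes Kong is running with the AI Proxy plugin available.
apiVersion: configuration.konghq.com/v1
kind: KongPlugin
metadata:
  name: ai-proxy-chat
plugin: ai-proxy
config:
  route_type: llm/v1/chat
  auth:
    header_name: Authorization
    header_value: Bearer <PROVIDER_API_KEY>   # placeholder — store real keys in a Secret
  model:
    provider: openai                          # illustrative provider choice
    name: gpt-4o                              # illustrative model name
```

Attaching this plugin to a route lets the gateway mediate LLM traffic centrally, which is the basis for the multi-LLM routing, governance, and security patterns explored in the workshop.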