New providers, smarter routing, stronger guardrails — because your AI infrastructure should be as robust as your API infrastructure
We know that successful AI connectivity programs often start with an intense focus on how you govern and protect LLM and MCP traffic. That's why Kong was first to market with an enterprise-grade AI gateway, and it's why, in our last release, we added enterprise MCP proxy support alongside the existing LLM proxy in Kong AI Gateway.
And we’re still innovating.
Today, we're proud to announce one of the most significant updates to Kong AI Gateway yet. This release is all about helping you move from "we're experimenting with LLM and MCP-powered workflows" to "let's ship our first production-grade agent to market," with a special focus on the hard problems of agentic security, developer productivity, and resilience.
Kong AI Gateway 3.13 unlocks a wave of new capabilities that power enterprise-grade AI infrastructures — from MCP tool-level access control and circuit breakers to expanded provider support, advanced generation workflows, and load balancing improvements. If you’re building multi-model, agentic, real-time, or mission-critical AI systems, you’ll want to upgrade.
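To give a sense of what gateway-governed LLM traffic looks like in practice, here is a minimal declarative config sketch using Kong's `ai-proxy` plugin. The service, route, and model names are illustrative placeholders, and the new 3.13 capabilities (MCP tool-level access control, circuit breakers) have their own plugin configuration not shown here:

```yaml
# Minimal sketch: routing chat traffic through Kong AI Gateway
# via the ai-proxy plugin. Names, paths, and model are placeholders.
_format_version: "3.0"
services:
  - name: ai-chat-service
    url: http://localhost:32000   # placeholder upstream; ai-proxy rewrites the target
    routes:
      - name: chat-route
        paths:
          - /chat
    plugins:
      - name: ai-proxy
        config:
          route_type: llm/v1/chat
          auth:
            header_name: Authorization
            # resolved at runtime from the environment via a Kong vault reference
            header_value: "{vault://env/openai-api-key}"
          model:
            provider: openai
            name: gpt-4o
```

With a config like this in place, clients call `/chat` on the gateway rather than the provider directly, which is what makes centralized governance — auth, rate limiting, and the new access-control and resilience features — possible.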
The full changelog can be found here.