Building agents is difficult, but Kong AI Gateway can make it easier. Kong AI Gateway was the first enterprise-grade AI infrastructure product. It ships with more than 80 AI capabilities and helps manage traffic across models, APIs, and the ecosystem of MCP tools that make agents successful.
Our customers have been using Kong AI Gateway for three main reasons.
- To develop and nurture a vast MCP/API ecosystem
- To build secure LLM infrastructure to consume any model that you want to use for your agentic capability
- To continuously monitor and observe, capturing the full observability picture: the metrics and traces that LLMs, agents, and MCP tools are generating across the board
While as many as 95% of organizations may be seeing their agentic initiatives struggle to take off, one of our customers, Prudential, is among the 5% building and using agentic AI successfully.
Prudential has been increasing developer productivity, reducing duplication and waste, and modernizing legacy applications — including many that may have been left over from previous transformation pushes because they were too complex — with the help of agentic AI.
"We're leaning all in on agentic code development," said Elizabeth Brand, VP, Global Head of Cloud, Prudential. "How can we refactor our legacy apps? What are the apps we haven't looked at in years? And what do we not know about them? How can we use the technology to transform our thinking about that application? And how can we decompose that application into smaller components?"
"There's a significant opportunity for us to reduce the amount of complexity in our environment, all while taking advantage and transforming the applications we didn't get to because they were hard — they were complex," Brand said. "Now we have the tools to take on these difficult problems."
What’s Kong MCP Gateway?
As Augusto Marietti, CEO and Co-founder of Kong, likes to put it, “MCP is Duolingo for APIs. It makes them speak English.” Another good metaphor for MCP is that it's the USB-C for AI that effectively helps our LLMs and agents connect to third-party systems, services, and data.
While we can use traditional APIs to build agents, MCP has emerged as a new protocol to help build agents in a much better way. Here’s why:
- Understood by agents: MCP is natively understood by agents. MCP is for AI what REST was for APIs.
- Less relevant versioning: MCP makes versioning less of a concern, because models are better at handling unstructured data. This makes versioning less challenging than it was in the traditional RESTful API space.
- Real-time by default: When we're building agents, we want to build compelling experiences, and it matters that real-time interaction is baked into the underlying protocol.
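To make the protocol concrete, here is a minimal sketch of what an agent-side MCP request can look like on the wire. MCP is built on JSON-RPC 2.0, and tool invocation uses the `tools/call` method; the tool name `get_weather` and its arguments below are hypothetical, and a real client would send this message to an MCP server over a transport such as stdio or HTTP.

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build a JSON-RPC 2.0 `tools/call` request as used by MCP."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical tool and arguments, for illustration only.
request = make_tool_call(1, "get_weather", {"city": "London"})
wire_message = json.dumps(request)
print(wire_message)
```

Because every tool exposes itself through this same uniform envelope (alongside discovery methods like `tools/list`), an agent does not need bespoke client code per API, which is the point of the "USB-C" metaphor above.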
That said, MCP is not without its challenges.
- High R&D cost: MCP is a new and emerging protocol, which carries high research and development costs.
- Security: Just like API security, MCP tools and servers need to be secured in line with the MCP specification.
- Slow to build: Finally, building MCP servers and their ecosystem takes time, while capturing these agentic AI opportunities demands speed.
Considering this, Kong introduced Kong AI Gateway 3.12 with a new MCP Gateway capability, alongside all the capabilities already in the product.