Product Releases
December 18, 2025
5 min read

Move More Agentic Workloads to Production with AI Gateway 3.13

MCP ACLs, Claude Code Support, and New Guardrails

Greg Peranich
Staff Product Manager, Kong

Kong AI Gateway 3.13 moves enterprises from AI experimentation to shipping production-grade agents by unlocking new capabilities focused on agentic security, developer productivity, and resilience, including MCP tool-level access control, expanded provider support, and smarter load balancing.

New providers, smarter routing, stronger guardrails — because AI infrastructure should be as robust as APIs

We know that successful AI connectivity programs often start with an intense focus on how you govern and protect LLM and MCP traffic. This is why Kong was first to market with an enterprise-grade AI gateway, and it’s why, in our last release, we added enterprise MCP proxy support to the existing LLM proxy in the Kong AI Gateway.

And we’re still innovating.  

Today, we're proud to announce one of the most significant updates to Kong AI Gateway yet. This release is all about helping you move from “we’re experimenting with LLM and MCP-powered workflows” to “let’s ship our first production-grade agent to market,” with a special focus on the hard problems around agentic security, developer productivity, and resilience.

Kong AI Gateway 3.13 unlocks a wave of new capabilities that power enterprise-grade AI infrastructures — from MCP tool-level access control and circuit breakers to expanded provider support, advanced generation workflows, and load balancing improvements. If you’re building multi-model, agentic, real-time, or mission-critical AI systems, you’ll want to upgrade.

The full changelog can be found here.

🔐 MCP ACLs: Fine-grained tool and access control

With the rise of agent architectures and multi-tool workflows, controlling which tools an agent (or user) can call — and when — is essential. Since MCP emerged, organizations (and often different teams within the same organization) have approached this problem in different, often non-standardized ways. Kong AI Gateway solves this, giving AI and infra teams the ability to standardize how their org:

  • Filters MCP tools: Restrict which tools or capabilities are exposed to an agent or consumer by default.
  • Enforces access control at the MCP layer: Use ACLs to enforce least-privilege access across MCP servers.
  • Dynamically manages ACL configuration: ACLs can be managed via declarative config (e.g., decK / Terraform) or via the control plane.

Let’s consider exposing an MCP interface for the fictitious airline KongAir's API. In the example configuration below, only users in the booking-agents consumer group are allowed to consume KongAir MCP tools by default. Users who have been mapped to the developers consumer group are only permitted access to the tools that fetch flight details.

plugins:
- name: ai-mcp-proxy
  config:
    mode: conversion-listener
    logging:
      log_payloads: true
      log_statistics: true
    consumer_identifier: username
    default_acl:
    - allow:
      - booking-agents
      deny:
      - developers
      scope: tools
    include_consumer_groups: true
    tools:
    - description: Get KongAir planned flights
      annotations:
        title: Get KongAir planned flights
      method: GET
      path: /flights
      parameters:
        - name: date
          in: query
          description: Filter by date (defaults to current day)
          required: false
          schema:
            type: string
      acl:
        allow:
        - developers
    - description: Get a specific flight by flight number
      annotations:
        title: Get a specific flight by flight number
      method: GET
      path: /flights/{flightNumber}
      parameters:
        - name: flightNumber
          in: path
          description: The flight number
          required: true
          schema:
            type: string
      acl:
        allow:
        - developers
    - description: Fetch more details about a flight
      annotations:
        title: Fetch more details about a flight
      method: GET
      path: /flights/{flightNumber}/details
      parameters:
        - name: flightNumber
          in: path
          description: The flight number
          required: true
          schema:
            type: string
      acl:
        allow:
        - developers
    - description: Book a flight
      annotations:
        title: Book a flight
      method: POST
      path: /flights/{flightNumber}/bookings
      parameters:
        - name: flightNumber
          in: path
          description: The flight number to book
          required: true
          schema:
            type: string
      request_body:
        required: true
        content:
          application/json:
            schema:
              type: object
              properties:
                passenger_name:
                  type: string
                passenger_email:
                  type: string
                  format: email
                seat_preference:
                  type: string
                  enum: [window, aisle, middle]
              required:
                - passenger_name
                - passenger_email
    - description: Delete a flight booking
      annotations:
        title: Delete a flight booking
      method: DELETE
      path: /bookings/{bookingId}
      parameters:
        - name: bookingId
          in: path
          description: The booking ID to delete
          required: true
          schema:
            type: string

    server:
      timeout: 60000

These features help you ensure that AI traffic through the Gateway remains secure, governed, and compliant — even in highly dynamic, multi-agent environments.

🌍 Expanded provider ecosystem

Say goodbye to lock-in. This release adds first-class support for several new providers:

  • Anthropic (native SDK support) — optimized for code generation, developer tooling, and agent-driven code workflows.
  • xAI / Grok — enabling reasoning workloads and inference use-cases at scale.
  • Aliyun (Alibaba Cloud / Qwen) — ideal for deployments in Asia-Pacific with compliance and low-latency needs.
  • Cerebras — optimized large-model inference endpoints.

With these additions, Kong AI Gateway becomes a truly multi-cloud, multi-model AI platform — giving teams flexibility to choose providers based on cost, compliance, performance, or capability.
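To make this concrete, here is a minimal sketch of routing chat traffic to one of the new providers through the ai-proxy plugin. The field layout follows the existing ai-proxy schema, but the model name and version string shown are illustrative placeholders; check the 3.13 plugin reference for the exact options each new provider supports.

```yaml
# Sketch: proxy chat-completion traffic to Anthropic via ai-proxy.
# Model name and anthropic_version are placeholders, not verified values.
plugins:
- name: ai-proxy
  config:
    route_type: llm/v1/chat
    auth:
      header_name: x-api-key
      # Resolve the key from Kong's environment-variable vault rather
      # than hard-coding a secret in declarative config.
      header_value: "{vault://env/anthropic-api-key}"
    model:
      provider: anthropic
      name: claude-sonnet-4-5        # placeholder model name
      options:
        anthropic_version: "2023-06-01"   # placeholder API version
```

Swapping providers is then largely a matter of changing `model.provider`, the model name, and the auth header, while clients keep calling the same gateway route.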

🔨 Claude Code Support

We're excited to empower organizations to govern a tool that nearly every developer uses: Anthropic’s Claude Code. With one line of configuration, access to Claude can be secured, governed, and observed end-to-end, finally adding a governance layer to Claude-powered developer productivity.

View the docs here.

🛡️ Guardrails & Safety: Lakera.ai Integration

AI infrastructure isn’t just about performance and scale — safety and compliance matter. That’s why this release brings built-in integration with Lakera.ai guardrails. With AI Gateway 3.13, you can:

  • Enforce content safety, block unsafe or toxic generations, and prevent leakage of PII or sensitive data.
  • Apply guardrails uniformly, regardless of the underlying model provider.
  • Integrate seamlessly: no code changes are required, because policies are enforced at the gateway layer.

Combined with existing plugins (such as prompt guard, PII sanitizer, and semantic caching), this makes Kong AI Gateway ideal for enterprise deployments where safety and compliance are non-negotiable.
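As a point of reference for layering guardrails at the gateway, here is a minimal sketch using the existing ai-prompt-guard plugin mentioned above. The patterns are illustrative examples only, and the Lakera integration itself is configured separately (see the linked docs):

```yaml
# Sketch: a simple pattern-based guardrail with ai-prompt-guard.
# The deny patterns below are illustrative, not a complete policy.
plugins:
- name: ai-prompt-guard
  config:
    allow_all_conversation_history: false
    deny_patterns:
    - ".*ignore (all )?previous instructions.*"   # naive prompt-injection check
    - ".*\\b\\d{3}-\\d{2}-\\d{4}\\b.*"            # US-SSN-shaped strings
```

Because the policy lives in the gateway config, it applies to every model provider behind that route without touching application code.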

Learn how to integrate with Lakera here.

🎛️ Unified advanced generation APIs: Batch, files, multi-modal, real-time

Modern AI workflows are complex. Think: batch inference, file-based generation, multi-modal inputs (text, image, audio), streaming outputs, and long-lived agent loops. This release expands support for:

  • Batch operations — efficient parallel inference for large workloads
  • File-based generation / file uploads — supporting document processing, ingestion, and rich context workflows across every provider that offers file-based operations.
  • Multi-modal support — mixing text, images, audio, whatever your use-case needs
  • Real-time / streaming APIs — we’ve long supported the OpenAI Realtime API, and with this release, we’re adding Gemini Live, expanding real-time and event-driven capabilities for chatbots, live agents, voice assistants, and continuous streaming applications.

All providers, old and new, that support these advanced generation patterns can now benefit from a unified, consistent interface — reducing complexity and speeding up development.

The support matrix can be viewed here.

⚙️ Smarter load balancing & reliability

AI workloads increasingly demand high availability, low-latency scaling, and resilience. This release adds:

  • Native circuit-breaker support — automatically detect and shed traffic from unhealthy or failing upstreams to prevent cascading failures
  • Enhanced semantic load balancing — extended to support classification-group aware failover, so routing maintains semantic correctness when managing fallback or failover logic
  • Least connections algorithm — ideal for long-lived streaming, real-time, or high-concurrency workloads

With these enhancements, Kong AI Gateway strengthens its role as a purpose-built load-balancing and traffic-management layer for GenAI and agent workloads.
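For orientation, a multi-target setup with the new least-connections algorithm might be sketched as below via the ai-proxy-advanced plugin. Treat this as a shape sketch: the exact `algorithm` value, target field layout, and any circuit-breaker settings should be confirmed against the 3.13 plugin reference, and the model names are placeholders.

```yaml
# Sketch: balance chat traffic across two providers with ai-proxy-advanced.
# Field names and the algorithm value are assumptions to verify against
# the 3.13 docs; model names are placeholders.
plugins:
- name: ai-proxy-advanced
  config:
    balancer:
      algorithm: least-connections   # new in this release, per the notes above
    targets:
    - route_type: llm/v1/chat
      model:
        provider: openai
        name: gpt-4o                 # placeholder
      auth:
        header_name: Authorization
        header_value: "Bearer {vault://env/openai-api-key}"
    - route_type: llm/v1/chat
      model:
        provider: anthropic
        name: claude-sonnet-4-5      # placeholder
      auth:
        header_name: x-api-key
        header_value: "{vault://env/anthropic-api-key}"
```

Least-connections favors the target currently serving the fewest in-flight requests, which is why it suits the long-lived streaming and high-concurrency workloads called out above.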

⚡ Why this matters

AI systems have moved beyond single-model, one-off requests. What organizations need now is infrastructure that handles:

  • Multi-model strategies (vendor diversification, specialization per provider)
  • Agentic workflows with complex tool interactions
  • Real-time, streaming, multi-modal generation at scale
  • Safety, compliance, and governance — without slowing down innovation
  • Resilience, failover, and performance under load

This release empowers you to build systems that meet — and exceed — those needs.

📦 Get started

Ready to try out the new release of Kong AI Gateway? You can get started for FREE with Konnect Plus. If you already have a Konnect account, visit the official product page or dive straight into the demos and tutorials.

If you want to learn more, check out the updated docs, provider blueprints, and example configurations on the website. Declarative config (decK / Terraform) tooling is already updated with new plugin settings for ACLs, guardrails, load balancing, and provider configuration.

Ready to build safer, smarter, multi-model AI infrastructure at scale? Dive in today and let us know what you build.
