Product Releases
February 15, 2024
6 min read

Announcing Kong’s New Open Source AI Gateway with Multi-LLM Support, No-Code AI Plugins, Advanced Prompt Engineering, and More

Marco Palladino
CTO and Co-Founder

Today I’m excited to announce that Kong has released six new open source AI plugins in Kong Gateway 3.6 that turn every Kong Gateway deployment into an AI Gateway. These new plugins are available today and are entirely free and open source for everyone.

The six new plugins are AI Proxy, AI Request Transformer, AI Response Transformer, AI Prompt Guard, AI Prompt Template, and AI Prompt Decorator.


By upgrading your current Kong Gateway to version 3.6 (also announced today), you'll be able to use these new plugins, which are focused entirely on AI and LLM usage. They help developers who want to integrate one or more LLMs into their products be more productive and ship AI capabilities faster, while giving architects and platform teams visibility, control, and compliance over every AI request their teams send. And because it's built on top of Kong Gateway, you can orchestrate AI flows against cloud or self-hosted LLMs with high performance and low latency, both of which are critical in AI-based applications.

All existing 1,000+ Kong Gateway plugins (official and community) are available out of the box on top of your AI traffic — like AuthN/Z, traffic control, rate-limiting, transformations, and more — making Kong's AI Gateway the most capable one in the entire ecosystem. It’s also natively supported by Kong’s cloud platform, Kong Konnect, as well as Kong Gateway Enterprise and Kong Gateway OSS. 

I’ve recorded a short demonstration video for each of the six new AI plugins, and you can get started for free today.

What can you do with AI Gateway?

With these new AI plugins, you can:

  • Build multi-LLM integrations — The “ai-proxy” plugin allows you to consume multiple LLM implementations — in the cloud or self-hosted — through the same API interface. It ships with native support for OpenAI, Azure AI, Cohere, Anthropic, Mistral, and LLaMA, and because we standardize how they're consumed, you can switch between LLMs at the “flip of a switch” without changing your application code (see the configuration sketch after this list). This is great for using multiple specialized models in your applications and for prototyping.
  • Manage AI credentials centrally — With “ai-proxy” you can also store all AI credentials, including tokens and API keys, in Kong Gateway instead of in your applications, so you can update and rotate them on the fly in one centralized place without touching your code.
  • Collect L7 AI metrics — Using the new “ai-proxy” plugin you can now export L7 analytics — like the number of request and response tokens, or the LLM providers and models used — to third-party platforms like Datadog and New Relic, or to any logging plugin that Kong Gateway already supports (such as TCP, Syslog, and Prometheus). This not only simplifies monitoring of AI traffic across your applications, but also gives you insight into which LLM technologies developers in your organization use most. The L7 AI observability metrics are in addition to all other request and response metrics already collected by Kong Gateway.
  • No-code AI integrations — You can leverage the benefits of AI without writing a single line of code in your applications by using the new “ai-request-transformer” and “ai-response-transformer” plugins, which intercept every API request and response and augment it with any AI prompt you have configured. For example, you can translate an existing API response on the fly for internationalization without changing the API or your client applications. You can enrich, transform, and convert all existing API traffic without lifting a finger, and much more. The possibilities are endless. You can even enrich an AI request directed to one LLM provider with another LLM provider that instantly updates the request before it's sent to the final destination.
  • Advanced AI prompt engineering — We're shipping three new plugins fully focused on advanced prompt engineering to fundamentally simplify and improve how you use AI in your applications. With “ai-prompt-template” we're introducing an easy way to create prompt templates that are managed centrally in Kong Gateway and used on the fly by applications, which only send the named values to fill into the template. This way you can update your prompts later without updating your applications, or even enforce a compliance process for adding a new approved template to the system.
  • Decorate your AI prompts — Most AI prompts you generate also set a context for what the AI should or should not do and specify rules for interpreting requests and responses. Instead of setting up that context every time, you can configure it centrally with the “ai-prompt-decorator” plugin, which prepends or appends your context on the fly to every AI request. This is also useful for compliance in the organization, for example by instructing the AI to never discuss restricted topics.
  • AI prompt firewall — This capability is aimed at teams and organizations that want to ensure prompts are approved and that no one mistakenly uses the wrong prompts in their applications. With the “ai-prompt-guard” plugin you can define allow and deny rules that Kong Gateway applies to free-form prompts generated by applications before they're sent to the LLM providers (the second sketch after this list shows it alongside the prompt decorator).
  • Create an AI egress with 1,000+ features — By leveraging these new AI capabilities in Kong Gateway you can centralize how you manage, secure, and observe all your AI traffic from one place. You can also use all the existing 1,000+ official and community Kong Gateway plugins to further secure how your AI egress is accessed by other developers (AuthN/Z, mTLS, etc.), add rate limiting, introduce consumption tiers, or develop sophisticated traffic control rules between one AI provider and another.
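
To make the ai-proxy bullets above concrete, here is a minimal sketch of enabling the plugin on an existing Kong service through the Admin API with Python. The Admin URL, service name, and placeholder API key are assumptions, and the field names under config are based on the ai-proxy schema as documented for Kong Gateway 3.6; treat them as illustrative and check the plugin docs for the authoritative schema.

```python
# Sketch: enable "ai-proxy" on an existing Kong service via the Admin API.
# Admin URL, service name, and key are assumptions; config field names are
# based on the ai-proxy plugin docs and may differ in your version.
import requests

KONG_ADMIN = "http://localhost:8001"   # assumption: local Admin API
SERVICE = "my-ai-service"              # assumption: a pre-created service

ai_proxy = {
    "name": "ai-proxy",
    "config": {
        "route_type": "llm/v1/chat",   # expose an OpenAI-style chat interface
        "auth": {
            # Credentials live in Kong, not in the calling applications,
            # so they can be rotated centrally.
            "header_name": "Authorization",
            "header_value": "Bearer <PROVIDER_API_KEY>",  # placeholder
        },
        "model": {
            "provider": "openai",      # swap to "anthropic", "mistral", ... to change LLMs
            "name": "gpt-4",
            "options": {"temperature": 0.2},
        },
    },
}

resp = requests.post(f"{KONG_ADMIN}/services/{SERVICE}/plugins", json=ai_proxy)
resp.raise_for_status()
print("created plugin", resp.json()["id"])
```

Switching providers then comes down to updating config.model on this one plugin instance; the applications calling the route keep using the same interface.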
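
And here is the second sketch, covering the prompt-engineering plugins: ai-prompt-decorator prepends a system context to every request, and ai-prompt-guard acts as the prompt firewall. The route name is hypothetical, and the prompts.prepend, allow_patterns, and deny_patterns fields are assumptions drawn from the plugin schemas; verify them against the documentation for your version.

```python
# Sketch: centrally enforce a system context and a prompt firewall on the
# route fronted by ai-proxy. Route name is hypothetical; config field names
# are assumptions based on the ai-prompt-decorator and ai-prompt-guard docs.
import requests

KONG_ADMIN = "http://localhost:8001"   # assumption: local Admin API
ROUTE = "ai-chat-route"                # assumption: route served by ai-proxy

decorator = {
    "name": "ai-prompt-decorator",
    "config": {
        "prompts": {
            "prepend": [{
                "role": "system",
                "content": "You are a support assistant. Never discuss pricing or legal topics.",
            }]
        }
    },
}

guard = {
    "name": "ai-prompt-guard",
    "config": {
        # Assumed semantics: prompts must match an allow pattern and must
        # not match a deny pattern before Kong forwards them to the LLM.
        "allow_patterns": [".*support request.*"],
        "deny_patterns": [".*(password|credit card).*"],
    },
}

for plugin in (decorator, guard):
    requests.post(f"{KONG_ADMIN}/routes/{ROUTE}/plugins", json=plugin).raise_for_status()
```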

Every AI egress in Kong Gateway is simply a service like any other, so all Kong Gateway features and plugins are available on day one. This makes Kong’s AI Gateway the most capable one in the entire AI ecosystem. Kong Konnect’s developer portal and service catalog are also supported.
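
As a small illustration of that point, the same Admin API call that enables an AI plugin can attach any other Kong plugin to the AI route. The sketch below layers the standard rate-limiting plugin onto the hypothetical route used in the earlier sketches.

```python
# Sketch: the AI egress is an ordinary Kong route, so existing plugins such
# as rate-limiting apply to it unchanged. Route name is hypothetical.
import requests

KONG_ADMIN = "http://localhost:8001"   # assumption: local Admin API
ROUTE = "ai-chat-route"                # assumption: the AI route from above

rate_limit = {
    "name": "rate-limiting",
    "config": {"minute": 60, "policy": "local"},   # cap at 60 AI calls per minute
}

requests.post(f"{KONG_ADMIN}/routes/{ROUTE}/plugins", json=rate_limit).raise_for_status()
```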

With Kong AI Gateway, we can address cross-cutting capabilities that teams would otherwise need to build themselves.

What’s next

At Kong, we specialize in providing modern infrastructure for all API use cases, and the most recent driver to API usage in the world has been AI. Over the past few months, we’ve worked with select Kong Gateway customers and users to cover their most common AI use cases as we’ve prepared to release the plugins we’re announcing today. 

More AI capabilities will be shipped in the future, and I’m looking forward to hearing your feedback.

Get started with AI Gateway

You can get started with the new AI Gateway today by reading the getting started guide. You can run the new plugins on a standalone Kong Gateway installation or on Kong Konnect, our unified API management platform.
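
If you want a feel for what consuming the gateway looks like once the plugins are configured, here is a minimal client-side sketch. The proxy address and route path are assumptions for a local setup; with ai-proxy handling an OpenAI-style chat route, the application sends a familiar chat request and never sees the provider credentials, which Kong attaches at the gateway.

```python
# Sketch: an application calling an LLM through Kong AI Gateway. Proxy
# address and route path are assumptions for a local setup; the response
# shape assumes the OpenAI-style chat format exposed by ai-proxy.
import requests

GATEWAY = "http://localhost:8000"      # assumption: local Kong proxy port
ROUTE_PATH = "/ai/chat"                # assumption: path of the ai-proxy route

body = {
    "messages": [
        {"role": "user", "content": "Summarize our API release notes in one paragraph."}
    ]
}

resp = requests.post(f"{GATEWAY}{ROUTE_PATH}", json=body)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```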

Watch: Adopt AI and Multi-LLM Strategies in a Secure and Governable Way with Kong

Want to learn more about Kong AI Gateway? Join us to discuss the intersection of API management and artificial intelligence (AI), and see how Kong addresses the challenges organizations face in adopting AI.

Fireside Chat: Adopt AI/LLMs Securely & Governably: Leverage Kong's expertise for safe innovation

Watch Now

Topics: AI Gateway | Open Source | Product Demos