Product Releases
April 2, 2025
5 min read

Enhance AI Governance with Kong AI Gateway 3.10

Adam Jiroun
Senior Product Marketing Manager, Kong

Kong AI Gateway 3.10: Enhancing AI Governance with Automated RAG and PII Sanitization

Today, we’re excited to unveil Kong AI Gateway 3.10!

This release introduces new functionality to enhance AI governance, reduce LLM hallucinations, and accelerate developer productivity when working with AI. Read on to learn more.

Reduce LLM hallucinations with automated RAG 

Kong AI Gateway 3.10 ships with a new AI RAG Injector plugin that automates part of the RAG (Retrieval-Augmented Generation) process to help reduce LLM hallucinations and improve the accuracy of AI responses. To better understand the value of this update, it helps to first understand what LLM hallucinations are, why they occur, and how RAG addresses them today.

A “hallucination” in this context occurs when an LLM confidently provides an inaccurate response. This often happens when domain-specific data is required for a relevant answer, but the LLM is limited to its static, pre-trained data. To address this, many organizations are turning to RAG to reduce hallucinations. RAG gives the LLM direct access to vetted data sources — enabling it to retrieve the relevant information it needs to deliver more accurate responses.

RAG is an effective technique, but the traditional implementation process for RAG can be quite manual and time-intensive for developers — involving the need to pre-process data, generate embeddings, and then query a vector database to associate the data with each AI model. 

In 3.10, Kong AI Gateway will streamline this process by automatically generating embeddings for an incoming prompt, fetching all relevant data, and then appending it to the request, removing the need for developers to build this association themselves.

Implement automated RAG to reduce LLM hallucinations and ensure higher-quality responses. 
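To make the flow concrete, here is a minimal, runnable Python sketch of the retrieve-and-append pattern the plugin automates. The character-frequency "embedding" and in-memory document index are toy stand-ins for illustration only, not Kong APIs.

import math

# Toy document set standing in for a vetted data source.
DOCUMENTS = [
    "Kong AI Gateway 3.10 adds automated RAG and PII sanitization.",
    "pgvector lets Postgres store and query embeddings.",
]

def embed(text: str) -> list[float]:
    """Toy embedding: normalized character-frequency vector (real systems use an embedding model)."""
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    return sum(x * y for x, y in zip(a, b))

# The indexing step a developer would normally own: pre-compute document embeddings.
INDEX = {doc: embed(doc) for doc in DOCUMENTS}

def inject_context(prompt: str, top_k: int = 1) -> str:
    """Embed the incoming prompt, fetch the most relevant documents, and append them to the request."""
    query_vec = embed(prompt)
    ranked = sorted(INDEX, key=lambda doc: cosine(query_vec, INDEX[doc]), reverse=True)
    context = "\n".join(ranked[:top_k])
    return f"Use the following context to answer:\n{context}\n\nQuestion: {prompt}"

print(inject_context("What does Kong AI Gateway 3.10 add?"))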

This update makes it possible to operationalize RAG as a platform capability — shifting more of the RAG implementation responsibility away from individual developers and enabling platform owners to enforce global guardrails for RAG.

Ultimately, automated RAG in Kong AI Gateway enables organizations to more effectively reduce LLM hallucinations and deliver higher-quality AI responses to end users.

You can read more about RAG and how it works here.

Safeguard sensitive data with automatic PII sanitization 

When looking to safely roll out projects with LLMs and AI agents, sanitizing PII (Personally Identifiable Information) is critical to ensuring organization-wide AI compliance and eliminating PII risk. Today, many development teams attempt to manage PII risk with ad hoc solutions like regex-based redaction libraries or hardcoded filters, but these approaches are often unreliable and prone to human error — making it difficult to ensure consistent security and compliance as AI usage scales.

Kong AI Gateway addresses this in 3.10 with out-of-the-box support for PII sanitization — providing the ability to easily sanitize and protect personal data, passwords, and more than 20 other categories of PII across 12 languages and most major AI providers. This capability is delivered via a private Docker container that can be fully self-hosted and deployed next to Kong AI Gateway data planes — providing a secure solution that is also performant and horizontally scalable.

While other sanitization products may be limited to replacing PII with a token or redacting it entirely, Kong AI Gateway provides the option to reinsert the original data into the response before it reaches the end user (based on configuration settings defined by the platform owner). This ensures that the end-user experience is never compromised by the sanitization process, while also ensuring that the LLMs never ingest sensitive data.
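As a rough illustration of the sanitize-then-reinsert flow described above, here is a small Python sketch using a simple regex detector. Kong's PII service covers far more categories and languages; the placeholder format and detector below are purely hypothetical.

import re

# Toy detector: email addresses only (the real service detects 20+ PII categories).
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def sanitize(prompt: str) -> tuple[str, dict[str, str]]:
    """Replace detected PII with placeholders before the prompt reaches the LLM."""
    mapping: dict[str, str] = {}
    def _swap(match: re.Match) -> str:
        token = f"<PII_{len(mapping)}>"
        mapping[token] = match.group(0)
        return token
    return EMAIL.sub(_swap, prompt), mapping

def reinsert(response: str, mapping: dict[str, str]) -> str:
    """Optionally restore the original values before the response reaches the end user."""
    for token, original in mapping.items():
        response = response.replace(token, original)
    return response

clean_prompt, pii_map = sanitize("Email jane.doe@example.com about the renewal.")
print(clean_prompt)                            # the LLM never sees the raw address
print(reinsert("I emailed <PII_0>.", pii_map)) # the end user sees the original value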

Finally, this update enables platform teams to enforce sanitization at the global platform level, removing the need for developers to manually code sanitization into every application they build and freeing them up to focus on core application functionality. This approach also empowers platform owners to configure and manage sanitization policies centrally — helping to ensure that sanitization is implemented consistently over time.

Sanitize and protect PII across 12 different languages and most major AI providers

To learn more about PII sanitization and how it drives business impact, read more here.

Simplify model consumption with native SDK support

With Kong AI Gateway, you can consume AI models with a universal API based on the OpenAI format, which Kong will then translate at the runtime layer to work seamlessly across all popular AI providers. This improves developer productivity for teams who are working with multiple models, allowing them to efficiently switch between the best models for the job. Now in 3.10, we’re introducing an additional way to consume AI models with native SDK support. 
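For illustration, a request through the universal OpenAI-format API might look like the following Python sketch using the standard OpenAI client (openai>=1.0). The gateway URL, route path, API key, and model name are placeholders that depend entirely on how your gateway is configured.

from openai import OpenAI

client = OpenAI(
    base_url="https://my-kong-gateway.example.com/ai/v1",  # hypothetical Kong AI Gateway route
    api_key="MY_GATEWAY_KEY",                              # whatever auth the gateway route expects
)

# Kong translates the OpenAI-format request for the configured upstream provider.
resp = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name; actual routing depends on gateway configuration
    messages=[{"role": "user", "content": "Summarize our API governance policy."}],
)
print(resp.choices[0].message.content)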

With native SDK support, customers who have already built applications using SDKs from popular providers such as AWS or Google can keep their existing client libraries without rewriting code or disrupting current workflows. This backwards compatibility should simplify migration to Kong and provide a faster path to centrally governing all AI usage across your organization. Kong AI Gateway is also compatible with existing developer frameworks like LangChain, LlamaIndex, AutoGen, and more.

Load balance based on tokens and costs 

We’re introducing cost-based load balancing in the ai-proxy-advanced plugin. This capability enables Kong AI Gateway to intelligently route requests to different models based on token usage and cost — allowing you to use the right model for the job without overspending. This is in addition to the many other load balancing algorithms that Kong supports out-of-the-box, such as semantic routing, lowest latency, lowest usage, weighted load balancing, and more.

For simple queries with fewer tokens, you can route them to more cost-effective models, while more complex prompts can be routed to more advanced models. This provides customers with a more efficient way to manage their AI workloads and costs, especially for teams working with multiple models across different use cases.
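Conceptually, token- and cost-based routing boils down to something like the toy Python sketch below. The heuristics, thresholds, and model names are arbitrary examples for illustration, not Kong's actual algorithm or plugin configuration.

def estimate_tokens(prompt: str) -> int:
    """Rough heuristic: roughly four characters per token for English text."""
    return max(1, len(prompt) // 4)

def pick_model(prompt: str) -> str:
    """Send small requests to a cheaper model and large or complex ones to an advanced model."""
    if estimate_tokens(prompt) < 200:
        return "small-cost-effective-model"   # hypothetical model name
    return "large-advanced-model"             # hypothetical model name

print(pick_model("What is our refund policy?"))                  # routes to the cheaper model
print(pick_model("Analyze this 10-page contract clause. " * 50)) # routes to the advanced model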

Unlock new use cases with pgvector 

We are broadening Kong AI Gateway’s vector database support with pgvector. This means that all of Kong’s semantic plugins — including semantic routing, caching, prompt guardrails, and more — can now store embeddings in Postgres, not just Redis. 

Since pgvector is a Postgres extension, you can use it with cloud-hosted options such as AWS RDS, Azure Cosmos DB, and more. This update will provide customers with more flexibility and make it easier to introduce semantic AI capabilities to existing cloud-native workflows. 
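To show what pgvector provides under the hood, here is a small illustrative Python sketch of storing and querying embeddings in Postgres with psycopg2. The connection details, table layout, and vector size are hypothetical and unrelated to Kong's internal schema.

import psycopg2

# Hypothetical connection string; any Postgres with the pgvector extension works.
conn = psycopg2.connect("dbname=ai user=kong password=secret host=localhost")
with conn, conn.cursor() as cur:
    cur.execute("CREATE EXTENSION IF NOT EXISTS vector")
    cur.execute(
        "CREATE TABLE IF NOT EXISTS embeddings (id serial PRIMARY KEY, doc text, vec vector(3))"
    )
    cur.execute(
        "INSERT INTO embeddings (doc, vec) VALUES (%s, %s::vector)",
        ("cached prompt", "[0.1, 0.9, 0.0]"),
    )
    # Nearest-neighbour lookup by cosine distance (pgvector's <=> operator).
    cur.execute(
        "SELECT doc FROM embeddings ORDER BY vec <=> %s::vector LIMIT 1",
        ("[0.1, 0.8, 0.1]",),
    )
    print(cur.fetchone()[0])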

With every new release going forward, Kong will continue to add support for more vector databases.

Get started with Kong AI Gateway 3.10 today

All of these updates and much more — including prompt guard plugin enhancements and new embedding model support — are generally available today. To see the full list of updates and fixes, read the changelog here. 

To get started with this new release of Kong AI Gateway 3.10, you can visit the official product page.

Want to learn more about moving past the AI experimentation phase and into production-ready AI systems? Check out the upcoming webinar on how to drive real AI value with state-of-the-art AI infrastructure.

Topics: AI | AI Gateway