Engineering
March 19, 2020
3 min read

Exposing Kuma Service Mesh Using Kong API Gateway

Kevin Chen
Topics: Service Mesh, Kuma


In his most recent blog post, Marco Palladino, our CTO and co-founder, went over the difference between API gateways and service mesh. I highly recommend reading his blog post to see how API management and service mesh are complementary patterns for different use cases, but to summarize in his words, "an API gateway and service mesh will be used simultaneously." We maintain two open source projects, Kong and Kuma, that work seamlessly together to cover all the use cases you may encounter.

So, in this how-to blog post, I'll cover how to combine Kong for Kubernetes and Kuma on Kubernetes. Please have a Kubernetes cluster ready in order to follow along with the instructions below. We will also be using the `kumactl` command line tool, which you can download from the official installation page.

Step 1: Installing Kuma on Kubernetes

Installing Kuma on Kubernetes is fairly straightforward, thanks to the `kumactl install [..]` command. You can use it to install the control plane in one step:
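For example (a minimal sketch, assuming `kumactl` is installed and `kubectl` is pointed at your cluster):

```bash
# Render the Kuma control-plane resources and apply them to the cluster
kumactl install control-plane | kubectl apply -f -

# Watch the kuma-system namespace until the control-plane pods are Running
kubectl get pods -n kuma-system -w
```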

After everything in the `kuma-system` namespace is up and running, let's deploy our demo marketplace application:
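Something along these lines should work; the manifest URL below is an assumption, so grab the current one from the `kuma-demo` repository if it has moved:

```bash
# Deploy the demo marketplace application (all-in-one manifest from the kuma-demo repo;
# the exact path is an assumption -- check the repository for the current location)
kubectl apply -f https://raw.githubusercontent.com/kumahq/kuma-demo/master/kubernetes/kuma-demo-aio.yaml

# Confirm the demo pods come up (the demo uses the kuma-demo namespace)
kubectl get pods -n kuma-demo
```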

The application is split into four services with all the traffic entering from the frontend app service. If we want to authenticate all traffic entering our mesh using Kong plugins, we will need to deploy the gateway alongside the mesh. Once again, to learn more about why having a gateway and mesh is important, please read Marco's blog post.

Step 2: Deploying Kong for Kubernetes

Kong for Kubernetes is an ingress controller based on the open source Kong Gateway. You can quickly deploy it using `kubectl`:
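For this walkthrough, the deployment boils down to applying the demo's Kong manifest (the file name is referenced later in this post; where you fetch it from is up to you, and the namespace below is an assumption based on the standard Kong for Kubernetes manifests):

```bash
# Deploy Kong for Kubernetes using the manifest that ships with the demo
kubectl apply -f kuma-demo-kong.yaml

# Verify the Kong proxy and ingress controller pods are running
kubectl get pods -n kong
```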

On Kubernetes, Kuma `Dataplane` entities are automatically generated. To inject a gateway Dataplane, the API gateway's pod needs to have the following `kuma.io/gateway: enabled` annotation:
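For reference, here is roughly where the annotation lives in the Kong proxy Deployment (a trimmed sketch, not the full manifest):

```yaml
spec:
  template:
    metadata:
      annotations:
        # Tells Kuma to inject a gateway-mode Dataplane sidecar into this pod
        kuma.io/gateway: enabled
```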

Our `kuma-demo-kong.yaml` already includes this annotation, so you don’t need to do this manually.

After Kong is deployed, export the proxy IP:
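On most cloud providers this looks like the following (the service name and namespace are assumptions based on the standard Kong for Kubernetes manifests; use `hostname` instead of `ip` if your load balancer exposes a DNS name):

```bash
# Export the externally reachable address of the Kong proxy service
export PROXY_IP=$(kubectl get svc -n kong kong-proxy \
  -o jsonpath='{.status.loadBalancer.ingress[0].ip}')
```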

To check that the proxy IP has been exported, run:
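```bash
# Should print a routable IP address
echo $PROXY_IP
```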

Sweet! Now that we have Kong for Kubernetes deployed, go ahead and add an ingress rule to proxy traffic to the marketplace frontend service.
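A minimal Ingress for the demo might look like this (the service name, namespace, port, and ingress class are assumptions based on the demo application; adjust them to match your manifests):

```yaml
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: marketplace
  namespace: kuma-demo
spec:
  ingressClassName: kong
  rules:
    - http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: frontend      # the demo marketplace frontend service
                port:
                  number: 8080
```

Apply it with `kubectl apply -f` as usual.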

By default, the ingress controller distributes traffic amongst all the pods of a Kubernetes service by forwarding the requests directly to pod IP addresses. You can choose which load-balancing strategy to use by specifying a KongIngress resource.

However, in some use cases, the load-balancing should be left up to kube-proxy or a sidecar component in the case of service mesh deployments. For us, load-balancing should be left to Kuma, so the following annotation has been included in our frontend service resource:
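Concretely, the Service ends up looking something like this (the service name, selector, and port are placeholders taken from the demo app; the annotation itself is the important part):

```yaml
apiVersion: v1
kind: Service
metadata:
  name: frontend
  namespace: kuma-demo
  annotations:
    # Send traffic to the Service's cluster IP instead of individual pod IPs,
    # so the Kuma sidecars perform the load-balancing
    ingress.kubernetes.io/service-upstream: "true"
spec:
  selector:
    app: kuma-demo-frontend
  ports:
    - port: 8080
      targetPort: 8080
```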

Remember to add this annotation to the appropriate services when you deploy Kong with Kuma.

Step 3: Adding a Traffic Permission Policy

With both Kong and Kuma running on our cluster, all that is left to do is add a traffic permission policy for Kong to the frontend service:
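A policy along these lines does the trick; the `kuma.io/service` tag values follow Kuma's `<name>_<namespace>_svc_<port>` convention on Kubernetes, so treat the exact values below as placeholders and check your own with `kumactl inspect dataplanes`:

```yaml
apiVersion: kuma.io/v1alpha1
kind: TrafficPermission
mesh: default
metadata:
  name: kong-to-frontend
spec:
  sources:
    - match:
        # The Kong gateway Dataplane's service tag
        kuma.io/service: kong-proxy_kong_svc_80
  destinations:
    - match:
        # The demo frontend's service tag
        kuma.io/service: frontend_kuma-demo_svc_8080
```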

That's it! Now, if you visit `$PROXY_IP` in your browser, you will land on the marketplace application, proxied through Kong. From here, you can enable all those fancy plugins that Kong has to offer to work alongside the Kuma policies.

Thanks for following along 🙂
