The Kong Microservice API Gateway

Kong Gateway runs in front of any RESTful API and is extended through plugins, which provide extra functionality and services beyond the core platform.
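Putting Kong in front of an API comes down to registering the API as a Service and attaching a Route to it. A minimal sketch, assuming Kong's Admin API is listening on its default port (localhost:8001); the service name and upstream URL are illustrative:

```shell
# Register an upstream API as a Kong Service
# (localhost:8001 is Kong's default Admin API port; names/URLs are examples).
curl -i -X POST http://localhost:8001/services \
  --data name=example-service \
  --data url=https://httpbin.org

# Attach a Route so Kong proxies matching requests to that Service.
curl -i -X POST http://localhost:8001/services/example-service/routes \
  --data 'paths[]=/example'
```

After this, requests to Kong's proxy port whose path begins with /example are forwarded to the upstream API.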

Scales horizontally

Kong Gateway easily scales horizontally by adding more nodes. It supports large and variable workloads with very low latency.

Extensible through plugins

Extend Kong Gateway functionality with plugins that are installed and configured through a RESTful Admin API.
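As a sketch of that workflow, enabling a bundled plugin is a single Admin API call. This assumes a running Kong with a Service named example-service already registered; the rate-limiting plugin shown here is one of Kong's bundled plugins:

```shell
# Enable the bundled rate-limiting plugin on a Service via the Admin API
# (assumes a Service named example-service already exists).
curl -i -X POST http://localhost:8001/services/example-service/plugins \
  --data name=rate-limiting \
  --data config.minute=5
```

The plugin takes effect immediately, without restarting Kong; from then on, each consumer is limited to five requests per minute on that Service.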

Runs on any infrastructure

Deploy Kong Gateway in the cloud, on-premises, or in hybrid environments, including single-datacenter and globally distributed setups.

Kong is built on reliable technologies, using NGINX for proxying and Apache Cassandra or PostgreSQL as its datastore, and provides an easy-to-use RESTful API for operating and configuring the system.

Kong

  • Administer Kong via RESTful API
  • Automate/orchestrate for CI/CD & DevOps
  • Choice of Cassandra or PostgreSQL
  • Scales from laptop to global cluster
  • In-memory caching for performance

Plugins

  • Extensible with plugins
  • Create plugins with Lua
  • Implement powerful customizations
  • Integrate with third-party services
  • Intercept request/response lifecycle

NGINX

  • Extends underlying NGINX
  • Scriptable via Lua
  • Proven, high-performance foundation
  • HTTP and reverse proxy server
  • Handles low-level operations

Request Workflow

Consider a typical request/response workflow across a client, an API, and the Kong microservice API gateway:


Once Kong is running, every client request made to the API hits Kong first and is then proxied to the upstream API. Between the request and the response, Kong executes any installed plugins, extending the API's feature set. Kong effectively becomes the entry point for every API request.
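From the client's side, this workflow is invisible: the only change is the hostname and port. A minimal sketch, assuming a Route with path /example has been configured and Kong's proxy is on its default port (localhost:8000):

```shell
# Call Kong's proxy port instead of the upstream API directly
# (8000 is Kong's default proxy port; /example is an assumed Route path).
curl -i http://localhost:8000/example

# Any plugins configured on the matched Service/Route run during this call;
# the rate-limiting plugin, for instance, adds headers such as
# X-RateLimit-Remaining-Minute to the response.
```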

Want to learn more?

Request a demo to talk with our experts, get your questions answered, and explore your needs.
