
Kong Gateway
The world’s most popular open source API gateway. Built for multi-cloud and hybrid, optimized for microservices and distributed architectures.
Upgrade to the Kong 2.1 open source API gateway.
Overview of Kong’s API Gateway
Accelerate your microservices journey with the world’s most popular open source API gateway. Built on a lightweight proxy, Kong Gateway delivers low latency and high scalability for all your microservice applications, regardless of where they run. Exercise granular control over your traffic with Kong Gateway’s plugin architecture.
Features
Authentication
Protect your services with an authentication layer
Traffic Control
Manage, throttle, and restrict inbound and outbound API traffic
Analytics
Visualize, inspect, and monitor APIs and microservices traffic
Transformations
Transform requests and responses on the fly
Logging
Stream request and response data to logging solutions
Serverless
Invoke serverless functions via APIs
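As a sketch of the Authentication feature above: once Kong is running, an authentication plugin such as key-auth can be enabled through the Admin API, and consumers can be issued keys. The service name, consumer name, key value, and default ports (8001 for the Admin API, 8000 for the proxy) below are illustrative, and the Service is assumed to exist already:

```shell
# Enable key authentication on an existing Service
# (assumes a Service named "example-service" is already configured)
$ curl -i -X POST \
  --url http://localhost:8001/services/example-service/plugins/ \
  --data 'name=key-auth'

# Create a consumer and issue it an API key
$ curl -i -X POST \
  --url http://localhost:8001/consumers/ \
  --data 'username=example-user'
$ curl -i -X POST \
  --url http://localhost:8001/consumers/example-user/key-auth \
  --data 'key=my-secret-key'

# Requests through the proxy now require the key
$ curl -i -X GET \
  --url http://localhost:8000/ \
  --header 'Host: example.com' \
  --header 'apikey: my-secret-key'
```

Requests without a valid `apikey` header are rejected with a 401 before they ever reach your upstream service.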
Get Started in 1 minute
Add your Service and Route on Kong
After installing and starting Kong, use the Admin API on port 8001 to add a new Service and Route. In this example, Kong will reverse proxy every incoming request whose host matches the specified host to the associated upstream URL. Beyond simple host matching, you can implement far more complex routing rules.
Add Plugins on the Service
Then add extra functionality by using Kong Plugins. You can also create your own plugins!
Make a Request
...and then you can consume the Service on port 8000 by requesting the specified host. In production, set up public DNS for the host to point to your Kong cluster. Kong supports much more functionality; explore the Kong Hub and the documentation.
$ curl -i -X POST \
--url http://localhost:8001/services/ \
--data 'name=example-service' \
--data 'url=http://example.com'
$ curl -i -X POST \
--url http://localhost:8001/services/example-service/routes \
--data 'hosts[]=example.com'
$ curl -i -X POST \
--url http://localhost:8001/services/example-service/plugins/ \
--data 'name=rate-limiting' \
--data 'config.minute=100'
$ curl -i -X GET \
--url http://localhost:8000/ \
--header 'Host: example.com'
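To confirm that the rate-limiting plugin added above is active, inspect the headers Kong attaches to proxied responses. The exact header names below assume the plugin's defaults and may vary between Kong versions:

```shell
# Dump only the response headers from a proxied request
$ curl -s -o /dev/null -D - \
  --url http://localhost:8000/ \
  --header 'Host: example.com'
# With the plugin active, the response includes headers such as:
#   X-RateLimit-Limit-Minute: 100
#   X-RateLimit-Remaining-Minute: 99
```

Once the per-minute quota is exhausted, Kong answers further requests itself with an HTTP 429 instead of forwarding them upstream.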
Benefits
Unparalleled Performance
Achieve the industry’s best latency performance with Kong’s ultra-performant core. Ensure seamless communication across all your services regardless of where they run.
Designed for Cloud Scale
Maximize resource efficiency and minimize footprint with Kong’s lightweight core. Scale Kong nodes linearly to process trillions of API calls with no loss of performance or stability.
Unmatched Flexibility and Extensibility
Run Kong in any environment, on any platform, and in any implementation pattern, from bare metal to cloud, containers to Kubernetes. Easily extend Kong to fit any use case with custom plugins.
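As one illustration of running Kong in containers, the sketch below starts a single node in DB-less (declarative) mode from the official Docker image. The image tag and the path to the declarative config file are illustrative; consult the installation documentation for your platform:

```shell
# Start a single Kong node in DB-less mode from the official image.
# Assumes a declarative config file kong.yml in the current directory;
# the tag "kong:2.1" and mount path are illustrative.
$ docker run -d --name kong \
    -e "KONG_DATABASE=off" \
    -e "KONG_DECLARATIVE_CONFIG=/kong/kong.yml" \
    -v "$(pwd)/kong.yml:/kong/kong.yml" \
    -p 8000:8000 \
    -p 8001:8001 \
    kong:2.1
```

DB-less mode loads all Services, Routes, and plugins from the YAML file at startup, so no database is required for a simple deployment.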

Want to learn more?
Request a demo or a free trial, or talk to our experts, who can answer your questions and explore your needs.