What is an API Gateway?
An API Gateway is a middleware layer that sits between clients and your API-based applications, orchestrating common functionality across scalable, distributed architectures.
It extends your APIs easily and consistently, adding powerful capabilities and essential features like security, traffic control and analytics, which we will examine in more detail below. This enables you to configure and deploy the APIs you need in order to request and return mission-critical data in a consistent and usable form, with a centralized API dashboard to manage the flow of that information.
The Kong API Gateway is open-source, cloud-native and platform agnostic, allowing it to be deployed in a wide range of different patterns including monolithic, service-based, serverless and microservices/service mesh-based. It is modular, with the ability to add new functionality using plugins installed and configured via a RESTful Admin API, enabling completely custom instances of the Kong API Gateway that meet the unique needs of your organization. By supporting REST APIs, Kong enables you to use, manage and update applications using the most common API standard for web browser, mobile and Internet of Things access.
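As a sketch of what "installed and configured via a RESTful Admin API" means in practice, the snippet below builds the request that would enable a plugin on a Service. The Admin API address and the "orders" Service name are illustrative; actually sending the request (with urllib, curl, or any HTTP client) is left to the caller.

```python
import json

# Assumed: Kong's Admin API listening locally on its conventional port.
ADMIN_API = "http://localhost:8001"

def enable_plugin(service: str, plugin: str, config: dict) -> tuple[str, bytes]:
    """Build the (url, body) pair for POST /services/{service}/plugins."""
    url = f"{ADMIN_API}/services/{service}/plugins"
    body = json.dumps({"name": plugin, "config": config}).encode("utf-8")
    return url, body

# Example: throttle the "orders" Service to 100 requests per minute.
url, body = enable_plugin("orders", "rate-limiting", {"minute": 100})
```

Because the plugin is attached through a plain HTTP call, no application code changes or redeployments are needed to alter gateway behavior.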
A good API Gateway offers a compelling business case for SMEs and enterprise users alike, particularly in organizations that are driven by mission-critical data whether obtained from human customers via the web or mobile apps, received from IoT devices, or generated internally by employees.
Policies of the Kong API Gateway
Policies are a way to adjust the behavior of APIs without altering the programming code, acting as external configuration files that can be used to add management capabilities to the API.
They are typically used for certain common capabilities, although in principle, it is possible to write custom policies to achieve a broad range of different outcomes and to extend the functionality of APIs in innovative ways.
Some of the most common applications of API policies include:
- Traffic Limiting.
- Data Transformation.
As we will see below, this aligns closely with some of the most powerful features of the Kong API Gateway and with some of the main outcomes clients aim to achieve by implementing it.
What this means is that you have excellent control over the capabilities of your APIs at all times; you can adjust API policies and alter the results you get back from your APIs, without the need to alter the API programming code itself.
Learn the relatively simple schema for the policies you need to adjust, and you can alter them directly, allowing those changes to be made much faster and with greater flexibility, and in quick response to any changing circumstances in your data processing needs.
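To make this concrete, the sketch below treats a policy as what it is: a small document of named fields, adjusted without touching API code. The field names follow the shape of Kong's rate-limiting plugin schema, but treat the exact keys and values as illustrative.

```python
# A policy is external configuration: changing it changes API behavior
# without altering the API's code. Keys shown are illustrative.
policy = {
    "name": "rate-limiting",
    "config": {"minute": 20, "policy": "local"},
}

def adjust_policy(policy: dict, **changes) -> dict:
    """Return a copy of the policy with updated config values."""
    return {**policy, "config": {**policy["config"], **changes}}

# Traffic grew? Raise the throttle limits by editing two values.
busier = adjust_policy(policy, minute=100, hour=2000)
```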
In addition to the rapid deployment and ease of updating microservices, which we will look at later, this means your administrative and development teams are freed up to focus on the tasks that require their full attention, without investing time unnecessarily into redundant activity.
By allowing you to define and update policies as you see fit, Kong puts you in total control of your API Gateway deployment, and of the APIs you use daily to achieve your business activities.
Features of the Kong API Gateway
Kong is open source, which means that in principle it can be developed to achieve any purpose; as standard, a Kong API Gateway deployment creates an environment in which all of your APIs can work together.
Instead of receiving API data via separate streams that cannot interact, Kong gives you the option to aggregate that data into a single versatile datastore where it can all be processed and manipulated together to achieve your business goals.
The Kong API Gateway also offers several features that give you robust control over security, traffic and visualization of performance data.
Some of those features include:
- Analytics: Detailed data analytics to inspect, monitor and visualize your microservices traffic.
- Authentication: An authentication layer to protect your services and ensure only authorized access.
- Logging: Real-time and continuous logging of API request and response data.
- Serverless: Deploy serverless functions without the need to restart or redeploy Kong.
- Traffic Control: Granular control over inbound and outbound API traffic, including the ability to throttle and restrict traffic as required.
- Transformations: Handle request and response data transformations on the fly.
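To illustrate the Transformations feature, the sketch below mimics, in heavily simplified form, what a request-transformer style plugin does: apply an add/remove configuration to a request's headers before it is proxied upstream. The configuration shape here is illustrative, not the plugin's exact schema.

```python
# Simplified on-the-fly request transformation, in the spirit of a
# request-transformer plugin. Config shape is illustrative.
def transform_request(headers: dict, config: dict) -> dict:
    out = dict(headers)
    for name in config.get("remove", []):
        out.pop(name, None)          # strip headers not meant for upstream
    out.update(config.get("add", {}))  # inject gateway-added headers
    return out

config = {"remove": ["X-Internal-Token"], "add": {"X-Gateway": "kong"}}
result = transform_request(
    {"Accept": "application/json", "X-Internal-Token": "secret"}, config
)
```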
These features are delivered via the powerful open-source Kong API Gateway platform, built on a lightweight proxy that gives huge scalability and unbeaten latency performance for your microservices applications.
Crucially, for enterprise clients, Kong can be scaled horizontally across multiple nodes, including private datacenter environments and public clouds, with excellent security and end-to-end encryption as required for compliance and data protection.
Weighted load balancing keeps everything running smoothly, delivering industry-leading latency and stable performance even when scaling to process trillions of API calls.
Why use an API Gateway?
The enterprise use case for an API Gateway is compelling. It gives you centralized control over your APIs and microservices and, by extension, the data that you request and return from them.
APIs themselves are a secure, modular and reusable way to collect data. With good internal and external API documentation, you can ensure that developers and clients alike have clear instructions on their use.
REST APIs, in particular, have become a powerful way to implement new web applications with intuitive user interfaces, improving the user experience for clients while simultaneously improving the quality of the data you receive.
Some of the main benefits of the data received via APIs include:
- Accuracy: Data is returned directly from an authoritative source and can be verified to ensure that it is complete and correct.
- Cleanliness: APIs are designed to return data in a usable, accessible format, free from ‘bad’ data and capable of being retrieved by your system in the preferred filetype.
- Speed: Data is returned in real-time, giving much faster updates than manually pulling information from the server.
By implementing APIs using the Kong API Gateway, you gain access to all of the advantages of APIs along with centralized management and flexible deployment of new modules and microservices.
This is ideal for organizations of all sizes, whether you are a small to medium-sized business with a growing list of needs for new APIs, or a vast enterprise with many different APIs that would benefit from a central, consistent management platform.
Kong itself offers an even more compelling case than most API Gateways. It is open source, which means enterprise development teams can modify the API Gateway itself in order to create truly bespoke deployments.
By using industry standards like JSON and HTTP/HTTPS, Kong is widely compatible with many microservice development tools, web technologies and orchestration platforms, and is extensively tested to ensure compatibility across different datastores, web and proxy servers, operating environments and APIs.
Kong API Gateway Best Practices
For rapid deployment of the Kong API Gateway, follow the five-minute quickstart guide. This will take you through the basics of starting Kong, checking that it has started successfully, stopping it and/or reloading it without downtime.
The Quickstart guide requires Kong to be installed and your database connection settings to be correctly configured.
If everything goes well, within just a few minutes, you can have the Kong API Gateway up and running, giving you access to the RESTful Admin interface to manage Consumers, Routes and Services.
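The quickstart's core workflow is registering a Service (a backend to proxy to) and a Route (how clients reach it) through the Admin API. The sketch below builds those two requests; the upstream URL and path are illustrative, and sending the requests is left to your HTTP client of choice.

```python
import json

# Assumed: Kong's Admin API listening locally on its conventional port.
ADMIN_API = "http://localhost:8001"

def create_service(name: str, upstream_url: str) -> tuple[str, bytes]:
    """POST /services registers a backend the gateway can proxy to."""
    return (f"{ADMIN_API}/services",
            json.dumps({"name": name, "url": upstream_url}).encode())

def create_route(service: str, path: str) -> tuple[str, bytes]:
    """POST /services/{service}/routes exposes the Service on a path."""
    return (f"{ADMIN_API}/services/{service}/routes",
            json.dumps({"paths": [path]}).encode())

svc_url, svc_body = create_service("example", "http://example.internal:8080")
rt_url, rt_body = create_route("example", "/example")
```

Once both requests succeed, traffic to the proxy port under `/example` is forwarded to the registered upstream.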
Migrating Data from other API Gateways
Migrating to the Kong API Gateway is relatively easy, and requires only two steps: migrating your data, then updating your network settings.
Depending on your existing API Gateway, you may need to write a simple script to convert exported JSON or CSV data into requests that trigger Kong to provision the appropriate APIs, Consumers and Plugins.
Data migration is made easier thanks to Kong's RESTful Admin API, and you can check that everything has transferred across correctly in a staging environment before you update your network settings to point to your Kong server or cluster.
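A minimal sketch of such a conversion script is shown below. It assumes, purely for illustration, that the previous gateway exports consumers as a JSON list of objects with a `username` field; real export formats differ between products, so the parsing step is the part you would adapt.

```python
import json

# Hypothetical migration sketch: turn a JSON export from a previous
# gateway into POST /consumers requests for Kong's Admin API.
# The export shape (list of {"username": ...}) is an assumption.
ADMIN_API = "http://localhost:8001"

def consumers_to_requests(export_json: str) -> list[tuple[str, bytes]]:
    requests = []
    for record in json.loads(export_json):
        body = json.dumps({"username": record["username"]}).encode()
        requests.append((f"{ADMIN_API}/consumers", body))
    return requests

export = '[{"username": "alice"}, {"username": "bob"}]'
reqs = consumers_to_requests(export)
```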
Clustering is a powerful feature of Kong’s scalability, giving you the ability to add more machines to handle incoming requests — effectively scaling your API Gateway horizontally.
In order to enable multiple nodes to work together as a single API Gateway, you must make sure that they belong to the same cluster.
Clustering allows the different nodes to share the same datastore and can be combined with load balancing to distribute traffic equally across all of your nodes.
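In configuration terms, joining a cluster amounts to pointing every node at the same datastore in `kong.conf`; the hostname and database name below are placeholders.

```
# kong.conf on EVERY node in the cluster: point at the same datastore.
# Host and database name are placeholders for your environment.
database = postgres
pg_host = shared-db.internal
pg_database = kong
```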
Kong supports multiple types of health checks to identify unhealthy targets on individual Kong nodes.
- Active Checks: Periodically request a specific HTTP or HTTPS endpoint and mark it as healthy or unhealthy based on its response.
- Passive Checks: Analyze proxied traffic on an ongoing basis to determine the health of targets. This method is also known as a circuit breaker.
By actively and passively monitoring the health of targets, you can take remedial action when needed to restore functionality and ensure all nodes in a Kong cluster have access to the endpoints they require.
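Health checks are configured per upstream through the Admin API. The sketch below builds a request enabling both kinds of check; the field names follow the shape of Kong's `healthchecks` schema, but treat the exact values, and the upstream name, as illustrative.

```python
import json

# Assumed: Kong's Admin API listening locally on its conventional port.
ADMIN_API = "http://localhost:8001"

def healthcheck_request(upstream: str) -> tuple[str, bytes]:
    """Build a PATCH /upstreams/{name} body enabling both check types."""
    body = json.dumps({
        "healthchecks": {
            "active": {                 # periodically probe each target
                "http_path": "/health",
                "healthy": {"interval": 5, "successes": 2},
                "unhealthy": {"interval": 5, "http_failures": 3},
            },
            "passive": {                # circuit breaker on live traffic
                "unhealthy": {"http_failures": 5},
            },
        }
    }).encode()
    return f"{ADMIN_API}/upstreams/{upstream}", body

url, body = healthcheck_request("orders-upstream")
```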
Kong allows load balancing using several different methods:
- A Records: Using an A record containing multiple IP addresses, all entries will be treated equally in a round robin.
- DNS-based: DNS-based load balancing allows backend service registration to occur outside of Kong with periodic updates from the DNS server.
- SRV Records: SRV records can contain IP addresses, port information and weighting, allowing multiple instances of a service to run via different ports on the same IP address.
The last method is particularly useful, as the load balancer can favor individual service instances according to their assigned weight, rather than treating them all equally.
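To see why weighting matters, the sketch below simulates a simple weighted round-robin, a simplified stand-in for what the balancer does with SRV weights: a target with weight 2 receives twice the share of requests of a target with weight 1. The target addresses are placeholders.

```python
import itertools

# Simplified weighted round-robin: each target appears in the rotation
# in proportion to its weight, so heavier targets receive more traffic.
# Real SRV-based balancing is more sophisticated; this shows the idea.
def weighted_rotation(targets: dict[str, int]) -> list[str]:
    """Expand {target: weight} into one full rotation of the balancer."""
    rotation = []
    for target, weight in targets.items():
        rotation.extend([target] * weight)
    return rotation

def dispatch(targets: dict[str, int], n_requests: int) -> list[str]:
    """Assign n_requests to targets by cycling the weighted rotation."""
    cycle = itertools.cycle(weighted_rotation(targets))
    return [next(cycle) for _ in range(n_requests)]

assignments = dispatch({"10.0.0.1:8000": 2, "10.0.0.2:8000": 1}, 6)
```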
Secure Admin API
Kong’s Admin API gives full control of your Kong installation and should be secured against access by unauthorized individuals.
There are many ways to do this, including network layer access restrictions, specified IP ranges for external access, and fine-grained access control by using Kong itself as a proxy for access to its own Admin API.
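At the network layer, the simplest restriction is binding the Admin API to the loopback interface in `kong.conf`, so only local processes, or Kong itself acting as a proxy in front of its own Admin API, can reach it. The values below are illustrative.

```
# kong.conf: bind the Admin API to loopback only, so it is not
# reachable from outside the host. Ports shown are the conventional ones.
admin_listen = 127.0.0.1:8001, 127.0.0.1:8444 ssl
```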
Enterprise users also benefit from role-based access control which can use user roles and permissions to grant access to the Admin API, with excellent scalability for case-specific and complex uses.
What issues might I encounter?
Kong is designed to be agile, flexible and massively scalable, and with some sensible preparation, it should be possible to avoid encountering any significant issues even in complex enterprise deployments.
Here are some issues to keep in mind, especially when scaling Kong across multiple nodes or in a way that significantly alters the size of the Kong datastore.
Scaling Kong Server
You can scale Kong Server upwards by adding new nodes as required. Remember the above best practice on clustering when doing this.
New nodes must point to the same Kong datastore in order to interoperate, and you should ensure that new nodes are also subject to load balancing to ensure good performance.
Scaling Kong datastore
Kong datastore traffic is typically quiet because Kong maintains its own cache; however, it is important not to allow the datastore to become a single point of failure for your organization.
This can be prevented simply through close monitoring of the datastore and by keeping an up-to-date backup in case of emergencies.
The Kong API Gateway is fully platform agnostic, which means you are free to use public cloud environments and/or private datacenter servers as you wish.
Moving applications between different cloud platforms or physical servers is straightforward, and Kong can operate in a hybrid environment that combines the two in a single configuration, so there should be no issues when moving applications to any type of platform.
Where to find help
If you encounter any issues that you don’t know how to resolve, Kong Nation is the place to go.
This dynamic forum is where Kong users from all over the world come together to share tips and tricks, best practice guidance, and to help each other resolve any issues as they arise.
It’s also where Kong Inc. and other community members can make announcements, so you are always in the loop when new features are implemented, or other important news is announced.
Want to learn more?
Request a demo to talk to our experts to answer your questions and explore your needs.