Governing GraphQL APIs with Kong Gateway
Modern software design relies heavily on distributed systems architecture, which requires every API to be robust and secure. GraphQL is no exception: it is commonly served over HTTP, subjecting it to the same management concerns as any REST-based API. In fact, GraphQL's dynamic client queries can expose a larger and more complex attack surface than a traditional REST API.
The GraphQL official documentation makes the following recommendation:
GraphQL should be placed after all authentication middleware, so that you have access to the same session and user information you would in your HTTP endpoint handlers.
In this post, we will evaluate proxying GraphQL endpoints with Kong Gateway, which provides exceptional capabilities for protecting and governing multiple API technologies.
Our previous post looked at building an experience API using GraphQL for our fictitious airline, KongAir. Today, we’ll look at managing and protecting that API with Kong Gateway on Kong Konnect using APIOps.
The API gateway layer
API gateways are prominently situated at the edge of your infrastructure and negotiate traffic between front-end clients and back-end services.
Modern API gateways must provide some key functionalities, including:
- Traffic Controls with features such as request routing, load balancing, and rate limiting
- Security Controls including authentication and authorization, conforming to a number of common security standards and protocols
- Monitoring, logging and analytics to support both open and proprietary standards
- Performance capabilities like caching and gateway-level request and response transformations
Critically, the gateway abstracts these requirements away from your service developers, allowing them to focus on business needs.
Kong Gateway and GraphQL Plugins
Kong Gateway provides all of the above capabilities and more, including native GraphQL support for caching and rate limiting.
GraphQL Proxy Caching
In modern distributed architectures, performance optimization and rapid response times are pivotal. Proxy caching is a common technique for improving response times and reducing upstream service load. Kong's GraphQL Proxy Caching Advanced plugin gives users a simple mechanism for deploying a reverse GraphQL proxy cache.
The plugin keys each cached element on the GraphQL query sent in the HTTP request body and, optionally, a configurable set of request headers. It also supports a configurable time-to-live (TTL) so admins can control the lifetime of a cached response.
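For illustration, a minimal plugin entry in decK's declarative format might look like the sketch below. The TTL value and the vary header are assumptions chosen for the example; check the plugin documentation for the full set of options in your gateway version:

```yaml
plugins:
  - name: graphql-proxy-cache-advanced
    config:
      strategy: memory       # keep cached responses in the gateway node's memory
      cache_ttl: 30          # seconds before a cached GraphQL response expires
      vary_headers:          # optionally fold these request headers into the cache key
        - Authorization
```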
GraphQL Rate Limiting
Resource protection is critical to ensuring robust and available systems. Rate Limiting is commonly used to mitigate denial-of-service attacks, helping to maintain system health and stability. GraphQL services are no different, and the Kong GraphQL Rate Limiting Advanced plugin provides a GraphQL-aware algorithm for calculating request cost.
GraphQL services typically respond to queries from a single HTTP path, and due to the nature of dynamic client GraphQL queries, normal HTTP path-based rate limiting is not sufficient. The same HTTP request to the same URL with the same method can vary greatly in cost depending on the semantics of the GraphQL operation in the body. See the documentation for more details on how request costs are calculated.
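To make that concrete, both of the operations below would be sent as POSTs to the same /query endpoint, yet the second fans out across nested relationships and costs far more to resolve. The schema and field names here are hypothetical:

```graphql
# Cheap: fetches a single flight's number.
query CheapQuery {
  flight(id: "KA0284") {
    number
  }
}

# Expensive: walks every flight, its route, and every passenger's bookings.
query ExpensiveQuery {
  flights {
    number
    route {
      origin
      destination
    }
    passengers {
      name
      bookings {
        seat
        fare
      }
    }
  }
}
```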
Deploying GraphQL services and plugins with APIOps
In the previous post, we introduced the KongAir Experience API built with GraphQL. Today, we are going to look at deploying GraphQL plugins on Kong Gateway in front of our GraphQL service, protecting it and improving overall system performance.
KongAir follows an automated APIOps process using Kong’s decK tooling to deploy APIs onto Kong Konnect, a complete SaaS API management platform. All of the code we will look at in this post is available in the KongAir GitHub repository.
Services, routes and plugins are configured using a combination of OpenAPI specifications and decK declarative format. Let’s start by looking at the decK declarative definition that configures our GraphQL service:
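The snippet below is a representative sketch of that definition; the service name, upstream host, and port are illustrative, and the exact values live in the KongAir repository:

```yaml
_format_version: "3.0"
services:
  - name: experience                      # illustrative service name
    protocol: http
    host: experience.kongair.internal     # illustrative upstream host
    port: 8080
    routes:
      - name: experience-graphql
        paths:
          - /query                        # the path our GraphQL service listens on
```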
The above is a snippet from a partial decK file that includes a Kong Gateway service declaration for our GraphQL service. This section specifies the connectivity details that allow Kong Gateway to proxy traffic to our upstream GraphQL service. The route matches /query, the URL path our GraphQL service listens on.
Next, let’s focus on the GraphQL plugins we associate with this service, from the same file:
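The sketch below shows the shape of that plugin section; the shared configuration name and exact field values are illustrative, and the authoritative version is in the repository:

```yaml
plugins:
  - name: jwt
    _config: jwt-auth            # shared config referenced via decK de-duplication (name illustrative)
  - name: graphql-proxy-cache-advanced
    config:
      strategy: memory           # cache responses in gateway memory
      cache_ttl: 15              # expire cached entries after 15 seconds
  - name: graphql-rate-limiting-advanced
    config:
      cost_strategy: default     # use the default request-cost calculation
      limit:
        - 500                    # allowed query cost per window
      window_size:
        - 120                    # window length in seconds
```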
We’ve associated 3 plugins with our GraphQL service:
- The `jwt` plugin allows us to protect the service by requiring and validating JSON Web Tokens. The configuration for this plugin is shared with other services in the KongAir system, and the decK config de-duplication feature is used via the `_config` key, which indicates a shared configuration.
- The `graphql-proxy-cache-advanced` configuration provides caching services for any request bound for the experience service. In this example, we configure the plugin to store cached responses in memory and expire them after 15 seconds.
- Similarly, the `graphql-rate-limiting-advanced` plugin is installed and configured with a request cost limit of 500 per 120-second window. Limits are calculated per Consumer by default, but can be modified to be per IP address or credential.
The complete file is available in the GitHub repository.
As mentioned earlier, KongAir uses decK and APIOps to automate the delivery of APIs to Kong Gateway. GitHub Actions are used along with various decK APIOps features to assemble a final declarative configuration and synchronize it to Kong Gateway.
The relevant command that merges our experience API definition looks like the following:
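A sketch of that command is shown below; the input and output file paths are illustrative, and the exact invocation lives in the workflow file referenced next:

```bash
# Merge the experience API's decK file with the rest of the KongAir
# configuration into a single declarative file.
deck file merge \
  kong/experience-service.yaml \
  kong/kongair-platform.yaml \
  -o kong/generated/kong.yaml
```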
These commands are part of a broader set of GitHub Actions that assemble, validate, and deploy the configuration. The above command can be found in the stage-changes-for-kong.yaml file.
These APIOps workflows follow a GitOps style methodology and use GitHub Pull Requests to stage changes to the KongAir production system.
Using these technologies together allows a combination of automation and human oversight to ensure auditable and accurate changes to the gateway configuration.
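Once a pull request is approved and merged, the assembled configuration is synchronized to the gateway. A minimal sketch of that step, assuming a recent decK release targeting a Konnect control plane (the token, control plane name, and file path are all illustrative):

```bash
deck gateway sync kong/generated/kong.yaml \
  --konnect-token "$KONNECT_TOKEN" \
  --konnect-control-plane-name kongair
```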
Kong Konnect
The KongAir APIOps automation deploys to a Kong Gateway managed by Kong Konnect. Konnect provides KongAir with a full API management solution. Here are some example benefits of the Konnect platform.
Using Konnect's Gateway Manager, we have a clear visual display of our deployed security and GraphQL traffic management plugins. From this display, we can inspect all aspects of the plugins, ensuring proper configuration and deployment.
Konnect’s Analytics capabilities allow us to gain further insight into our API usage. Customized reports allow us to focus on individual API metrics important to the observability of our system.
Client Verification
Now that we have deployed the GraphQL native plugins in front of our GraphQL API, let’s verify they are working as expected from the client's perspective.
When a client makes a GraphQL request now, it is subject to the rate-limiting plugin and may benefit from a cache hit. The following example shows the response headers returned by Kong.
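The exchange below is illustrative rather than captured output: the hostname, query shape, and header values (including the cache key) are examples, but the headers themselves are the ones the plugins add:

```
curl -s -i https://api.kongair.example/query \
  -H 'Authorization: Bearer <jwt>' \
  -H 'Content-Type: application/json' \
  -d '{"query": "{ flights { number route { origin destination } } }"}'

HTTP/1.1 200 OK
content-type: application/json
x-gql-query-cost: 12
x-cache-key: 4aeb2f7d0b64e8c1290a
x-cache-status: Miss
```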
From the response headers, you can see the plugins in action.
The rate-limiting plugin has calculated the cost of our request at 12, reported in the `x-gql-query-cost` header. This value is calculated using the default cost strategy.
The GraphQL caching plugin has also responded with headers indicating how it processed the request. In the response, we see that a cache key for our request has been calculated and reported in the `x-cache-key` header. In this example, the request was a cache miss, as indicated by the `x-cache-status` header. If a Consumer repeats the request within the plugin's `cache_ttl` window, you would see `x-cache-status: Hit` instead.
Summary
Governing GraphQL APIs is critical to your system's availability and robustness. Kong Gateway provides GraphQL-native plugin capabilities for rate limiting and response caching. These plugins, paired with Kong Konnect, provide a full-featured, unified API management solution for your GraphQL services. For more advanced usage of GraphQL with Kong, see our solution page on next-generation API platforms with Kong and Apollo, and get started with Kong Konnect today for free.