Why Do Microservices Need an API Gateway?

An API gateway is a software application that sits between client applications and a set of backend microservices.

The API gateway serves as a reverse proxy, accepting API calls from client applications and forwarding that traffic to the appropriate service. It also enforces security and helps ensure scalability and high availability.

 

In this article, we’ll look at why API gateways are critical to a microservice architecture, and we’ll cover the common features of an API gateway. Along the way, we’ll learn about the challenges that an API gateway can overcome and some of the caveats to bear in mind when using one.

Why Do We Need API Gateways?

The very advantages of a microservice architecture also present unique challenges, which API gateways are purpose-built to address.

Centralized access to decentralized microservices

A microservice architecture modularizes the many functions of an application so that each service can focus on implementing a specific business rule. This design pattern makes it easy to develop, test, deploy, and maintain different capabilities of an application. However, this approach also means increased complexity for clients to access those services.

An API gateway addresses this by handling several API calls simultaneously and routing them to the appropriate backend services. It can also decompose a single client call into multiple requests to different microservices and aggregate the results as they respond.
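
As a rough illustration, here is a minimal sketch in Go (standard library only) of a gateway endpoint that fans a single client call out to two backend services and aggregates their responses. The /profile route, the users.internal and orders.internal URLs, and the response fields are hypothetical placeholders, not the API of any particular gateway product.

```go
package main

import (
	"encoding/json"
	"io"
	"net/http"
)

// Hypothetical backend endpoints; in a real gateway these would come
// from configuration or a service registry.
var (
	userServiceURL  = "http://users.internal:8080/users/42"
	orderServiceURL = "http://orders.internal:8080/orders?user=42"
)

// fetchJSON performs a GET request and returns the raw JSON body.
func fetchJSON(url string) (json.RawMessage, error) {
	resp, err := http.Get(url)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()
	body, err := io.ReadAll(resp.Body)
	return json.RawMessage(body), err
}

// profileHandler decomposes one client call into two backend calls
// and aggregates the results into a single response.
func profileHandler(w http.ResponseWriter, r *http.Request) {
	user, err := fetchJSON(userServiceURL)
	if err != nil {
		http.Error(w, "user service unavailable", http.StatusBadGateway)
		return
	}
	orders, err := fetchJSON(orderServiceURL)
	if err != nil {
		http.Error(w, "order service unavailable", http.StatusBadGateway)
		return
	}
	w.Header().Set("Content-Type", "application/json")
	json.NewEncoder(w).Encode(map[string]json.RawMessage{
		"user":   user,
		"orders": orders,
	})
}

func main() {
	http.HandleFunc("/profile", profileHandler)
	http.ListenAndServe(":8000", nil)
}
```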

Management and discovery for scalable, distributed services

The elastic nature of the cloud allows services to scale horizontally as demand increases. However, taking advantage of this requires efficient load balancing, effective service discovery, and resiliency features like retries and timeouts.

API gateways can load balance between replicas of the same service, which allows for better resource utilization. Traditional proxies employ basic load balancing algorithms like random or round robin, while API gateways can use more sophisticated algorithms like weighted round robin, least connections, or even custom implementations that leverage the service registry. These techniques provide efficient traffic routing. A gateway can do all this while implementing resiliency features such as retrying failed requests, rerouting traffic to healthy service instances, and handling errors gracefully.
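
As a sketch of the kind of logic involved, the Go example below implements a simple round-robin picker over the replicas of one service, retrying against the next replica when one fails. The replica addresses are placeholders, and real gateways implement these algorithms far more robustly.

```go
package main

import (
	"fmt"
	"net/http"
	"sync/atomic"
)

// Placeholder replica addresses for a single logical service.
var replicas = []string{
	"http://orders-1.internal:8080",
	"http://orders-2.internal:8080",
	"http://orders-3.internal:8080",
}

var next uint64 // round-robin counter

// pick selects the next replica in round-robin order.
func pick() string {
	n := atomic.AddUint64(&next, 1)
	return replicas[n%uint64(len(replicas))]
}

// getWithRetry tries replicas in turn until one responds successfully,
// giving up after every replica has been attempted once.
func getWithRetry(path string) (*http.Response, error) {
	var lastErr error
	for i := 0; i < len(replicas); i++ {
		resp, err := http.Get(pick() + path)
		if err == nil && resp.StatusCode < 500 {
			return resp, nil // a healthy replica answered
		}
		if err == nil {
			resp.Body.Close()
			err = fmt.Errorf("replica returned %d", resp.StatusCode)
		}
		lastErr = err // remember the failure and move to the next replica
	}
	return nil, lastErr
}

func main() {
	resp, err := getWithRetry("/orders/42")
	if err != nil {
		fmt.Println("all replicas failed:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("served by a healthy replica:", resp.Status)
}
```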

Abstraction for microservice language and protocol independence

Developers can build different microservices using different technology stacks and protocols. For example, they could develop a service in Java and expose it as a RESTful API, or write it in Go and expose it through a gRPC interface. While this language and protocol independence yields considerable flexibility, it also means clients accessing these APIs would need to understand and implement several communication protocols.

However, the API Gateway can translate between protocols, allowing clients to call any service with a single protocol. API Gateways effectively hide the service implementations from client applications. As long as the interface definition remains the same, developers can implement changes to the service logic using any technology stack without the client knowing it.

This means a service written in Java can be rewritten in another language like Python and redeployed behind the Gateway. To the client, the interface will remain the same, and it can call the same function with the same parameters and get the same results without realizing the underlying technology and logic have changed.
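
A minimal reverse-proxy sketch in Go makes this concrete: the client-facing route stays the same, and only the upstream target changes when the service is reimplemented. The orders-v2.internal URL is an assumption for illustration.

```go
package main

import (
	"net/http"
	"net/http/httputil"
	"net/url"
)

func main() {
	// The client-facing route ("/orders/") never changes. Swapping the
	// Java implementation for a Python rewrite only means pointing this
	// upstream URL at the new deployment; clients are none the wiser.
	upstream, err := url.Parse("http://orders-v2.internal:8080")
	if err != nil {
		panic(err)
	}

	proxy := httputil.NewSingleHostReverseProxy(upstream)
	http.Handle("/orders/", proxy)
	http.ListenAndServe(":8000", nil)
}
```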

Routing to microservices based on deployment strategies

Feature improvements and bug fixes may require special handling at deployment time. Development teams may adopt a deployment strategy like canary release or blue/green deployments. To do so, they need to ensure that they’ve optimized their CI/CD pipelines for these strategies.

API gateways can be configured to route traffic according to these strategies, shifting traffic between old and new service versions as the rollout progresses.
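
For instance, a canary release might send a small percentage of traffic to the new version. The Go sketch below splits traffic roughly 90/10 between two versions; the payment service URLs and the 10% weight are illustrative, and real gateways usually express this as configuration rather than code.

```go
package main

import (
	"math/rand"
	"net/http"
	"net/http/httputil"
	"net/url"
)

func mustParse(raw string) *url.URL {
	u, err := url.Parse(raw)
	if err != nil {
		panic(err)
	}
	return u
}

func main() {
	stable := httputil.NewSingleHostReverseProxy(mustParse("http://payments-v1.internal:8080"))
	canary := httputil.NewSingleHostReverseProxy(mustParse("http://payments-v2.internal:8080"))

	http.HandleFunc("/payments/", func(w http.ResponseWriter, r *http.Request) {
		// Send roughly 10% of requests to the canary version; the rest
		// continue to hit the stable version.
		if rand.Intn(100) < 10 {
			canary.ServeHTTP(w, r)
			return
		}
		stable.ServeHTTP(w, r)
	})
	http.ListenAndServe(":8000", nil)
}
```

Adjusting the weight over time (10%, then 50%, then 100%) is what turns this into a gradual canary rollout, while a blue/green deployment would simply switch all traffic from one upstream to the other.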

Traffic control to prevent overloading of resources

Internet-facing services are targets for malicious attacks and exploitation. They need to handle sudden surges in requests from valid users and bad actors. This means request throttling and blacklist capabilities are necessary to keep systems reliable and secure.

API gateways are an effective barrier against Distributed Denial of Service (DDoS) attacks: by throttling the number of requests made to affected services, the gateway keeps those services from being overwhelmed and becoming unresponsive.
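
As an illustration of request throttling, the Go sketch below applies a crude fixed-window limit of 100 requests per client IP per minute in front of a protected handler. Production gateways use more precise algorithms (token buckets, sliding windows) and distributed counters; the threshold here is only an example.

```go
package main

import (
	"net"
	"net/http"
	"sync"
	"time"
)

// counters holds request counts per client IP for the current window.
var (
	mu       sync.Mutex
	counters = map[string]int{}
)

const limitPerMinute = 100 // illustrative threshold

// resetLoop clears all counters at the start of each one-minute window.
func resetLoop() {
	for range time.Tick(time.Minute) {
		mu.Lock()
		counters = map[string]int{}
		mu.Unlock()
	}
}

// throttle rejects clients that exceed the per-window limit.
func throttle(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		ip, _, _ := net.SplitHostPort(r.RemoteAddr)
		mu.Lock()
		counters[ip]++
		over := counters[ip] > limitPerMinute
		mu.Unlock()
		if over {
			http.Error(w, "too many requests", http.StatusTooManyRequests)
			return
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	go resetLoop()
	protected := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("ok"))
	})
	http.ListenAndServe(":8000", throttle(protected))
}
```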

Common Features and Benefits of API Gateways

An API gateway is located at the outer edge of your microservices and acts as a proxy that manages all ingress traffic. From this position, it provides several important capabilities.

Authentication, Authorization, and Audit

Every request made to your service needs to be authenticated to ensure only valid users have access. An API Gateway authenticates all traffic before routing it to the called service. The API Gateway can perform the authentication itself or use external authentication providers for that task.

Authorization is the process that ensures your users have the necessary permissions to request the action from a service. As with authentication, an API gateway can use its own access control lists (ACLs) for authorization or fetch authorization information from an external provider.

Since all calls must pass through the API Gateway, this provides system administrators with an invaluable audit trail of who, when, and what—helping to troubleshoot errors, monitor performance, or address security incidents.
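
To show how these three concerns stack up at the gateway, the Go middleware sketch below checks a bearer token with a stand-in validation function, consults a small in-memory ACL, and writes an audit log line for every request. The validateToken function, the ACL contents, and the user names are hypothetical; a real gateway would delegate to an identity provider and a proper policy store.

```go
package main

import (
	"log"
	"net/http"
	"strings"
)

// acl maps a user to the path prefixes they may access (placeholder data).
var acl = map[string][]string{
	"alice": {"/orders", "/profile"},
	"bob":   {"/profile"},
}

// validateToken is a stand-in for real token validation (for example via an
// external identity provider). Here it simply treats the token as the
// username, purely for illustration.
func validateToken(token string) (user string, ok bool) {
	_, known := acl[token]
	return token, known
}

// authorized checks the user's ACL entries against the requested path.
func authorized(user, path string) bool {
	for _, allowed := range acl[user] {
		if strings.HasPrefix(path, allowed) {
			return true
		}
	}
	return false
}

// secure wraps a handler with authentication, authorization, and audit logging.
func secure(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		token := strings.TrimPrefix(r.Header.Get("Authorization"), "Bearer ")
		user, ok := validateToken(token)
		if !ok {
			http.Error(w, "unauthenticated", http.StatusUnauthorized)
			return
		}
		if !authorized(user, r.URL.Path) {
			http.Error(w, "forbidden", http.StatusForbidden)
			return
		}
		// Audit trail: who called what (the log package adds the timestamp).
		log.Printf("audit user=%s method=%s path=%s", user, r.Method, r.URL.Path)
		next.ServeHTTP(w, r)
	})
}

func main() {
	backend := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("ok"))
	})
	http.ListenAndServe(":8000", secure(backend))
}
```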

Traffic Management

Traffic management is a broad category at the core of any API gateway. The most basic capability the gateway offers is a virtual endpoint for API clients that remains constant over time. This minimizes disruptive changes for clients when new versions of services are deployed, leading to a better developer experience overall.

With cloud-native orchestration platforms like Kubernetes, it’s easy to scale services across multiple nodes within a cluster and across clusters. However, this also means services are ephemeral, with locations (IP address and port) that change constantly. An API gateway can provide a service registry that keeps track of the available instances of each service.

As services integrate with the API Gateway, they can self-register with the service registry and report their availability. It’s also possible to use third-party registration services. With a service registry, the Gateway knows where to route the client request at any time.
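
At its simplest, a service registry is a map from service name to the addresses of live instances, updated as services come and go. The Go sketch below shows that minimal shape; the service name and addresses are made up, and production registries add health checking, leases, and watch semantics on top.

```go
package main

import (
	"fmt"
	"math/rand"
	"sync"
)

// Registry maps a service name to the addresses of its live instances.
type Registry struct {
	mu        sync.RWMutex
	instances map[string][]string
}

func NewRegistry() *Registry {
	return &Registry{instances: map[string][]string{}}
}

// Register is called by a service instance when it becomes available.
func (r *Registry) Register(service, addr string) {
	r.mu.Lock()
	defer r.mu.Unlock()
	r.instances[service] = append(r.instances[service], addr)
}

// Resolve returns one live address for the service, or an error if none exist.
func (r *Registry) Resolve(service string) (string, error) {
	r.mu.RLock()
	defer r.mu.RUnlock()
	addrs := r.instances[service]
	if len(addrs) == 0 {
		return "", fmt.Errorf("no instances registered for %q", service)
	}
	return addrs[rand.Intn(len(addrs))], nil
}

func main() {
	reg := NewRegistry()
	// Instances self-register as they come up (addresses are illustrative).
	reg.Register("orders", "10.0.1.5:8080")
	reg.Register("orders", "10.0.2.7:8080")

	addr, _ := reg.Resolve("orders")
	fmt.Println("routing request to", addr)
}
```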

Threat Protection

An API gateway can protect your services from traffic spikes and DDoS attacks. DevOps teams can smooth and throttle traffic by implementing resiliency patterns like circuit breaking, protecting services from being overwhelmed with requests.
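
The circuit-breaker idea can be sketched in a few lines of Go: after a set number of consecutive failures, the breaker opens and rejects calls immediately for a cool-down period, giving the struggling service room to recover. The thresholds and timing below are arbitrary examples.

```go
package main

import (
	"errors"
	"sync"
	"time"
)

// Breaker trips open after maxFailures consecutive errors and stays open
// for cooldown before allowing calls through again.
type Breaker struct {
	mu          sync.Mutex
	failures    int
	openedAt    time.Time
	maxFailures int
	cooldown    time.Duration
}

var ErrOpen = errors.New("circuit open: request rejected")

// Call runs fn through the breaker, failing fast while the circuit is open.
func (b *Breaker) Call(fn func() error) error {
	b.mu.Lock()
	if b.failures >= b.maxFailures && time.Since(b.openedAt) < b.cooldown {
		b.mu.Unlock()
		return ErrOpen // fail fast instead of piling load on the service
	}
	b.mu.Unlock()

	err := fn()

	b.mu.Lock()
	defer b.mu.Unlock()
	if err != nil {
		b.failures++
		if b.failures >= b.maxFailures {
			b.openedAt = time.Now() // (re)open the breaker
		}
		return err
	}
	b.failures = 0 // a success closes the breaker again
	return nil
}

func main() {
	b := &Breaker{maxFailures: 3, cooldown: 30 * time.Second}
	_ = b.Call(func() error {
		// Call the downstream service here; any error counts as a failure.
		return nil
	})
}
```

While the breaker is open, clients receive an immediate error (or a cached or fallback response) rather than waiting on a timeout, which is what keeps a struggling service from being buried under retries.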

By inspecting incoming requests, an API gateway can stop many attacks before they reach your services. It can integrate with other applications like Security Information and Event Management (SIEM) systems or fraud detection systems to decide whether a request is legitimate or malicious. Some of its tasks may include:

  • API firewalling: mitigates application-level threats like Cross-Site Request Forgery (CSRF) or SQL injection (some API gateways can integrate with WAFs) as well as detects and blocks threats according to the OWASP Top Ten or by blacklisting bots.
  • Content validation: ensures each request has the correctly formatted input parameter values for the intended service and adheres to the API’s published interface (see the sketch after this list).
  • Integrity validation: confirms that requests have not been tampered with and that encrypted data in the request or response body remains confidential.
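
Content validation, for example, can be as simple as decoding the request body strictly against the published interface and rejecting anything that doesn’t conform, as in the Go sketch below. The CreateOrder shape and its fields are illustrative assumptions.

```go
package main

import (
	"encoding/json"
	"net/http"
)

// CreateOrder is the published shape of the request body (illustrative).
type CreateOrder struct {
	ProductID string `json:"product_id"`
	Quantity  int    `json:"quantity"`
}

// validateOrder rejects malformed or non-conforming requests at the edge.
func validateOrder(w http.ResponseWriter, r *http.Request) {
	dec := json.NewDecoder(r.Body)
	dec.DisallowUnknownFields() // reject fields the API never published

	var req CreateOrder
	if err := dec.Decode(&req); err != nil {
		http.Error(w, "malformed request body", http.StatusBadRequest)
		return
	}
	if req.ProductID == "" || req.Quantity <= 0 {
		http.Error(w, "missing or invalid fields", http.StatusBadRequest)
		return
	}
	// Request conforms to the published interface; forward it upstream here.
	w.WriteHeader(http.StatusAccepted)
}

func main() {
	http.HandleFunc("/orders", validateOrder)
	http.ListenAndServe(":8000", nil)
}
```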

Monitoring and Observability

An API gateway can collect metrics, logs, and traces about all inbound traffic passing through it. For example, based on this data, it can track request and response times, traffic patterns, and error trends. It can log important messages in either direction to help troubleshoot errors and improve the observability of a system. The location of the API gateway in the microservice architecture also allows it to add tracing information like correlation ID, enabling visibility of requests through the system.
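
For example, a gateway can stamp each inbound request with a correlation ID and record its latency, as in the Go middleware sketch below. The X-Correlation-ID header is a common convention rather than a standard, and the log line stands in for a real metrics pipeline.

```go
package main

import (
	"crypto/rand"
	"encoding/hex"
	"log"
	"net/http"
	"time"
)

// newCorrelationID generates a random ID used to trace a request across services.
func newCorrelationID() string {
	b := make([]byte, 8)
	rand.Read(b)
	return hex.EncodeToString(b)
}

// observe adds a correlation ID and logs method, path, and latency per request.
func observe(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		id := r.Header.Get("X-Correlation-ID")
		if id == "" {
			id = newCorrelationID()
			r.Header.Set("X-Correlation-ID", id) // propagated to downstream services
		}
		start := time.Now()
		next.ServeHTTP(w, r)
		log.Printf("correlation_id=%s method=%s path=%s duration=%s",
			id, r.Method, r.URL.Path, time.Since(start))
	})
}

func main() {
	backend := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("ok"))
	})
	http.ListenAndServe(":8000", observe(backend))
}
```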

When API gateways integrate with a unified observability solution, the result is a single pane of glass to check a distributed application’s overall health and performance. An API gateway’s proactive alerting rules can report on the health of services before they become unresponsive.

Caveats of Using API Gateways

Though API gateways bring many benefits, there are caveats that organizations should bear in mind.

First, an API gateway is a new software component that a DevOps (or APIOps) team needs to deploy, configure, and maintain. This adds an extra moving part to the architecture, bringing additional cost (money, time, and effort) and a learning curve.

Adding an extra component between clients and the microservices means an extra network hop for requests. Depending on the network speed, this can have an adverse effect on the service response time.

As the entry point for all your services, an API gateway could become a single point of failure, taking your entire system offline if it becomes unavailable. For this reason, API gateways are often configured for high availability (HA).

Similarly, an API gateway can be a single point of attack for hackers and other malicious actors. A compromised API gateway can expose your microservices to further attacks.

When selecting an API gateway, it’s important to remember that there are pros and cons to all architectural decisions, and there are likewise advantages and disadvantages to specific tools. There’s no silver-bullet architecture or tool that solves every problem without caveats. An API gateway is no exception.

Final Words

An API gateway can enhance your microservice-based application with capabilities like traffic management, load balancing, threat protection, and observability. However, before adding one to your application architecture, you need to understand the problems your application needs to solve and whether the product you’re choosing offers the features to address them.

Kong Gateway is one of the most popular API gateway implementations in the market today. It was built to run on decentralized architectures, leveraging GitOps practices. With Kong, you’ll be able to configure advanced routing, load balancing, request transformation, logging, monitoring, and health checking rules.

You’ll be able to easily configure authentication with popular methods like JWT and handle authorization through ACLs. Kong also makes it easy to proxy traffic, perform SSL/TLS termination, and support both layer 4 and layer 7 traffic. With its vast library of plugins, Kong is built with extensibility in mind.