It’s Time to Bring Kafka Event Streaming into Your API Platform
Unify the API and Eventing Developer Experience with the Kong Event Gateway and API Platform
Introduction: The EDA and API worlds are converging . . . finally
For the past several years, there have been murmurs of an incoming convergence between API management and event management. This has been more than a vendor-led phenomenon; industry analysts — such as Gartner and Forrester — see the worlds colliding as well, which is why Gartner mentions event streaming and event-driven architectures in the 2024 Gartner Magic Quadrant for API Management report, and why David Mooter from Forrester was writing in 2022 about the need to “expand your API strategy” to cover event-driven and real-time messaging use cases.
While the analyst world has been speaking about this convergence for some time, the larger market has taken a bit longer to catch up.
But no longer.
Almost without fail, every platform team lead and/or enterprise architect I speak with nowadays mentions that they have the “unification” (this is the word they typically use) of APIs and events on their internal roadmaps for 2025–2026.
Why is this? And why are we blogging about it here at Kong? Keep reading to find out!
Why is the convergence finally happening?
Let’s start with why it should happen.
Many of the challenges on the EDA side of the house closely resemble challenges that API management has already solved on the “traditional API” side of the house. (I don’t love the word “traditional” here, as real-time communication, WebSockets, etc. aren’t new at all, but most API folks think of REST as “traditional,” so I’ll keep it.) For example:
- Securely expose event streams for internal and external consumers → the traditional API side of the house has this figured out with the API gateway as the secure exposure and proxy layer
- Package event streams as self-serve data products so that developers (and maybe even paying customers) can easily discover and start consuming real-time events → the traditional API side has figured this out with the API developer portal as self-serve API catalog
So, Gartner and Forrester are right when they say that API management vendors and organizational leaders of API strategy should be thinking about expanding traditional API platform functionality to the event side of the house. The technology and approaches are there. Why force the market — or engineering teams — to build the same solutions over again simply to support a different set of protocols and communication paradigms?
Team topology and attitudes toward the API as blockers
Now, let’s focus on why this convergence really hasn’t happened . . . until now, at least.
As obvious as the “why not use API management for events?” question might seem, adoption didn’t really start picking up until now.
There are two clear culprits for the slow uptake: organizational topology and the prevailing attitude toward the API. Here’s what I mean.
Team topology: API teams and EDA teams as opposed to unified platform teams
The real blocker of the unification of APIs and events comes down to organizational topology and a language problem. Historically, most organizations have drawn pretty clear distinctions between “API teams” and “event teams.” Even if the jobs to be done were similar (i.e., exposing the production and consumption of data from APIs or event streaming resources), the titles were similar but still distinct. For example, I often speak to API enterprise architects vs. EDA enterprise architects and API owners vs. data owners — among others.
This distinction led to major breakdowns in communication when it came to pitching solutions to the problems the EDA teams had, simply because these teams spoke different languages.
Even when API gateways started to “support Kafka,” the Kafka EDA teams had no time to hear about “API gateways” or “API management.” This is “API team” stuff, and those Kafka teams are EDA teams. I often would speak to EDA teams about why it makes sense to use API management for EDA, and I’d often get blank stares, glazed-over eyes, and/or “What’s that?” responses — until I got to the actual functionality of what the solution offered.
When I’d start speaking in terms like:
- “You could use the gateway as an event proxy, where you could expose events over something like HTTP for HTTP clients or over native protocol for Kafka clients”
- “You can use the API developer portal as a place to catalog both API products and — for event streaming — data products”
- “You can actually expose events as APIs so that clients can produce and consume data to and from Kafka topics over HTTP, for example”
…teams would start getting it. And, I’d actually sense a lot of enthusiasm.
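To make that last point concrete, here is a minimal sketch of what producing to a Kafka topic over plain HTTP could look like from the client’s side. The route and payload shape are assumptions for illustration (loosely modeled on common Kafka REST proxy conventions), not Kong’s actual API.

```python
import json

def build_produce_request(gateway_url: str, topic: str, records: list) -> tuple:
    """Build an HTTP produce request for a (hypothetical) gateway-mediated
    Kafka topic. Returns (url, headers, body) so any ordinary HTTP client
    can send it -- no Kafka client library required."""
    url = f"{gateway_url}/topics/{topic}/records"      # hypothetical route
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer <access-token>",      # ordinary API auth, not SASL
    }
    body = json.dumps({"records": [{"value": r} for r in records]})
    return url, headers, body

url, headers, body = build_produce_request(
    "https://events.example.com", "orders", [{"order_id": 42, "status": "created"}]
)
# Any HTTP client (curl, requests, fetch) can now POST this -- the gateway,
# not the application, speaks the Kafka protocol to the broker.
```

The point of the sketch: the producing application never links a Kafka client library or learns broker addresses; it just calls an API.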
Many organizations — especially now — are finally realizing the problems that they have around exposing events for production and consumption. Some even realized the problems a while ago and essentially built an API gateway that could expose something like Kafka as a REST API or a WebSocket API — but they called it by what it did, which was an “event proxy.” Oftentimes, these organizations were starting to tackle the self-serve discovery problem and were building their own custom “event portal,” which would eventually live as yet another source of truth for discovery of and registration for business-critical services. This would live in addition to the API catalog or developer portal that “those API teams” used.
These custom solutions were typically costing organizations much more than a vendor solution would and were keeping engineering teams from focusing on core business logic. So, once the concepts became clear and in their own language, these teams started to open up to the possibility of adopting something like an API gateway and using it as an event gateway.
The “platform era” is changing things
Now, when I speak to organizations, the need for unifying APIs and events is actually quite clear. It doesn’t take as much explanation and translation. Oftentimes, I don’t even have to ask about APIs and events. I just have to ask: “What are you focused on this and next year?” and I get, “We want to unify how we manage APIs and events in our platform” almost all of the time, especially when speaking with larger organizations.
What’s changed?
The major trend I’ve noticed — and it speaks to the topology challenge mentioned earlier — is that I’m speaking with platform engineers and platform teams.
This makes sense if you really think about it.
If the API teams’ responsibility was to “make APIs work” and the EDA teams were there to “make EDA and eventing work,” there was understandably little room for an EDA conversation on the API team, and often even less room for an API conversation on the EDA team. Their remits were inherently siloed along the lines of the kind of business-critical service they owned.
But the platform team’s responsibility is different. The platform team is all about breaking down silos: ensuring that engineers as consumers of services have access to all of the business-critical services they could build on, and that engineers as producers have access to all of the tooling and processes they need to build services for consumers.
The platform team is inherently not service specific. They must figure out how to include as many different services in their developer platform as possible, as the more services brought into the platform, the more innovation potential engineers have. Why limit app devs to just REST APIs? Why not equip them with access to real-time data sources, too? After all, businesses that excel at real-time consistently outperform their slower peers on business performance and CX.
As more and more platform teams take over both API and event streaming/EDA responsibilities, there’s a natural desire to unify access, build processes, and the DevEx for APIs and events as business-critical services (and yes, for AI too, but we aren’t talking about that here).
The need is here, and platform teams want a platform for APIs and events, but the market has been slow to deliver
However, the unfortunate truth was that, up until very recently, offerings in the market were pretty weak. A few very recent entrants into the “events as APIs” space offer point solutions: API gateways that solve for EDA and eventing use cases. But these are typically weaker for traditional API use cases, with the vendors having spent most of their time and effort building these specialized solutions instead of tackling the core fundamentals of the API platform use case, such as:
- Making it easy to “get to a gateway” through self-serve API infra provisioning
- Deep automation and APIOps support for platform teams
- Prioritizing both sides of API and service discovery, where consumers could discover API and event streaming services for registration and consumption and producers could discover APIs and event streaming services for internal inventorying and governance purposes
Enter the Kong Konnect API platform.
Enter the API platform for EDA — powered by the Kong Event Gateway
You might think about Kong Konnect as the API platform for platform builders. You might think about Konnect as the API platform for AI. After all, we often go by both. Because both are true.
Now, we're also the API platform for EDA and real-time data.
Here’s how it works.
Protocol mediation: Expose event broker resources as event API products
There are many different event production and consumption use cases, and some of them require working outside of the native protocol. The first event broker we've introduced Event Gateway support for is Kafka (more are coming, by the way), so we’ll talk in terms of Kafka.
Today, sharing access to real-time data in Kafka is difficult. It often requires manual effort, custom integrations, and/or complex dev work. And it’s very challenging to securely package real-time data as products for external (and potentially monetized) consumption.
The Kong solution: event streams as self-serve API data products
Developers can use the Event Gateway’s protocol mediation to expose Kafka as event APIs that don’t communicate over the Kafka protocol.
The protocol mediation approach opens up the value of real-time data in Kafka to developers and customers that don’t want to — or can’t — worry about setting up their applications as Kafka clients. Kong Event Gateway customers will be able to expose access to this real-time data as REST APIs and server-sent events APIs to ensure that they can meet developers, partners, and customers where they are.
And, beyond just mediation, you can use the gateway to enforce policies via gateway plugins just like you can when using Kong for traditional REST API use cases.
By productizing real-time data as easy-to-consume data API products, you give developers a simple, secure, self-serve approach to discover, consume, and build on top of real-time data.
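For a feel of the consumer side, here is a hedged sketch of a plain HTTP client reading a topic exposed as a server-sent events API. The SSE framing (`data:` lines, blank-line delimited events) is the standard `text/event-stream` wire format; the idea of a per-topic stream endpoint is an assumption for illustration, not Kong’s documented API.

```python
import json

def parse_sse_events(stream_lines):
    """Parse server-sent events (text/event-stream framing) into JSON
    payloads. Yields one decoded object per event; an event ends at the
    first blank line after its 'data:' lines."""
    data_lines = []
    for line in stream_lines:
        if line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
        elif line == "" and data_lines:          # blank line closes the event
            yield json.loads("\n".join(data_lines))
            data_lines = []

# Example frames, as a mediated gateway endpoint might stream them for a topic:
frames = [
    'data: {"order_id": 1, "status": "created"}',
    "",
    'data: {"order_id": 2, "status": "shipped"}',
    "",
]
events = list(parse_sse_events(frames))
```

The consuming application needs nothing but an HTTP connection and a few lines of parsing; consumer groups, offsets, and the Kafka protocol all stay behind the gateway.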
Note: While we are spending a lot of time talking about Kafka (this is where we started), support for Solace event broker is coming soon! We plan for Event Gateway to be truly multi-protocol and multi-broker.
Your customers benefit by having easy access to real-time data that they are willing to pay for and/or by simply having a better, real-time experience with the products and services that your organization builds.
Your business benefits, too: equipping devs with self-service access to APIs makes it easier to ship real-time products faster, as devs can register their applications and start consuming real-time data sources to create real-time customer experiences.
Native Kafka support: Expose Kafka event broker resources as native Kafka services that leverage the native Kafka protocol
While protocol mediation is very useful, especially for data productization use cases and for organizations that are earlier in their Kafka journeys, the native Kafka proxy (where you expose Kafka services over native Kafka protocol, but still via the gateway) is often the chosen method for more Kafka-mature organizations.
As a result, we built a native event proxy into our larger Event Gateway offering.
Technically a separate runtime, the Kong-native event proxy can be used to expose Kafka brokers as native Kafka services via the gateway so that Kafka clients can consume and produce data from and to Kafka brokers. This enables platform and EDA teams to:
- Enforce consistent security standards across both your EDA and API estates via authorization mediation, where you can utilize gold-standard authorization protocols such as OIDC, OAuth2, JWT, etc. for native Kafka exposure use cases
- Move more EDA workloads to the cloud by using the Event Gateway to enforce centralized, local encryption within your network, offloading the burden from your Kafka client and ensuring that unencrypted data never makes it to any sort of cloud-managed event broker environment
- Make more efficient use of Kafka infrastructure through virtualization of clusters and topic filtering for work in lower environments and/or different stages of the development lifecycle
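The native-proxy pattern above can be sketched from the client’s point of view. In this hedged example, a standard Kafka client bootstraps against the gateway and authenticates with OAuth2/OIDC; the key names follow librdkafka/confluent-kafka conventions, while the hostnames, identity provider, and the per-environment naming are illustrative assumptions, not a documented Kong configuration.

```python
def gateway_client_config(env: str) -> dict:
    """Build a native Kafka client configuration that connects *through*
    an event gateway rather than directly to brokers. Key names follow
    librdkafka conventions; all endpoints here are hypothetical."""
    return {
        # Clients bootstrap against the gateway, never the brokers directly.
        "bootstrap.servers": f"kafka-gw.{env}.example.com:9092",
        # The gateway terminates TLS and can mediate OAuth2/OIDC tokens into
        # whatever authentication the upstream cluster actually requires.
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "OAUTHBEARER",
        "sasl.oauthbearer.method": "oidc",
        "sasl.oauthbearer.token.endpoint.url": f"https://idp.example.com/{env}/token",
        "sasl.oauthbearer.client.id": "orders-service",
        "sasl.oauthbearer.client.secret": "<from-secret-store>",
    }

cfg = gateway_client_config("staging")
# This dict can be passed unchanged to a client such as confluent_kafka.Consumer;
# application code stays identical across environments -- only the gateway
# endpoint and token URL differ.
```

Note how the client holds an OAuth client credential, not broker-level SASL/SCRAM secrets: that is the authorization-mediation point from the list above.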
And, of course, all of this will be available as a unified offering within the larger Konnect API platform so that producers can easily spin up Event Gateway infrastructure the same way that they do traditional API infrastructure in Konnect.
Today, the native Kafka proxy can be enabled for any Konnect user, but we’re trying a new early access process and enabling access manually for teams who want it. If you’re interested, please sign up here.
Wrapping up: Events are just the start. The API platform era is here.
The API is at the heart of every innovation story your business cares about, and this includes events. AI value is made real when exposed as secure, governed API services. Microservices drive value when service-to-service communication is driven by consistent, scalable APIs. Your move to hybrid and multi-cloud architectures is really a move to systems that are connected by and built on top of elegant API infrastructure.
And events and real-time data become true data products when packaged as self-serve APIs.
If you’re interested in driving API-led EDA and real-time innovation with the Kong API platform, feel free to book a demo, reach out to your customer success manager, or let us know you’re interested in the early access program for the Kong-native event proxy.
We look forward to seeing what you build.