Why event productization is the next frontier of EDA
We’ll be talking all about how teams can use the Event Gateway for true event data productization at Summit.
But first, what is event data, or event API productization?
Event API productization entails transitioning from a service-centric to a consumer-centric view of streams, where each stream is a secure, well-defined product.
An Event Data Product must have:
- A Defined Lifecycle: Versioning, deprecation, and retirement, just like a REST API.
- Clear Ownership: Championed by a product manager, not an admin.
- High Discoverability: Published alongside other APIs in a central catalog.
- Measurable Usage: Consumption is easily monitored and controlled.
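To make these attributes concrete, the four requirements above could be captured in a product descriptor like the following sketch. This is purely illustrative: the field names are invented for this example and are not a Kong, Konnect, or AsyncAPI schema.

```yaml
# Hypothetical event data product descriptor (illustrative only;
# field names are invented, not actual Kong or AsyncAPI syntax)
product: orders-stream
version: 2.1.0                  # defined lifecycle: versioned like a REST API
deprecated_versions: ["1.x"]    # older versions scheduled for retirement
owner:
  team: payments
  product_manager: jane.doe@example.com   # clear ownership, not an admin
catalog:
  published: true                         # discoverable in a central catalog
  spec: asyncapi/orders-stream.yaml
usage:
  metering: enabled                       # measurable, monitorable consumption
  rate_limit_per_consumer: 1000 msg/s
```

A descriptor like this is what turns a raw topic into something a consumer can find, evaluate, and subscribe to without filing a ticket.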
Without a centralized control point, attempts to manage these data products run into a governance gap:
- Inconsistent Security: Teams implement and maintain their own security logic (ACLs, client certificates), causing configuration drift and vulnerabilities.
- Lack of Control: No easy way to enforce standards, apply quality-of-service (QoS) limits for specific consumers, or track usage for chargebacks.
- Fragmented DevEx: Developers must use disparate tools, documentation, and authentication methods for sync and async workloads, slowing time-to-value for the same application.
The Kong Event Gateway is designed to close this gap by implementing consistent controls at the Gateway layer so that they are easier to standardize across the business.
Kong Event Gateway as your event productization engine
Kong Event Gateway, along with the rest of the Konnect platform, enables your organization to transform what you're already building (or plan to build) with Kafka into self-service, monetizable data products.
It starts with the Gateway, which serves as a central control point for Kafka clusters and topics, exposing them as virtual clusters with consistent security, reliability, and performance policies managed at the Gateway layer rather than on the client side.
These policies include:
- Authentication Mediation (JWT, OAuth2, API Key): decouple authentication from the broker. Kong validates client credentials and mediates secure access to Kafka.
- Access Control Lists (ACL): centrally enforce fine-grained access (e.g., consumer A can read topic-X, but not topic-Y), rather than relying on complex Kafka ACLs in each cluster.
- Data Structure Enforcement: avoid issues like poison pills by checking message structure at the gateway and rejecting non-compliant payloads before they ever reach your broker.
- Schema Validation: validate incoming messages against schemas (Avro, Protobuf, JSON Schema) at the edge, protecting downstream consumers from bad data.
- Message Encryption/Decryption: enhance compliance by centralizing encryption/decryption policy management at the gateway, removing the burden from client applications.
- Header Modification: inject or modify headers for operational needs, like adding correlation IDs for distributed tracing.
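To illustrate what "policies at the Gateway layer rather than on the client side" means in practice, here is a minimal sketch in Python of a gateway-side policy pipeline: authenticate the caller, check a central ACL, and validate message structure before forwarding to the broker. Every name here (the token table, ACL table, and functions) is invented for illustration; this is not the Kong Event Gateway API, just the shape of the idea.

```python
import json

# Hypothetical credential and ACL tables, managed centrally at the gateway
# (not Kong syntax) instead of being scattered across client apps and clusters.
TOKENS = {"token-123": "orders-service"}
ACLS = {"orders-service": {"write": {"orders"}, "read": set()}}

# Simplistic stand-in for a real schema registry entry (Avro/Protobuf/JSON Schema).
ORDER_SCHEMA = {"order_id": str, "amount": float}

def authenticate(token: str):
    """Stand-in for JWT/OAuth2 mediation: map a credential to a principal."""
    return TOKENS.get(token)

def authorize(principal: str, action: str, topic: str) -> bool:
    """Central ACL check, replacing per-cluster Kafka ACLs."""
    return topic in ACLS.get(principal, {}).get(action, set())

def validate_schema(payload: bytes) -> bool:
    """Reject poison pills at the edge, before they reach the broker."""
    try:
        msg = json.loads(payload)
    except ValueError:
        return False
    return all(isinstance(msg.get(k), t) for k, t in ORDER_SCHEMA.items())

def handle_produce(token: str, topic: str, payload: bytes) -> str:
    """Run every gateway policy in order; only clean traffic is forwarded."""
    principal = authenticate(token)
    if principal is None:
        return "401 unauthenticated"
    if not authorize(principal, "write", topic):
        return "403 forbidden"
    if not validate_schema(payload):
        return "422 schema violation"
    return "202 forwarded to broker"
```

The point of the sketch is the layering: because every check runs in one place, a security or schema change is a single gateway update rather than a rollout across every producer and consumer.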
With this kind of central control point in place, you're free to start opening up access to the event data on your brokers and publishing it as self-service AsyncAPI products in your Konnect Developer Portal. And who knows, maybe you’ll even be able to use OpenMeter to monetize that data…
You’ll have to learn more at API Summit!
Learn more at API Summit 2025
Want to see event data productization in action? Join us at API Summit, where Kong will unveil the latest in Kong Event Gateway and demonstrate how it can provide a strong foundation for your broader API and event streaming strategy.
Register for the API Summit today!