Kong Announces Support for Service Mesh Deployments
Industry-Leading Flexibility Enables Kong to Work Seamlessly Within Service Mesh Deployments
SAN FRANCISCO -- August 28, 2018 -- Kong Inc., the leading API platform for modern architectures, today announced that its open source platform will support service mesh deployments. Offering lightweight processes, reduced latency and faster performance, the new service mesh capabilities will enable Kong users to better manage increased East-West network traffic within their microservice-oriented architectures. Users will have the flexibility to use Kong as a standalone service mesh or to integrate it with service mesh players such as Istio and others in the ecosystem. Industry-best performance in a deployment-agnostic platform makes Kong the only API platform well-suited for modern architectures and the solution of choice for developers, DevOps pros and solutions architects.
As organizations increasingly transition towards flexible, scalable and reusable microservices, they create more network traffic as services access one another through APIs over the network. Traditional API gateways were designed to sit at the edge of the data path in between a client and a monolith, where the latency they introduced had little impact. Now, microservices and service mesh architectures are held together by APIs, and processing latency associated with increased traffic between services can become unacceptably high with traditional gateways.
Kong designed its platform to handle modern deployment patterns, enabled by the advent of containers, to address the increased traffic and latency. Where older API management platforms may introduce about 200 milliseconds of processing latency between services in a container ecosystem, Kong adds less than 10 milliseconds of delay. Kong's plugin architecture further reduces latency by letting users strip out unneeded functionality, and it supports seamless integrations with other participants in the ecosystem.
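To illustrate the plugin model described above, functionality in Kong is enabled selectively, per service, through its Admin API rather than bundled monolithically. The following is a minimal sketch assuming a running Kong node with the Admin API on its default port 8001; the service name `orders-service` is hypothetical:

```shell
# Enable the bundled rate-limiting plugin on one service only.
# Assumes a local Kong node exposing the Admin API on localhost:8001;
# "orders-service" is a hypothetical service name used for illustration.
curl -X POST http://localhost:8001/services/orders-service/plugins \
  --data "name=rate-limiting" \
  --data "config.minute=100" \
  --data "config.policy=local"
```

Services that do not need a given plugin never pay its processing cost, which is how the architecture keeps per-request latency low.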
"Kong is the best API management platform to address the challenges of data in-flight in distributed architectures because we designed it to be lightweight, low latency, extensible and deployment-agnostic," said Marco Palladino, co-founder and CTO of Kong. "Older API solutions can't keep up as the topology of network traffic changes from external to largely internal. Top enterprises trust Kong, and DevOps teams have made it the most popular open source API gateway because it's ideally suited to supporting both legacy and modern architectures. No other platform can deliver on this promise."
Kong thought leaders will also discuss using Kong in service mesh network architectures, in serverless and other cloud-native environments, and in on-prem or hybrid deployments at the Kong Summit on September 18-19 at The Pearl in San Francisco. The event will bring together technologists from the Kubernetes and other open source communities, venture capitalists, enterprise software developers and architects, and members of the Kong user community. Register for Kong Summit or visit https://konghq.com/kong-summit/ to learn more.
Kong delivers a next-generation API platform designed for modern architectures, including microservices, containers, cloud and serverless. Offering high flexibility, scalability, speed and performance, the open source platform enables developers and Global 5000 enterprises to reliably secure, connect and orchestrate microservice APIs for modern applications. For more information about Kong, please visit https://konghq.com/ or follow @thekonginc on Twitter.