Kong for AWS: How Cargill Uses Kong

In support of our partnership with AWS, we’re re-examining how some of our major customers today are using Kong with AWS. To this end, let’s look at our popular case study of Cargill’s use of Kong, through the lens of how AWS is a critical part of this story.
For over 150 years, Cargill has been on a mission to nourish the world in a safe, responsible
and sustainable way. Headquartered in Minneapolis, Minnesota, Cargill combines extensive
supply chain experience with new technologies and insights to serve as a trusted partner for
food, agriculture, financial and industrial customers across more than 125 countries. One of the
largest private companies operating today, Cargill employs over 150,000 people worldwide and
handles logistics for over a third of the world’s food supply.
As an industry leader, Cargill has always recognized that enabling rapid innovation is critical to
maintaining a competitive advantage. However, Cargill’s legacy IT systems were slowing its
ability to create new digital products and services to address the evolving needs of its
customers and partners. In response to these new digital imperatives, Cargill began the process
of transforming its internal suite of APIs to be more dynamic, thereby enabling consumption
across the entire company.
With the goal of bringing more digital experiences to stakeholders, the Cargill API platform team
came up with a plan to increase agility and transition from siloed legacy IT architecture to a
modern microservices architecture. Kong is lightweight and fast enough to run at the edge, with
Kubernetes, or in a mesh, which fits Cargill’s vision perfectly. Together with Kong, Cargill
achieves the low latency needed to deploy high-performance, cloud-native microservices and
the flexibility to seamlessly integrate them across all of its systems.
In addition to superior performance and flexibility, Cargill used Kong to simplify management
and monitoring across its highly complex architecture. Kong’s out-of-the-box management tools
and open source ecosystem give Cargill robust capabilities for managing APIs across their
entire lifecycle, allowing it to bring more services to market more quickly and to manage them
better once they’re in production.
The second phase of the Cargill team’s vision was to decentralize the platform to facilitate
dynamic scalability. A major component of this was utilizing cloud native design patterns,
and Kong’s native support for Kubernetes was a critical factor in enabling the Cargill team to
easily scale up and down with demand.
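Scaling with demand in Kubernetes is commonly done with a HorizontalPodAutoscaler. The sketch below is illustrative only: the deployment name kong-proxy, replica bounds and CPU threshold are assumptions, not Cargill’s actual configuration.

```yaml
# Illustrative only: autoscale a hypothetical "kong-proxy" Deployment
# between 2 and 10 replicas based on average CPU utilization.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: kong-proxy
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: kong-proxy
  minReplicas: 2
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```

With a policy like this, Kubernetes adds proxy replicas as traffic rises and removes them as it falls, so the gateway tier tracks demand without manual intervention.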
Cargill uses Amazon EKS (Kubernetes) for all cloud deployments. It deploys Kong through its
pipeline using the Kubernetes ClusterIP Service type, but allows ingress from the outside to
Kong, which then brokers incoming requests to the upstream APIs. Upstreams are defined within
Kong as Kubernetes Services, so Kong only needs to be aware of the single Service name of each
API. Everything in the stack is managed by AWS: EKS, RDS, Route 53. API teams may use
whichever AWS capabilities they wish, including DynamoDB, MySQL and SQS, and the bulk of
Kong activity is managed via Kubernetes.
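This routing pattern can be sketched as a plain ClusterIP Service fronting an API’s pods; the names here (orders-api, its ports and labels) are hypothetical, chosen only to illustrate the shape of the configuration.

```yaml
# Illustrative only: a ClusterIP Service (the default, cluster-internal
# type) exposing a hypothetical "orders-api" workload. Kong, running in
# the same cluster, proxies external requests to the Service's DNS name;
# Kubernetes then load-balances across whatever pods match the selector.
apiVersion: v1
kind: Service
metadata:
  name: orders-api
spec:
  type: ClusterIP
  selector:
    app: orders-api
  ports:
    - port: 80
      targetPort: 8080
```

Because Kong targets the stable Service name rather than individual pods, API teams can scale or replace pods freely without any change to the gateway configuration.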
Results of using Kong with AWS at Cargill have included up to 65x faster deployment times with
automated validations, the launch of more than 450 new digital services in six months, and
dynamic infrastructure that auto-scales up and down with demand.