Rush Builds a Scalable Infrastructure for the Future with Kong Enterprise

Results

Standardization

Standardized specifications across over 600 API endpoints

Develop Faster

Accelerated time-to-market with faster, higher-quality releases

Deploy Anywhere

Simplified automation and pipeline checks

Solutions:

  1. Utilized Kong as code in the infrastructure, including redirecting all services to the Kong Gateway endpoint
  2. Migrated microservices to Kong and updated CI/CD process for automated endpoint publishing
  3. Moved authentication and authorization to Kong, rather than being part of the application layer

Challenges:

  1. Inability to scale a monolithic infrastructure
  2. Security concerns in accessing unlisted endpoints
  3. Difficulty maintaining codebase, which increased release time due to interdependencies
  4. No centralized place for authorization and authentication for all users across the platform

Delivering an outstanding post-purchase customer experience

Rush is post-purchase shipment tracking software used by more than 1,500 Shopify stores. With Rush’s customizable tracking page, order delivery notifications, shipping reports, and analytics, the company has helped customers increase revenue and retention by providing a seamless shopping experience.

Like many startups, Rush made developing and releasing its product the top priority, but the team found that making fast pivots and growing a monolith came with its own challenges.

“At a certain point, adding new features, debugging problems, and maintaining comes at a price,” said Stanislav Stankov, Product Lead. “Especially if you, like us, deal with big data, you will soon start hitting the problem of scalability: how can the system be scaled in a reliable and cost-efficient way?”

Migrating from monolith to microservices

Stankov knew they needed to move to an API gateway to support this monolith-to-microservices migration, but they had specific requirements for the solution, including:

  • A large selection of out-of-the-box plugins (and the ability to extend with custom plugins)
  • A place to manage all APIs and documents (like Dev Portal)
  • The flexibility to deploy on any cloud provider

That requirement list helped them narrow their solutions down to AWS API Gateway and Kong Enterprise. The team at Rush ultimately chose Kong because it would allow them to use a multi-cloud approach.

“While we currently use AWS, we wanted the freedom to change that in the future,” Stankov said. “We understand that as we grow and scale, our cloud provider needs may shift, and Kong gives us the ability to do that without disrupting our API ecosystem.”

The Kong implementation process

After selecting Kong as their API gateway provider, the team at Rush dove into the preparation and implementation process.

Their first step was to create a full roadmap, working with the Kong field engineering team to develop the plan and understand all of the requirements and specifications needed for a successful implementation. Once they had catalogued all of the required endpoints, they applied the plan to their existing infrastructure.

The team at Rush was already running Kubernetes with production and staging clusters set up, so the actual implementation was fairly simple. They created a cluster specifically for Kong and replicated the production environment, then added Kong API Gateway as code in the infrastructure, pointing it to a single microservice.
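
The case study doesn’t include Rush’s actual configuration, but this "gateway as code, pointing to a single microservice" step can be illustrated with a minimal sketch against the Kong Admin API. The admin address, service name, and upstream URL below are hypothetical placeholders, not Rush’s setup.

```python
import requests

# Hypothetical values: the Admin API address, service name, and upstream URL
# are placeholders for illustration, not Rush's actual configuration.
KONG_ADMIN = "http://localhost:8001"

# Register the first microservice behind the gateway.
requests.post(
    f"{KONG_ADMIN}/services",
    json={"name": "orders-service", "url": "http://orders.internal:8080"},
).raise_for_status()

# Expose it through the single public gateway endpoint on one path.
requests.post(
    f"{KONG_ADMIN}/services/orders-service/routes",
    json={"name": "orders-route", "paths": ["/orders"]},
).raise_for_status()
```

In practice a definition like this would live in version control alongside the rest of the infrastructure code, so the gateway configuration can be reviewed and recreated the same way as any other deployment artifact.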

Then they began the microservices migration, which allowed them to reduce unwanted exposure. The team at Rush also updated their CI/CD process for automated endpoint publishing.
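
The article doesn’t describe the pipeline itself; as a rough sketch, automated endpoint publishing could be a CI step that reads an endpoint manifest kept in version control and upserts the corresponding services and routes through the Admin API, where PUT acts as create-or-update. All names and URLs here are assumptions.

```python
import requests

# Hypothetical manifest: in a real pipeline this would live in version control;
# the admin address, service names, and URLs are placeholders.
KONG_ADMIN = "http://localhost:8001"
ENDPOINTS = [
    {"service": "orders-service",   "url": "http://orders.internal:8080",   "path": "/orders"},
    {"service": "tracking-service", "url": "http://tracking.internal:8080", "path": "/tracking"},
]

def publish(entry: dict) -> None:
    """Create or update one service and its route (PUT is an upsert in the Admin API)."""
    svc = entry["service"]
    requests.put(
        f"{KONG_ADMIN}/services/{svc}",
        json={"url": entry["url"]},
    ).raise_for_status()
    requests.put(
        f"{KONG_ADMIN}/services/{svc}/routes/{svc}-route",
        json={"paths": [entry["path"]]},
    ).raise_for_status()

if __name__ == "__main__":
    for entry in ENDPOINTS:
        publish(entry)
        print(f"published {entry['path']} -> {entry['service']}")
```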

Finally, they migrated authentication and authorization to Kong, so that neither remained part of the application layer.
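
The specifics of Rush’s setup aren’t covered in the article. As one hedged example of gateway-level authentication, Kong’s key-auth plugin can be enabled on a service and a consumer issued a credential via the Admin API; the admin address, service name, consumer name, and key below are placeholders.

```python
import requests

# Hypothetical values: admin address, service name, consumer name, and API key
# are placeholders for illustration only.
KONG_ADMIN = "http://localhost:8001"
SERVICE = "orders-service"

# Require an API key at the gateway, so unauthenticated calls never
# reach the application layer.
requests.post(
    f"{KONG_ADMIN}/services/{SERVICE}/plugins",
    json={"name": "key-auth"},
).raise_for_status()

# Register a consumer and issue it a credential.
requests.post(
    f"{KONG_ADMIN}/consumers",
    json={"username": "storefront-app"},
).raise_for_status()
requests.post(
    f"{KONG_ADMIN}/consumers/storefront-app/key-auth",
    json={"key": "example-secret-key"},
).raise_for_status()

# Clients then authenticate at the gateway proxy, for example:
#   curl http://<proxy-host>:8000/orders -H "apikey: example-secret-key"
```

Authorization can be layered on in a similar way, for instance with Kong’s ACL plugin restricting which consumer groups may call a given service.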

“To implement an API gateway, you need to be at a certain point of organizational maturity. We needed to work on creating transparency and standardizing through documentation all of the different things our development team does,” Stankov said. “While going through this process, we actually found our development and DevOps teams increasing in their maturity. Both teams started to communicate and understand each other better. We standardized a lot, but we also gained more knowledge on how to scale and split services more efficiently.”

An improved customer and developer experience through an API-centric approach

Rush is still in the early stages of their API journey, but Stankov and the team have already seen significant benefits in their scalable API-based infrastructure strategy.

While hard to quantify, one of the biggest benefits Stankov has seen so far is the team’s increased knowledge and more mature mindset. This sets them up for even greater success going forward.

“One of the most substantial changes throughout this process has been around how the team thinks,” Stankov said. “Not only about code based implementation, but also other questions: How will this feature scale? How will we monitor it? How can it be written for ease of maintenance?”

The team at Rush has also seen significant improvement in the standardization of their API specifications. Whereas they were previously using a wide range of URL names and patterns, HTTP status codes, and response patterns, they now have a set of specifications to work from. The result has been that they’ve created more intuitive APIs for consumption, leading to faster adoption.

“Now, all of our 600+ API endpoints seem like they were written by a single person, making them much easier to use,” Stankov said. This should also make future adoption of Rush APIs by third-party developers easier once they are ready.

Securing sensitive customer data

Another benefit of their new API gateway-led approach has been around security. Rush deals with a large amount of customer data — including PII, order value, and buying habits — and the team needed a way to approach storing this information with care.

Rush has also implemented its authentication and authorization layer in Kong, which means that calls enter the internal system only if they are approved, and can access only the information they are authorized for.

“Kong helps us create a single endpoint for access to our internal microservices — and only for the endpoints we actually want to expose,” Stankov said. “This has greatly reduced the calls to the system from automated bots that are scanning our services daily for holes targeting the framework stack we use. It also means that we are only exposing the information we want to expose.”

Keeping services operational with zero-downtime deployment

Rush first decided to invest in an API gateway because Stankov and the team knew that if they stayed on a monolith, they wouldn’t be able to scale their system. That could have impacted customers, who would start to notice unreliable services.

“We serve over 1,000 merchants on a daily basis, and because we provide services for their customers, we serve over 2,000,000 unique customers on a monthly basis. Our total API outgoing traffic per month is over 400 GB. Keeping services operational with zero-downtime deployment is not only mandatory, but failing to do so led to a negative business impact in customer churn,” Stankov said. “With Kong, we are able to deploy more frequently and add client value faster because there are fewer dependencies in our way.”

A foundation for future growth and scalability

“The benefit we have gained in moving to Kong Enterprise is tremendous,” Stankov said. “We knew we needed to invest in architecture and scalability, and we chose Kong at the right time. In doing so, we laid the bedrock for the future growth of the business, as well as the scalability of the dev team.”

– Stanislav Stankov, Product Lead, Rush
