June 7, 2023
5 min read

Kong Konnect: A Developer’s Guide

Taylor Page
Viktor Gamov

In this guide, Viktor Gamov (Principal Developer Advocate at Kong) dives into the power of Kong Konnect, the SaaS-managed control plane for Kong Gateway designed for seamless API management. We'll explore how to use Kong Konnect to configure external services, enable application registration, configure the Dev Portal, use credentials to manage access to services, and replicate configurations across environments with just a few clicks. Let's get started!

4 Ways to Deploy Kong Gateway

There are many ways to deploy Kong Gateway, but the four main ones are DB-less mode, traditional mode, hybrid mode, and with Kong Konnect. We detail some of those differences in this blog post, but here we'll focus on Kong Konnect.

With Kong Konnect, since it's a SaaS offering, you don't need to maintain a database or manage the control plane. It also introduces a feature called Runtime Groups, which lets you segregate and scale different environments (staging, QA, prod, etc.) as well as different regions (for example, US-based and Europe-based to comply with GDPR). The fully managed control plane is responsible for shipping configuration out to your data planes. Because it is multi-tenant and fully managed, it allows for heterogeneous deployments.
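To make the split concrete: with Konnect, you only run data planes, each started in data-plane mode and pointed at the managed control plane. The following is a minimal sketch, assuming Docker and the certificates Konnect generates for you; the cluster endpoints shown are placeholders, and the exact environment variables and image tag may vary by Kong Gateway version.

```
docker run -d --name kong-dp \
  -e KONG_ROLE=data_plane \
  -e KONG_DATABASE=off \
  -e KONG_CLUSTER_MTLS=pki \
  -e KONG_CLUSTER_CONTROL_PLANE=<your-id>.us.cp0.konghq.com:443 \
  -e KONG_CLUSTER_TELEMETRY_ENDPOINT=<your-id>.us.tp0.konghq.com:443 \
  -e KONG_CLUSTER_CERT=/certs/tls.crt \
  -e KONG_CLUSTER_CERT_KEY=/certs/tls.key \
  -v "$(pwd)/certs:/certs" \
  -p 8000:8000 \
  kong/kong-gateway:3.3
```

Once the instance connects, it appears in Runtime Manager and receives its configuration from the control plane; no local database is involved.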

Using the ChatGPT API to Demo Kong Konnect

Most people are familiar with ChatGPT and the OpenAI API at this point, but at its core, ChatGPT is a text-completion tool built on a large language model. There are a number of models (Ada, Babbage, Curie, and DaVinci) with different capabilities and associated costs: Ada has the lowest cost, while DaVinci has the highest cost and the greatest capabilities. As an example, if you ask the Ada model "what is Kong?", the response is "Kong is a computer game played by zones.", whereas the DaVinci model responds "Kong is an open source API management platform that provides reliable way to consume APIs." For this demo, we'll use the DaVinci and Curie models of the OpenAI API. We'll want to expose them and provide a way for developers to consume them, and, most importantly, we want to restrict access to them. To do this with Kong Konnect, we'll start by creating a service.
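You can see the difference between models by calling the completions endpoint directly. A hedged sketch, assuming the OpenAI completions API and the model identifiers in use at the time (text-davinci-003, text-curie-001), with your API key in $OPENAI_API_KEY:

```
curl https://api.openai.com/v1/completions \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "text-davinci-003", "prompt": "What is Kong?", "max_tokens": 64}'
```

Swapping "text-davinci-003" for "text-curie-001" is the only change needed to hit the cheaper model, which is exactly the knob we'll expose through the gateway.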

Creating a Service in Kong Konnect

The first step is to create a Kong Konnect account; you can sign up here. Kong Konnect is available in a free version, which gives you access to everything you need to start managing your services. When we log into Konnect for this demo, we'll make sure our region is set to North America and go into Runtime Manager to see our runtime instances. These are our data planes, the running instances of Kong Gateway. In this demo, we have an instance of Kong Gateway running, but it's not yet configured.

Next, we'll go into Service Hub to create a new service with the display name OpenAI, and then we can push configuration from the Konnect control plane to our data plane via the default runtime group that is set up as soon as we register for Konnect. In this demo, we're going to use version one of the OpenAI service and create an implementation: first the service, then the route. With OpenAI, we can use Kong Gateway functionality to add a model parameter so the route knows which language model to use (DaVinci or Curie). Since we want this to be simple to use, we'll use a plugin.
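In the demo we click through the Konnect UI, but the resulting gateway configuration is equivalent to a declarative (decK-style) file. A sketch of the service with its two routes, where the upstream URL and route paths are assumptions for illustration:

```yaml
_format_version: "3.0"
services:
- name: openai
  url: https://api.openai.com/v1/completions
  routes:
  - name: davinci
    paths:
    - /davinci
  - name: curie
    paths:
    - /curie
```

Both routes point at the same upstream; differentiating which model each one uses is the plugin's job, covered next.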

Using Plugins with a Kong Konnect Service

We have many plugins available to extend Kong Konnect's capabilities, from analytics and monitoring to authentication to traffic control. For our demo, we'll use the Request Transformer plugin, which modifies the request before it hits the upstream server. For example, on the DaVinci route, we'll have the plugin add the language model name to the request body. We can also create a route for the Curie language model and attach the same Request Transformer plugin, so we'll have two routes inside this gateway service.
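In declarative form, attaching Request Transformer to each route might look like the sketch below; the model identifiers are assumptions based on the OpenAI model names of the time, and the `add.body` entries use the plugin's "key:value" convention:

```yaml
routes:
- name: davinci
  plugins:
  - name: request-transformer
    config:
      add:
        body:
        - model:text-davinci-003
- name: curie
  plugins:
  - name: request-transformer
    config:
      add:
        body:
        - model:text-curie-001
```

With this in place, a developer calling /davinci or /curie never has to know the underlying model identifiers; the gateway injects them.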

Plugins can also be configured globally. If we want to apply access limits to this service, we can use the Rate Limiting plugin to control how many HTTP requests a developer can make in a given time period. For this OpenAI service, we'll allow 10 requests per 60 seconds, applied to both the DaVinci and Curie routes. Rate Limiting is an easy way to control traffic, so it's no surprise that it's one of our most popular plugins.
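Ten requests per 60 seconds maps directly onto the plugin's `minute` counter. Attached at the service level so it covers both routes, the declarative sketch is (the `policy: local` counter store is an assumption for a single-node demo):

```yaml
services:
- name: openai
  plugins:
  - name: rate-limiting
    config:
      minute: 10
      policy: local
```

Once the limit is exceeded, the gateway responds with HTTP 429 until the window resets.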

Now that we have the OpenAI service configured with our two routes and our plugins, we want to make it available to find and use, so we're going to publish it on our Dev Portal.

Publishing Services to the Dev Portal

To publish a service on the Dev Portal, all we have to do is go to the OpenAI service inside Service Hub on Kong Konnect and publish it to the Dev Portal. You can customize the Dev Portal (including the URL and appearance) through the Konnect UI. In this demo, we can see that the OpenAI service is available for use, and by enabling Application Registration, developers can register their applications against it and consume the API.

We also want to limit access to this service, so for the DaVinci route, we'll restrict access based on credentials using App Registration, which automatically creates consumer credentials in the Dev Portal. We can also create Consumer Groups and attach different configurations to each group. For the Rate Limiting configuration, as an example, we can set different limits for our Gold Consumer Group. Consumer Groups let you get very specific about the access and usage allowances for different groups, like Production and Testing.
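As a rough sketch of the per-group override idea: with Kong Enterprise's rate-limiting-advanced plugin, a consumer group can carry its own limits. The group name and numbers below are assumptions for illustration, and the exact declarative syntax for consumer groups varies by Kong Gateway version, so treat this as a shape rather than copy-paste configuration:

```yaml
consumer_groups:
- name: gold
  plugins:
  - name: rate-limiting-advanced
    config:
      limit:
      - 100
      window_size:
      - 60
```

Consumers placed in the gold group would then get 100 requests per 60-second window instead of the global limit of 10.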

Replicating a Service in a New Environment

Kong Konnect is currently available in both the US (North America) and EU (Europe) regions. This makes Kong Konnect a great solution for a distributed service architecture, since you can manage your various environments from one control plane, and you can restrict roles and access per region. Replicating our current US configuration in the EU region takes just a few clicks: we already have the configuration in our US runtime group, so we just need to do a decK dump and a decK sync. This exports the configuration and recreates it in the EU environment. The global rate limiting we applied is still in place, and the new region is all set up and ready to go.
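With the decK CLI, the replication is essentially two commands: export the US runtime group's configuration, then sync it into the EU one. A sketch, assuming a Konnect personal access token in $KONNECT_TOKEN and a runtime group named default in each region; flag names may differ slightly between decK versions:

```
deck dump --konnect-token "$KONNECT_TOKEN" \
  --konnect-addr https://us.api.konghq.com \
  --konnect-runtime-group-name default \
  -o kong.yaml

deck sync --konnect-token "$KONNECT_TOKEN" \
  --konnect-addr https://eu.api.konghq.com \
  --konnect-runtime-group-name default \
  -s kong.yaml
```

Because the exported file contains the services, routes, and plugins we configured, the global rate limit travels along with everything else.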

Thanks for Joining!

We hope you enjoyed this developer's guide to Kong Konnect using the OpenAI API. If you haven't already, sign up for Konnect and try it out yourself! And if you have any questions, check out the Kong Konnect docs.