Engineering
November 3, 2022
3 min read

Transforming Kong Logs for Ingestion into Your Observability Stack

Damon Sorrentino

As a Solutions Engineer here at Kong, one question that frequently comes across my desk is “how can I transform a Kong logging plugin message into a format that my insert-observability-stack-here understands (ELK, Loki, Splunk, etc.)?” In this blog, I’m going to show you how to convert a Kong logging payload to the Elastic Common Schema (ECS).

To accomplish this, we’re going to run Kong Gateway in Kubernetes and use two Kong plugins:

  1. Serverless Pre-function
  2. File Log

If you don’t already have an instance of Kong running in a Kubernetes cluster, connect to your cluster and run the following commands to get one up and running in seconds.
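For example, using the official Kong Helm chart (the release name and namespace below are just illustrative choices):

```sh
# Add the Kong Helm repository and refresh the chart metadata
helm repo add kong https://charts.konghq.com
helm repo update

# Install Kong Gateway (the chart also enables the Kong Ingress Controller by default)
helm install kong kong/kong --namespace kong --create-namespace
```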

See how to install Kong in Kubernetes for more information. Once you have an available instance of the Kong Gateway, continue.

First, create an empty Kubernetes manifest file called elastic-common-schema.yaml.

Next, let’s define our KongPlugin resources. The first plugin we will create is the serverless pre-function. From the Kong plugin docs, a serverless pre-function plugin:

Runs before other plugins run during each phase. The pre-function plugin can be applied to individual services, routes, or globally.

Since we’re logging, we’re concerned with the log phase or “context”. For more information on all available plugin contexts, read this doc.

Paste the below yaml in your manifest.
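Something like the following works (the metadata.name is an arbitrary choice; the single line of Lua is the one discussed below):

```yaml
apiVersion: configuration.konghq.com/v1
kind: KongPlugin
metadata:
  name: pre-function-shared-context
plugin: pre-function
config:
  # Lua chunks to run in the log phase, before other plugins' log handlers
  log:
    - kong.ctx.shared.mystuff = kong.log.serialize()
```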

The above resource definition creates a KongPlugin that executes before the log phase of each plugin in scope. The kong.ctx.shared.mystuff = kong.log.serialize() assignment is a single line of Lua code that stores the logging payload in a shared context. From the Kong docs, a shared context is:

A [Lua] table that has the same lifetime as the current request. This table is shared between all plugins. It can be used to share data between several plugins in a given request.

For more info on shared contexts, see this doc.
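The second plugin from our list, File Log, writes each log entry to a file (here /dev/stdout, so a log shipper can pick it up). Below is a sketch of that resource, appended to the same manifest; the ECS field mappings are illustrative and read from the mystuff key populated by the pre-function plugin above:

```yaml
---
apiVersion: configuration.konghq.com/v1
kind: KongPlugin
metadata:
  name: file-log-ecs
plugin: file-log
config:
  path: /dev/stdout
  custom_fields_by_lua:
    # Each value is a chunk of Lua; its return value becomes the log field
    ecs: "return { version = '1.12.2' }"
    client: "return { ip = kong.ctx.shared.mystuff.client_ip }"
    url: >-
      return { full = kong.ctx.shared.mystuff.request.url,
               path = kong.ctx.shared.mystuff.request.uri }
    http: >-
      return {
        request  = { method = kong.ctx.shared.mystuff.request.method,
                     bytes  = kong.ctx.shared.mystuff.request.size },
        response = { status_code = kong.ctx.shared.mystuff.response.status,
                     bytes       = kong.ctx.shared.mystuff.response.size }
      }
    event: "return { duration = kong.ctx.shared.mystuff.latencies.request * 1000000 }"
```

To put the plugins to work, apply the manifest with kubectl apply -f elastic-common-schema.yaml and reference them by name in a konghq.com/plugins annotation on an Ingress or Service (or use a KongClusterPlugin to apply them globally). You can also use custom_fields_by_lua to drop default fields you no longer need; a chunk that returns nil removes its field.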

The key to the transformation is the File Log plugin’s custom_fields_by_lua configuration. From the Kong docs, custom_fields_by_lua is:

A list of key-value pairs, where the key is the name of a log field and the value is a chunk of Lua code, whose return value sets or replaces the log field value.
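For context, without any custom fields the File Log plugin writes the default output of kong.log.serialize(). Abbreviated, and with made-up values, an entry looks roughly like this:

```json
{
  "client_ip": "10.0.0.1",
  "started_at": 1667476800000,
  "upstream_uri": "/",
  "request": {
    "method": "GET",
    "uri": "/example",
    "url": "http://example.com:8000/example",
    "size": 86
  },
  "response": {
    "status": 200,
    "size": 934
  },
  "latencies": {
    "kong": 1,
    "proxy": 7,
    "request": 8
  },
  "service": {
    "name": "example-service",
    "host": "example.internal",
    "port": 80
  },
  "route": {
    "name": "example-route",
    "paths": ["/example"]
  }
}
```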

With the mappings sketched above (and the remaining default fields dropped), that payload gets transformed into an ECS-shaped entry along these lines:
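```json
{
  "ecs": { "version": "1.12.2" },
  "client": { "ip": "10.0.0.1" },
  "url": {
    "full": "http://example.com:8000/example",
    "path": "/example"
  },
  "http": {
    "request": { "method": "GET", "bytes": 86 },
    "response": { "status_code": 200, "bytes": 934 }
  },
  "event": { "duration": 8000000 }
}
```

(The exact shape depends entirely on the mappings you define in custom_fields_by_lua; the field names and values above are illustrative, not a canonical ECS mapping.)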

Congratulations, you have transformed a Kong log payload into the Elastic Common Schema format, ready for ingestion! This pattern can be used to transform Kong logging messages into any format for ingestion by any observability stack.
