Engineering
July 12, 2018
2 min read

Kong CE 0.14 Feature Review – Nginx Injected Directives

Mike Bilodeau
Topics: Automation


As part of our series helping you get up to speed on the new features released in Kong CE 0.14, we want to dive into one of our most exciting and long-awaited features: Dynamic Injection for Nginx Directives. This feature lets Kong users exercise greater control over their Nginx configuration and eliminates the tedious work of maintaining custom configurations across Kong releases.

As you are likely aware, Kong ships with an Nginx template that renders when Kong starts. This makes it easy to get started with Kong, but it also creates challenges for users who want to modify their Nginx configuration. Before this release, there was no mechanism to add or update an Nginx directive within the nginx.conf used to run Kong. Instead, users had to create a custom Nginx template, which they then had to update with every Kong upgrade. This created time-consuming maintenance work and the potential for unforeseen issues.

Fortunately, dynamic injection of Nginx directives eliminates these challenges. In CE 0.14, users can specify any Nginx directive directly in their Kong config file, removing the need to maintain a custom Nginx template. To accomplish this, users declare Nginx directives as config variables whose prefix determines the Nginx block in which the directive is placed.
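The prefix encodes the target block. As a rough sketch of the convention (prefix names per the CE 0.14 configuration reference; the directive and value below are illustrative only):

```
# kong.conf — the part after "nginx_<block>_" is the directive name:
#
#   nginx_http_<directive>   → injected into Kong's `http` block
#   nginx_proxy_<directive>  → injected into the proxy `server` block
#   nginx_admin_<directive>  → injected into the Admin API `server` block

# e.g., raise the keepalive timeout on the proxy listener:
nginx_proxy_keepalive_timeout=120s
```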

For example, adding the following line to your `kong.conf`:

nginx_proxy_large_client_header_buffers=8 24k

will add the following directive to the proxy `server` block of Kong's Nginx configuration file:

large_client_header_buffers 8 24k;

Like all properties in `kong.conf`, this can also be specified via environment variables:

export KONG_NGINX_PROXY_LARGE_CLIENT_HEADER_BUFFERS="8 24k"

It is also possible to include entire `server` blocks using the Nginx `include` directive. Docker users can simply mount a volume on Kong's container and use `include` to pull in custom Nginx server blocks or directives.
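A minimal sketch of that approach, assuming an `nginx_http_include` property (following the same prefix convention) and hypothetical file paths:

```
# kong.conf — injects `include /usr/local/kong/custom-server.conf;`
# into the `http` block of the rendered Nginx configuration:
nginx_http_include=/usr/local/kong/custom-server.conf
```

```
# /usr/local/kong/custom-server.conf — a custom server block
# served by the same Nginx instance that runs Kong:
server {
    listen 8888;
    location /status {
        return 200 'ok';
    }
}
```

In Docker, the same file could be mounted into the container and referenced through the environment variable form, e.g. `KONG_NGINX_HTTP_INCLUDE=/usr/local/kong/custom-server.conf`.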

Kong's method of injecting Nginx directives provides dramatically improved flexibility and ease of use for users who need granular control over Nginx.

Benefits include:

  • Changes to the Nginx configuration are automatically reflected in Kong
  • Custom Nginx modules work out of the box
  • No tedious maintenance work when upgrading Kong versions
  • No changes to existing code required
  • Confidence that new Nginx directives will not break Kong

At Kong, we're committed to open source and to empowering users to make their own decisions. We know that many of our users are Nginx ninjas who want to exercise more control over Nginx through Kong, and we are happy to make your lives easier. Whether you have custom Nginx modules, legacy Nginx configurations, or simply want to experiment with changes to your Nginx config, our 0.14 release lets you modify Nginx directives easily and without risk to production.

And of course, thank you to our open source contributors, core maintainers (@hisham, @bungle, @kikito), and other Kong Inc. employees who all contributed greatly to this release!

Happy Konging!
