Kong Konnect: Introducing HashiCorp Vault Support for LLMs
If you're a builder, you're likely sending your LLM credentials on every request from your agents and applications. But if you operate in an enterprise environment, you'll want to store those credentials in a secure third party such as HashiCorp Vault or an identity provider (IdP) and have the infrastructure inject them for you dynamically. By doing so, you'll have:
✅ No leaks of credentials into agents
✅ Strong governance of LLM credentials in secure storage
✅ Ability to switch credentials on the fly without having to redeploy your agents
In this case, the infrastructure that secures the communication between the agents and the LLMs is an AI gateway, and the capability that allows you to manage and configure this behavior is our Kong Konnect AI Manager.
Today, we're introducing the ability to configure dynamic authentication using a third-party secrets manager like HashiCorp Vault in our AI Manager product.
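To make this concrete, here's a rough sketch of what the underlying gateway configuration can look like. In Konnect, AI Manager gives you these controls through the control plane; the sketch below uses the self-managed Kong Gateway Admin API purely as an illustration. The Admin API address, route name, Vault host, mount, and secret paths are all illustrative assumptions, not real values.

```python
# Rough sketch against the Kong Gateway Admin API (assumed at http://localhost:8001)
# with an existing route named "llm-chat". Hostnames, secret paths, and the route
# name are illustrative assumptions.
import requests

ADMIN_API = "http://localhost:8001"

# 1. Register HashiCorp Vault as a secrets backend. The entity's prefix
#    ("my-hcv" here) is what {vault://...} references resolve against.
requests.put(
    f"{ADMIN_API}/vaults/my-hcv",
    json={
        "name": "hcv",
        "description": "HashiCorp Vault holding LLM provider credentials",
        "config": {
            "protocol": "https",
            "host": "vault.example.com",      # your Vault server (assumed)
            "port": 8200,
            "mount": "secret",                # KV secrets engine mount
            "kv": "v2",
            "token": "<vault-access-token>",  # or AppRole/Kubernetes auth
        },
    },
    timeout=10,
).raise_for_status()

# 2. Attach the ai-proxy plugin to the route. The auth header value is a Vault
#    reference, so the provider key never appears in the gateway config or in
#    the agents; Kong resolves it at runtime. The secret at llm/openai
#    (field "auth-header") is assumed to hold the full value, e.g. "Bearer sk-...".
requests.post(
    f"{ADMIN_API}/routes/llm-chat/plugins",
    json={
        "name": "ai-proxy",
        "config": {
            "route_type": "llm/v1/chat",
            "auth": {
                "header_name": "Authorization",
                "header_value": "{vault://my-hcv/llm/openai/auth-header}",
            },
            "model": {"provider": "openai", "name": "gpt-4o"},
        },
    },
    timeout=10,
).raise_for_status()
```

Because the credential lives only in Vault and is referenced by name, rotating or swapping it is a change in Vault (or in the reference), not a redeploy of your agents.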
Lean agents, powerful AI intelligence
As we ramp up the creation of AI-powered agents, it's important to keep them lean and focused while delegating as much as we can to our infrastructure.
Kong’s AI Gateway does just that. It takes on 50+ capabilities that we would otherwise have built ourselves, such as observability, security, semantic acceleration, and RAG pipelines, and lets us focus on our agentic business logic and intelligence by consuming those capabilities out of the box from the AI Gateway itself.
One of those capabilities is authentication to the models themselves:
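With the gateway owning credential injection, the agent code itself stays small. As a rough sketch, assuming the gateway exposes an OpenAI-compatible chat route (the URL and model name below are placeholders), an agent call might look like this:

```python
# Rough sketch of a lean agent calling the LLM through the AI Gateway.
# The base URL is an assumed gateway route that speaks the OpenAI-style chat
# API; no real provider key ships with the agent, since the gateway injects
# the credential it resolves from HashiCorp Vault.
from openai import OpenAI

client = OpenAI(
    base_url="https://ai-gateway.example.com/llm",  # assumed gateway route
    api_key="not-a-real-key",  # placeholder; the gateway adds the real one
)

response = client.chat.completions.create(
    model="gpt-4o",  # the gateway can also pin or override this centrally
    messages=[{"role": "user", "content": "Summarize our open incidents."}],
)
print(response.choices[0].message.content)
```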


Get started today
You can get started with Kong AI Gateway and AI Manager by signing up on Konnect.