Product Releases
September 11, 2024
4 min read

Introducing LLM Analytics in Kong Konnect for GenAI Traffic

Christian Heidenreich
Senior Staff Product Manager, Kong

We’re pleased to announce the new LLM Usage reporting feature in Advanced Analytics, which aims to help organizations better manage their large language model (LLM) usage. This feature offers insights into token consumption, costs, and latency, allowing businesses to optimize their AI investments. By enabling comparisons across different LLM providers and models, organizations can make better-informed decisions and improve budget management.

[Image: Advanced Analytics dashboard]

The increasing adoption of AI and its challenges

As businesses increasingly integrate AI into their operations, they encounter a variety of challenges. While AI offers substantial benefits, such as improved efficiency and enhanced customer experiences, the rapid adoption also brings complexities that organizations must navigate.

  • Visibility: Many organizations struggle to gain a clear understanding of how AI, particularly LLMs, is being utilized. This lack of visibility can lead to inefficiencies and missed opportunities for optimization.
  • Cost management: As organizations scale their use of AI technologies, managing costs becomes critical. They must monitor token usage and associated expenses to ensure they're maximizing their investment in AI while minimizing unnecessary spending.
  • Compliance and governance: Legal and compliance teams can ensure that LLM usage aligns with organizational policies and industry regulations by monitoring usage patterns and costs. 

Analyze AI traffic with Konnect Advanced Analytics

With the release of Kong API Gateway 3.8, organizations can now leverage Konnect’s Advanced Analytics to get insights into AI traffic. By providing a comprehensive view of their AI usage, this new feature empowers organizations to make data-driven decisions, optimize costs, and improve the overall performance of their AI-powered solutions.

Here are a few example use cases where you can use Advanced Analytics for AI traffic:

  • Compare the cost and response times of different AI providers to make data-driven decisions about which provider to use.
  • Build comprehensive audit trails from detailed records of AI requests.
  • Track AI usage and costs per department, enabling fair allocation of AI resources and budgeting based on actual usage and value generated.
  • Compare performance metrics (latency, token usage) across different versions of AI models to decide whether to roll out updates or roll back when issues are detected.
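To make the provider cost comparison concrete, here's a back-of-the-envelope sketch that estimates spend from prompt and completion token counts. The per-million-token rates and the record shape are invented for illustration only — they are not actual provider pricing and not Konnect's data format:

```python
# Estimate LLM spend per provider from token counts.
# Rates and record fields are illustrative assumptions, not real pricing.

ILLUSTRATIVE_RATES = {
    # provider: (USD per 1M prompt tokens, USD per 1M completion tokens)
    "openai": (2.50, 10.00),
    "anthropic": (3.00, 15.00),
}

def estimate_cost(records):
    """Sum estimated cost per provider across AI request records."""
    costs = {}
    for r in records:
        prompt_rate, completion_rate = ILLUSTRATIVE_RATES[r["provider"]]
        cost = (r["prompt_tokens"] * prompt_rate
                + r["completion_tokens"] * completion_rate) / 1_000_000
        costs[r["provider"]] = costs.get(r["provider"], 0.0) + cost
    return costs

requests = [
    {"provider": "openai", "prompt_tokens": 1200, "completion_tokens": 400},
    {"provider": "anthropic", "prompt_tokens": 900, "completion_tokens": 300},
    {"provider": "openai", "prompt_tokens": 800, "completion_tokens": 200},
]
print(estimate_cost(requests))
```

Because completion tokens are typically priced several times higher than prompt tokens, tracking the two counts separately (as the LLM Usage view does) matters for any cost model like this.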

Users can now toggle between API Usage and LLM Usage views in Explorer, allowing for tailored metrics, groupings, and filter options specific to AI traffic analysis. The LLM Usage feature introduces critical new metrics, including LLM Latency, Costs, Prompt Tokens, and Completion Tokens, while advanced filtering and grouping options now encompass Provider, Request Model, and Response Model, enabling granular analysis.

Additionally, API Requests now displays detailed LLM Insights for AI-related traffic that has been proxied through the Kong AI Gateway, providing a comprehensive audit trail with rich metadata.

How to get started

Head over to Kong Konnect and navigate to "Analytics" on your left nav bar. Go to "Explorer" and you'll see a new “From” selector at the top. Switch from "API Usage" to "LLM Usage."

This new switch helps you focus on one dataset at a time — either API or LLM usage. Since these datasets have different characteristics and metrics, separating them reduces information overload and makes analysis easier.

Once you select "LLM Usage," you'll see new LLM-specific metrics and dimensions, such as token usage and model types. This view is tailored to help you understand and optimize your LLM operations within the familiar Explorer interface.

Now let's explore token usage across different providers. This analysis gives you valuable insights into potential costs and helps identify which providers are more cost-effective. To do this:

  • Pick a suitable visualization, such as a bar chart.
  • Select “Total Token Count” as your metric.
  • Choose “Provider” as your group by dimension.

This view will clearly display total token usage per provider, allowing for quick comparisons. Keep in mind that higher token usage typically means higher costs, making this information crucial for optimizing your LLM expenses and provider selection.
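Under the hood, the chart above is essentially a group-by over per-request records: sum a token metric, grouped by provider. A minimal stand-in for that aggregation (the record shape is an assumption for illustration, not Konnect's actual export format):

```python
from collections import Counter

def total_tokens_by_provider(records):
    """Sum total token count per provider, sorted for a bar chart."""
    totals = Counter()
    for r in records:
        totals[r["provider"]] += r["total_tokens"]
    # Highest usage first, mirroring a descending bar chart.
    return totals.most_common()

records = [
    {"provider": "openai", "total_tokens": 1600},
    {"provider": "anthropic", "total_tokens": 1200},
    {"provider": "openai", "total_tokens": 1000},
]
print(total_tokens_by_provider(records))
# -> [('openai', 2600), ('anthropic', 1200)]
```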

To add another layer of analysis, include the Control Plane dimension. This will break down LLM usage by environment, team, or line of business. By doing so, you'll gain a comprehensive view of how different parts of your organization are utilizing LLMs. This analysis empowers you to make data-driven decisions about resource allocation and optimization across your entire ecosystem, potentially uncovering opportunities for cost savings or areas needing additional resources.
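Adding the Control Plane dimension turns the single group-by into a two-level rollup: control plane first, then provider within it. A small sketch of that shape, again using an invented record format for illustration:

```python
from collections import defaultdict

def tokens_by_control_plane(records):
    """Two-level rollup: control plane -> provider -> total tokens."""
    rollup = defaultdict(lambda: defaultdict(int))
    for r in records:
        rollup[r["control_plane"]][r["provider"]] += r["total_tokens"]
    # Convert nested defaultdicts to plain dicts for readability.
    return {cp: dict(by_provider) for cp, by_provider in rollup.items()}

records = [
    {"control_plane": "prod", "provider": "openai", "total_tokens": 900},
    {"control_plane": "prod", "provider": "anthropic", "total_tokens": 500},
    {"control_plane": "dev", "provider": "openai", "total_tokens": 300},
]
print(tokens_by_control_plane(records))
# -> {'prod': {'openai': 900, 'anthropic': 500}, 'dev': {'openai': 300}}
```

If control planes map to teams or environments, a rollup like this is what makes per-team chargeback or budget tracking possible.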

Start using Advanced Analytics in Kong Konnect Plus today and unlock the full potential of your AI traffic data, empowering your organization to make informed decisions, optimize costs, and enhance performance across all AI initiatives.
