Kong
A cloud-native API and AI gateway providing LLM request routing, rate limiting, load balancing, and observability for AI agent applications.
A CNCF Sandbox SRE Agent that automatically analyzes infrastructure logs and metrics to assist with incident diagnosis and system operations.
End-to-end, code-first tutorials for building production-grade GenAI agents, from prototype to enterprise deployment.
CLI that hooks into your Git workflow to capture AI agent sessions as you work — sessions are indexed alongside commits, creating a searchable record of how code was written in your repo.
What principles can we use to build LLM-powered software that is actually good enough to put in the hands of production customers?