A trust and control layer for proxying traffic to MCP servers
Gain control and visibility over AI agent infrastructure with AI Gateway-driven MCP capabilities. Bring MCP servers to production securely with Kong AI Gateway.
AI agents are rapidly becoming core components of modern software, driving the need for structured, reliable interfaces to access tools and data. The Model Context Protocol (MCP) addresses this by enabling agents to reason, plan, and act across services. However, scaling MCP in remote, distributed environments introduces new operational challenges.
Kong AI Gateway enables teams to manage remote MCP traffic with enterprise-grade security, performance, authentication, context propagation, load balancing, and observability.
MCP server options
Kong Gateway supports two ways to integrate MCP functionality:
- Autogenerate MCP tools from APIs: Turn any API schema into secure, serverless MCP endpoints without requiring an LLM.
- Connect external MCP endpoints with LLM models and AI Proxy: Proxy remote or hosted MCP endpoints that call LLM models while enforcing control at the edge.
Autogenerate MCP servers using AI MCP Proxy
Automatically turn any API into a secure MCP server using the AI MCP Proxy plugin. This approach does not require an LLM and provides full control over production workloads.
Considerations for production use:
- Security and compliance can be fully managed since endpoints run under your control.
- Traffic can be monitored and scaled using Kong Gateway features.
- Costs are predictable because you control the underlying services.
Use Kong Gateway plugins to:
- Secure access with the AI MCP OAuth2 plugin or other authentication methods.
- Govern usage with rate limiting and traffic control Kong Gateway plugins.
- Monitor behavior using Kong Gateway logging and monitoring tools.
- Integrate APIs directly into MCP workflows and AI assistants.
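Combined, these plugins can be expressed in a single declarative configuration. The sketch below assumes a hypothetical internal API and illustrative plugin settings; the AI MCP Proxy plugin's name and config fields should be checked against the plugin schema for your Kong Gateway version.

```yaml
# kong.yaml -- declarative config sketch (fields for ai-mcp-proxy are assumptions)
_format_version: "3.0"
services:
  - name: orders-api                      # hypothetical upstream API
    url: https://orders.internal.example.com
    routes:
      - name: orders-mcp
        paths:
          - /mcp/orders
    plugins:
      - name: ai-mcp-proxy                # autogenerates MCP tools from the API schema
      - name: key-auth                    # or the AI MCP OAuth2 plugin for OAuth2 flows
      - name: rate-limiting
        config:
          minute: 60                      # govern usage per consumer
      - name: http-log
        config:
          http_endpoint: https://logs.example.com/kong   # ship traffic logs for monitoring
```

Because the MCP endpoint is just another route on the gateway, any other Kong Gateway plugin (ACLs, request transformation, caching) can be layered on in the same way.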
Connect external MCP servers with LLM models and AI Proxy
Expose any remote MCP server that calls LLM models through the AI Proxy plugin, enforcing observability and security at the edge.
Considerations for production use:
- Security, compliance, and data handling must be assessed for external MCPs.
- Latency, reliability, and versioning depend on the external LLM provider.
- Cost can grow quickly depending on request volume and model pricing.
Use Kong AI Gateway plugins to:
- Secure access with Kong Gateway plugins.
- Govern usage with AI rate limiting and AI guardrails.
- Enforce load balancing based on tokens, cost, or LLM accuracy.
- Monitor behavior using logging and monitoring tools.
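As a rough sketch, routing MCP-originated LLM calls through the AI Proxy plugin might look like the following. The model name, route path, and credential are illustrative placeholders; verify the exact config fields against the AI Proxy plugin reference for your Kong AI Gateway version.

```yaml
# kong.yaml -- declarative config sketch for proxying LLM calls via ai-proxy
_format_version: "3.0"
services:
  - name: llm-service
    url: https://localhost:32000          # placeholder; ai-proxy overrides the upstream target
    routes:
      - name: mcp-llm-route
        paths:
          - /llm/chat
    plugins:
      - name: ai-proxy
        config:
          route_type: llm/v1/chat
          auth:
            header_name: Authorization
            header_value: Bearer $OPENAI_API_KEY   # placeholder credential
          model:
            provider: openai
            name: gpt-4o                  # illustrative model choice
```

AI rate limiting, guardrails, and load-balancing plugins can then be attached to the same route to govern and distribute the proxied traffic.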
Autogenerate MCP tools from any API using AI MCP plugins
Explore guides and examples to auto-generate MCP servers and tools without custom code.
Secure and govern your MCP traffic via AI Proxy
Follow the tutorials below to learn how to secure, govern, and observe your MCP traffic using Kong AI Gateway and AI Proxy.
Kong Konnect MCP server
Kong also provides a built-in MCP server that connects directly to your Kong Konnect Control Planes. It offers read-only tools for analytics, configuration inspection, and Control Plane metadata—ideal for AI-driven workflows with Claude or other compatible assistants.
With the Kong Konnect MCP server, you can use natural language to:
- Query API traffic across gateways with filters and time windows.
- List and inspect Services, Routes, Consumers, and plugins.
- Explore Control Plane hierarchies and group relationships.
- Build and test workflows without a production setup.
MCP traffic observability
Kong AI Gateway records detailed MCP traffic data so you can analyze how requests are processed and resolved.
- Logs capture session IDs, JSON-RPC method calls, payloads, latencies, and errors.
- Metrics track latency, response sizes, and error counts over time, giving you a complete view of MCP server performance and behavior.
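The JSON-RPC method calls these logs capture follow the MCP wire format. A typical `tools/call` request is sketched below in YAML for readability (on the wire it is JSON); the tool name and arguments are made up for this example.

```yaml
# Illustrative MCP tools/call request as it might appear in gateway logs
jsonrpc: "2.0"
id: 42
method: tools/call               # the JSON-RPC method recorded in the log entry
params:
  name: list_orders              # hypothetical MCP tool name
  arguments:
    status: open
```

Correlating these method calls with the logged session IDs, latencies, and error counts lets you trace how an agent's tool invocations behave end to end.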