Minimum Version
Kong Gateway - 3.6
Tags
#ai

You can proxy requests to Anthropic AI models through AI Gateway using the AI Proxy and AI Proxy Advanced plugins. This reference documents all supported AI capabilities, configuration requirements, and provider-specific details needed for proper integration.

Upstream paths

AI Gateway automatically routes requests to the appropriate Anthropic API endpoints. The following table shows the upstream paths used for each capability.

| Capability | Upstream path or API |
|---|---|
| Chat completions | /v1/messages |
| Completions | /v1/complete |
| Function calling | /v1/messages |
| Batches | /v1/messages/batches |

Supported capabilities

The following tables show the AI capabilities supported by the Anthropic provider when used with the AI Proxy or AI Proxy Advanced plugin.

Set the plugin’s route_type based on the capability you want to use. See the tables below for supported route types.

Text generation

Support for Anthropic basic text generation capabilities, including chat and completions:

| Capability | Route type | Streaming | Model example | Min version |
|---|---|---|---|---|
| Chat completions | llm/v1/chat | Supported | claude-sonnet-4-20250514 | 3.6 |
| Completions | llm/v1/completions | Not supported | claude-sonnet-4-20250514 | 3.6 |

Advanced text generation

Support for Anthropic function calling to allow Anthropic models to use external tools and APIs:

| Capability | Route type | Model example | Min version |
|---|---|---|---|
| Function calling | llm/v1/chat | claude-sonnet-4-20250514 | 3.6 |

Processing

Support for Anthropic batch operations:

| Capability | Route type | Model example | Min version |
|---|---|---|---|
| Batches¹ | files/v1/batches | n/a | n/a |

¹ Batch processing for Anthropic is supported only in the native format via the SDK.

Anthropic base URL

The base URL is https://api.anthropic.com:443/{route_type_path}, where {route_type_path} is determined by the capability.

AI Gateway uses this URL automatically. You only need to configure a URL if you’re using a self-hosted or Anthropic-compatible endpoint, in which case set the upstream_url plugin option.
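As a sketch, a decK-style plugin snippet overriding the upstream might look like the following. The endpoint URL is a placeholder for your own self-hosted or Anthropic-compatible service, and the field layout assumes the standard AI Proxy plugin schema:

```yaml
plugins:
  - name: ai-proxy
    config:
      route_type: llm/v1/chat
      auth:
        header_name: x-api-key
        header_value: <ANTHROPIC_API_KEY>   # placeholder
      model:
        provider: anthropic
        name: claude-sonnet-4-20250514
        options:
          # Placeholder URL for a self-hosted Anthropic-compatible endpoint
          upstream_url: http://anthropic-compatible.example.internal/v1/messages
```

When upstream_url is unset, the plugin falls back to the default Anthropic base URL shown above.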

Supported native LLM formats for Anthropic

By default, the AI Proxy plugin uses OpenAI-compatible request formats. Set config.llm_format to a native format to use Anthropic-specific APIs and features.

The following native Anthropic APIs are supported:

| LLM format | Supported APIs |
|---|---|
| anthropic | /v1/messages, /v1/messages/batches |
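For example, a hedged decK-style sketch that switches the plugin to the native Anthropic format (assuming the standard AI Proxy schema; the key is a placeholder):

```yaml
plugins:
  - name: ai-proxy
    config:
      llm_format: anthropic   # accept native Anthropic request/response bodies
      route_type: llm/v1/chat
      auth:
        header_name: x-api-key
        header_value: <ANTHROPIC_API_KEY>   # placeholder
      model:
        provider: anthropic
        name: claude-sonnet-4-20250514
```

With this setting, clients can send requests in the native /v1/messages shape (for example, via the Anthropic SDK pointed at your gateway route) instead of the OpenAI-compatible format.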

Provider-specific limitations for native formats

  • Does not support llm/v1/completions or llm/v1/embeddings

Statistics logging limitations for native formats

  • No statistics logging for llm/v1/completions

Configure Anthropic with AI Proxy

To use Anthropic with AI Gateway, configure the AI Proxy or AI Proxy Advanced plugin.

Here’s a minimal configuration for chat completions:
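The following is a minimal decK-style sketch for chat completions. Field names follow the AI Proxy plugin schema; the API key and model name are placeholders to replace with your own values:

```yaml
plugins:
  - name: ai-proxy
    config:
      route_type: llm/v1/chat
      auth:
        header_name: x-api-key
        header_value: <ANTHROPIC_API_KEY>   # placeholder
      model:
        provider: anthropic
        name: claude-sonnet-4-20250514
        options:
          anthropic_version: "2023-06-01"
          max_tokens: 1024
```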

For more configuration options and examples, see the AI Proxy and AI Proxy Advanced plugin documentation.
