OpenAI SDK: One chat route with dynamic Azure OpenAI deployments (v3.8+)
Configure a dynamic route to target multiple Azure OpenAI model deployments.
This configuration uses a dynamic URI capture to determine the deployment ID based on the incoming request path.
For this plugin to work properly, you need a Gateway Route with the following configuration:
routes:
- name: azure-chat-model-from-path
  paths:
  - "~/openai/deployments/(?<azure_instance>[^#?/]+)/chat/completions$"
For example, if your SDK sends requests to http://localhost:8000/openai/deployments/my-gpt-3-5/chat/completions, AI Proxy Advanced automatically maps my-gpt-3-5 as the Azure deployment ID.
This allows a single Route to support multiple Azure model deployments dynamically.
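Once the Route and plugin below are configured, you can exercise the dynamic mapping with a plain HTTP request to the data plane. This is only a sketch: it assumes the proxy listens on localhost:8000 and that a deployment named my-gpt-3-5 actually exists in your Azure OpenAI instance.
# Hypothetical smoke test: the path segment after /deployments/ is captured
# as azure_instance and forwarded to Azure as the deployment ID.
curl -i -X POST http://localhost:8000/openai/deployments/my-gpt-3-5/chat/completions \
  --header "Content-Type: application/json" \
  --data '{"messages": [{"role": "user", "content": "Say hello in one sentence."}]}'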
Prerequisites
- Azure OpenAI Service account
Environment variables
- AZURE_API_KEY: The API key used to authenticate requests to Azure OpenAI Service.
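The examples below read this key from your shell environment, so export it first. The value shown is a placeholder; the decK examples resolve the key through a DECK_-prefixed variable.
# Placeholder value: replace with your real Azure OpenAI API key.
export AZURE_API_KEY='<your-azure-openai-api-key>'
# decK substitutes ${{ env "DECK_AZURE_API_KEY" }} from this variable.
export DECK_AZURE_API_KEY="$AZURE_API_KEY"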
Add this section to your declarative configuration file:
_format_version: "3.0"
plugins:
- name: ai-proxy-advanced
  config:
    targets:
    - route_type: llm/v1/chat
      auth:
        header_name: api-key
        header_value: ${{ env "DECK_AZURE_API_KEY" }}
      logging:
        log_statistics: true
        log_payloads: false
      model:
        provider: azure
        name: "$(uri_captures.azure_instance)"
        options:
          azure_instance: my-openai-instance
          azure_deployment_id: "$(uri_captures.azure_instance)"
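Then sync the file to your Gateway with decK. The file name kong.yaml is an assumption here; point decK at whichever file you added the section to (older decK releases use deck sync instead of deck gateway sync).
deck gateway sync kong.yaml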
Make the following request:
curl -i -X POST http://localhost:8001/plugins/ \
--header "Accept: application/json" \
--header "Content-Type: application/json" \
--data '
{
"name": "ai-proxy-advanced",
"config": {
"targets": [
{
"route_type": "llm/v1/chat",
"auth": {
"header_name": "api-key",
"header_value": "'$AZURE_API_KEY'"
},
"logging": {
"log_statistics": true,
"log_payloads": false
},
"model": {
"provider": "azure",
"name": "$(uri_captures.azure_instance)",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "$(uri_captures.azure_instance)"
}
}
}
]
}
}
'
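If the POST returns 201 Created, you can read the configuration back from the Admin API to confirm the targets were stored as expected. This assumes the default Admin API address and that jq is installed; adjust as needed.
curl -s http://localhost:8001/plugins \
  | jq '.data[] | select(.name == "ai-proxy-advanced") | .config.targets'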
Make the following request:
curl -X POST https://{region}.api.konghq.com/v2/control-planes/{controlPlaneId}/core-entities/plugins/ \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--header "Authorization: Bearer $KONNECT_TOKEN" \
--data '
{
"name": "ai-proxy-advanced",
"config": {
"targets": [
{
"route_type": "llm/v1/chat",
"auth": {
"header_name": "api-key",
"header_value": "'$AZURE_API_KEY'"
},
"logging": {
"log_statistics": true,
"log_payloads": false
},
"model": {
"provider": "azure",
"name": "$(uri_captures.azure_instance)",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "$(uri_captures.azure_instance)"
}
}
}
]
}
}
'
Make sure to replace the following placeholders with your own values:
- region: Geographic region where your Kong Konnect is hosted and operates.
- controlPlaneId: The id of the control plane.
- KONNECT_TOKEN: Your Personal Access Token (PAT) associated with your Konnect account.
See the Konnect API reference to learn about region-specific URLs and personal access tokens.
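As a quick check, you can list the plugins on the control plane with a GET against the same core-entities endpoint and confirm ai-proxy-advanced appears in the response (same placeholders as above).
curl -s https://{region}.api.konghq.com/v2/control-planes/{controlPlaneId}/core-entities/plugins/ \
  --header "Authorization: Bearer $KONNECT_TOKEN"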
echo "
apiVersion: configuration.konghq.com/v1
kind: KongClusterPlugin
metadata:
name: ai-proxy-advanced
namespace: kong
annotations:
kubernetes.io/ingress.class: kong
labels:
global: 'true'
config:
targets:
- route_type: llm/v1/chat
auth:
header_name: api-key
header_value: '$AZURE_API_KEY'
logging:
log_statistics: true
log_payloads: false
model:
provider: azure
name: '$(uri_captures.azure_instance)'
options:
azure_instance: my-openai-instance
azure_deployment_id: '$(uri_captures.azure_instance)'
plugin: ai-proxy-advanced
" | kubectl apply -f -
Prerequisite: Configure your Personal Access Token
terraform {
required_providers {
konnect = {
source = "kong/konnect"
}
}
}
provider "konnect" {
personal_access_token = "$KONNECT_TOKEN"
server_url = "https://us.api.konghq.com/"
}
Add the following to your Terraform configuration to create a Konnect Gateway Plugin:
resource "konnect_gateway_plugin_ai_proxy_advanced" "my_ai_proxy_advanced" {
enabled = true
config = {
targets = [
{
route_type = "llm/v1/chat"
auth = {
header_name = "api-key"
header_value = var.azure_api_key
}
logging = {
log_statistics = true
log_payloads = false
}
model = {
provider = "azure"
name = "$(uri_captures.azure_instance)"
options = {
azure_instance = "my-openai-instance"
azure_deployment_id = "$(uri_captures.azure_instance)"
}
}
} ]
}
control_plane_id = konnect_gateway_control_plane.my_konnect_cp.id
}
This example requires the following variables to be added to your manifest. You can specify values at runtime by setting TF_VAR_name=value.
variable "azure_api_key" {
type = string
}
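For example, you can pass the key through the environment at plan/apply time instead of hard-coding it (placeholder value shown):
export TF_VAR_azure_api_key='<your-azure-openai-api-key>'
terraform plan
terraform apply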
Add this section to your declarative configuration file:
_format_version: "3.0"
plugins:
- name: ai-proxy-advanced
  service: serviceName|Id
  config:
    targets:
    - route_type: llm/v1/chat
      auth:
        header_name: api-key
        header_value: ${{ env "DECK_AZURE_API_KEY" }}
      logging:
        log_statistics: true
        log_payloads: false
      model:
        provider: azure
        name: "$(uri_captures.azure_instance)"
        options:
          azure_instance: my-openai-instance
          azure_deployment_id: "$(uri_captures.azure_instance)"
Make sure to replace the following placeholders with your own values:
- serviceName|Id: The id or name of the service the plugin configuration will target.
Make the following request:
curl -i -X POST http://localhost:8001/services/{serviceName|Id}/plugins/ \
--header "Accept: application/json" \
--header "Content-Type: application/json" \
--data '
{
"name": "ai-proxy-advanced",
"config": {
"targets": [
{
"route_type": "llm/v1/chat",
"auth": {
"header_name": "api-key",
"header_value": "'$AZURE_API_KEY'"
},
"logging": {
"log_statistics": true,
"log_payloads": false
},
"model": {
"provider": "azure",
"name": "$(uri_captures.azure_instance)",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "$(uri_captures.azure_instance)"
}
}
}
]
}
}
'
Make sure to replace the following placeholders with your own values:
- serviceName|Id: The id or name of the service the plugin configuration will target.
Make the following request:
curl -X POST https://{region}.api.konghq.com/v2/control-planes/{controlPlaneId}/core-entities/services/{serviceId}/plugins/ \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--header "Authorization: Bearer $KONNECT_TOKEN" \
--data '
{
"name": "ai-proxy-advanced",
"config": {
"targets": [
{
"route_type": "llm/v1/chat",
"auth": {
"header_name": "api-key",
"header_value": "'$AZURE_API_KEY'"
},
"logging": {
"log_statistics": true,
"log_payloads": false
},
"model": {
"provider": "azure",
"name": "$(uri_captures.azure_instance)",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "$(uri_captures.azure_instance)"
}
}
}
]
}
}
'
Make sure to replace the following placeholders with your own values:
- region: Geographic region where your Kong Konnect is hosted and operates.
- controlPlaneId: The id of the control plane.
- KONNECT_TOKEN: Your Personal Access Token (PAT) associated with your Konnect account.
- serviceId: The id of the service the plugin configuration will target.
See the Konnect API reference to learn about region-specific URLs and personal access tokens.
echo "
apiVersion: configuration.konghq.com/v1
kind: KongPlugin
metadata:
name: ai-proxy-advanced
namespace: kong
annotations:
kubernetes.io/ingress.class: kong
config:
targets:
- route_type: llm/v1/chat
auth:
header_name: api-key
header_value: '$AZURE_API_KEY'
logging:
log_statistics: true
log_payloads: false
model:
provider: azure
name: '$(uri_captures.azure_instance)'
options:
azure_instance: my-openai-instance
azure_deployment_id: '$(uri_captures.azure_instance)'
plugin: ai-proxy-advanced
" | kubectl apply -f -
Next, apply the KongPlugin
resource by annotating the service
resource:
kubectl annotate -n kong service SERVICE_NAME konghq.com/plugins=ai-proxy-advanced
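You can verify the annotation landed on the Service before sending traffic. SERVICE_NAME is the same placeholder used above.
kubectl get service SERVICE_NAME -n kong -o jsonpath='{.metadata.annotations}'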
Prerequisite: Configure your Personal Access Token
terraform {
required_providers {
konnect = {
source = "kong/konnect"
}
}
}
provider "konnect" {
personal_access_token = "$KONNECT_TOKEN"
server_url = "https://us.api.konghq.com/"
}
Add the following to your Terraform configuration to create a Konnect Gateway Plugin:
resource "konnect_gateway_plugin_ai_proxy_advanced" "my_ai_proxy_advanced" {
enabled = true
config = {
targets = [
{
route_type = "llm/v1/chat"
auth = {
header_name = "api-key"
header_value = var.azure_api_key
}
logging = {
log_statistics = true
log_payloads = false
}
model = {
provider = "azure"
name = "$(uri_captures.azure_instance)"
options = {
azure_instance = "my-openai-instance"
azure_deployment_id = "$(uri_captures.azure_instance)"
}
}
} ]
}
control_plane_id = konnect_gateway_control_plane.my_konnect_cp.id
service = {
id = konnect_gateway_service.my_service.id
}
}
This example requires the following variables to be added to your manifest. You can specify values at runtime by setting TF_VAR_name=value.
variable "azure_api_key" {
type = string
}
Add this section to your declarative configuration file:
_format_version: "3.0"
plugins:
- name: ai-proxy-advanced
  route: routeName|Id
  config:
    targets:
    - route_type: llm/v1/chat
      auth:
        header_name: api-key
        header_value: ${{ env "DECK_AZURE_API_KEY" }}
      logging:
        log_statistics: true
        log_payloads: false
      model:
        provider: azure
        name: "$(uri_captures.azure_instance)"
        options:
          azure_instance: my-openai-instance
          azure_deployment_id: "$(uri_captures.azure_instance)"
Make sure to replace the following placeholders with your own values:
- routeName|Id: The id or name of the route the plugin configuration will target.
Make the following request:
curl -i -X POST http://localhost:8001/routes/{routeName|Id}/plugins/ \
--header "Accept: application/json" \
--header "Content-Type: application/json" \
--data '
{
"name": "ai-proxy-advanced",
"config": {
"targets": [
{
"route_type": "llm/v1/chat",
"auth": {
"header_name": "api-key",
"header_value": "'$AZURE_API_KEY'"
},
"logging": {
"log_statistics": true,
"log_payloads": false
},
"model": {
"provider": "azure",
"name": "$(uri_captures.azure_instance)",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "$(uri_captures.azure_instance)"
}
}
}
]
}
}
'
Make sure to replace the following placeholders with your own values:
- routeName|Id: The id or name of the route the plugin configuration will target.
Make the following request:
curl -X POST https://{region}.api.konghq.com/v2/control-planes/{controlPlaneId}/core-entities/routes/{routeId}/plugins/ \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--header "Authorization: Bearer $KONNECT_TOKEN" \
--data '
{
"name": "ai-proxy-advanced",
"config": {
"targets": [
{
"route_type": "llm/v1/chat",
"auth": {
"header_name": "api-key",
"header_value": "'$AZURE_API_KEY'"
},
"logging": {
"log_statistics": true,
"log_payloads": false
},
"model": {
"provider": "azure",
"name": "$(uri_captures.azure_instance)",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "$(uri_captures.azure_instance)"
}
}
}
]
}
}
'
Make sure to replace the following placeholders with your own values:
- region: Geographic region where your Kong Konnect is hosted and operates.
- controlPlaneId: The id of the control plane.
- KONNECT_TOKEN: Your Personal Access Token (PAT) associated with your Konnect account.
- routeId: The id of the route the plugin configuration will target.
See the Konnect API reference to learn about region-specific URLs and personal access tokens.
echo "
apiVersion: configuration.konghq.com/v1
kind: KongPlugin
metadata:
name: ai-proxy-advanced
namespace: kong
annotations:
kubernetes.io/ingress.class: kong
config:
targets:
- route_type: llm/v1/chat
auth:
header_name: api-key
header_value: '$AZURE_API_KEY'
logging:
log_statistics: true
log_payloads: false
model:
provider: azure
name: '$(uri_captures.azure_instance)'
options:
azure_instance: my-openai-instance
azure_deployment_id: '$(uri_captures.azure_instance)'
plugin: ai-proxy-advanced
" | kubectl apply -f -
Next, apply the KongPlugin
resource by annotating the httproute
or ingress
resource:
kubectl annotate -n kong httproute ROUTE_NAME konghq.com/plugins=ai-proxy-advanced
kubectl annotate -n kong ingress INGRESS_NAME konghq.com/plugins=ai-proxy-advanced
Prerequisite: Configure your Personal Access Token
terraform {
required_providers {
konnect = {
source = "kong/konnect"
}
}
}
provider "konnect" {
personal_access_token = "$KONNECT_TOKEN"
server_url = "https://us.api.konghq.com/"
}
Add the following to your Terraform configuration to create a Konnect Gateway Plugin:
resource "konnect_gateway_plugin_ai_proxy_advanced" "my_ai_proxy_advanced" {
enabled = true
config = {
targets = [
{
route_type = "llm/v1/chat"
auth = {
header_name = "api-key"
header_value = var.azure_api_key
}
logging = {
log_statistics = true
log_payloads = false
}
model = {
provider = "azure"
name = "$(uri_captures.azure_instance)"
options = {
azure_instance = "my-openai-instance"
azure_deployment_id = "$(uri_captures.azure_instance)"
}
}
} ]
}
control_plane_id = konnect_gateway_control_plane.my_konnect_cp.id
route = {
id = konnect_gateway_route.my_route.id
}
}
This example requires the following variables to be added to your manifest. You can specify values at runtime by setting TF_VAR_name=value.
variable "azure_api_key" {
type = string
}
Add this section to your declarative configuration file:
_format_version: "3.0"
plugins:
- name: ai-proxy-advanced
  consumer: consumerName|Id
  config:
    targets:
    - route_type: llm/v1/chat
      auth:
        header_name: api-key
        header_value: ${{ env "DECK_AZURE_API_KEY" }}
      logging:
        log_statistics: true
        log_payloads: false
      model:
        provider: azure
        name: "$(uri_captures.azure_instance)"
        options:
          azure_instance: my-openai-instance
          azure_deployment_id: "$(uri_captures.azure_instance)"
Make sure to replace the following placeholders with your own values:
- consumerName|Id: The id or name of the consumer the plugin configuration will target.
Make the following request:
curl -i -X POST http://localhost:8001/consumers/{consumerName|Id}/plugins/ \
--header "Accept: application/json" \
--header "Content-Type: application/json" \
--data '
{
"name": "ai-proxy-advanced",
"config": {
"targets": [
{
"route_type": "llm/v1/chat",
"auth": {
"header_name": "api-key",
"header_value": "'$AZURE_API_KEY'"
},
"logging": {
"log_statistics": true,
"log_payloads": false
},
"model": {
"provider": "azure",
"name": "$(uri_captures.azure_instance)",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "$(uri_captures.azure_instance)"
}
}
}
]
}
}
'
Make sure to replace the following placeholders with your own values:
- consumerName|Id: The id or name of the consumer the plugin configuration will target.
Make the following request:
curl -X POST https://{region}.api.konghq.com/v2/control-planes/{controlPlaneId}/core-entities/consumers/{consumerId}/plugins/ \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--header "Authorization: Bearer $KONNECT_TOKEN" \
--data '
{
"name": "ai-proxy-advanced",
"config": {
"targets": [
{
"route_type": "llm/v1/chat",
"auth": {
"header_name": "api-key",
"header_value": "'$AZURE_API_KEY'"
},
"logging": {
"log_statistics": true,
"log_payloads": false
},
"model": {
"provider": "azure",
"name": "$(uri_captures.azure_instance)",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "$(uri_captures.azure_instance)"
}
}
}
]
}
}
'
Make sure to replace the following placeholders with your own values:
- region: Geographic region where your Kong Konnect is hosted and operates.
- controlPlaneId: The id of the control plane.
- KONNECT_TOKEN: Your Personal Access Token (PAT) associated with your Konnect account.
- consumerId: The id of the consumer the plugin configuration will target.
See the Konnect API reference to learn about region-specific URLs and personal access tokens.
echo "
apiVersion: configuration.konghq.com/v1
kind: KongPlugin
metadata:
name: ai-proxy-advanced
namespace: kong
annotations:
kubernetes.io/ingress.class: kong
config:
targets:
- route_type: llm/v1/chat
auth:
header_name: api-key
header_value: '$AZURE_API_KEY'
logging:
log_statistics: true
log_payloads: false
model:
provider: azure
name: '$(uri_captures.azure_instance)'
options:
azure_instance: my-openai-instance
azure_deployment_id: '$(uri_captures.azure_instance)'
plugin: ai-proxy-advanced
" | kubectl apply -f -
Next, apply the KongPlugin
resource by annotating the KongConsumer
resource:
kubectl annotate -n kong kongconsumer CONSUMER_NAME konghq.com/plugins=ai-proxy-advanced
Prerequisite: Configure your Personal Access Token
terraform {
required_providers {
konnect = {
source = "kong/konnect"
}
}
}
provider "konnect" {
personal_access_token = "$KONNECT_TOKEN"
server_url = "https://us.api.konghq.com/"
}
Add the following to your Terraform configuration to create a Konnect Gateway Plugin:
resource "konnect_gateway_plugin_ai_proxy_advanced" "my_ai_proxy_advanced" {
enabled = true
config = {
targets = [
{
route_type = "llm/v1/chat"
auth = {
header_name = "api-key"
header_value = var.azure_api_key
}
logging = {
log_statistics = true
log_payloads = false
}
model = {
provider = "azure"
name = "$(uri_captures.azure_instance)"
options = {
azure_instance = "my-openai-instance"
azure_deployment_id = "$(uri_captures.azure_instance)"
}
}
} ]
}
control_plane_id = konnect_gateway_control_plane.my_konnect_cp.id
consumer = {
id = konnect_gateway_consumer.my_consumer.id
}
}
This example requires the following variables to be added to your manifest. You can specify values at runtime by setting TF_VAR_name=value.
variable "azure_api_key" {
type = string
}
Add this section to your declarative configuration file:
_format_version: "3.0"
plugins:
- name: ai-proxy-advanced
  consumer_group: consumerGroupName|Id
  config:
    targets:
    - route_type: llm/v1/chat
      auth:
        header_name: api-key
        header_value: ${{ env "DECK_AZURE_API_KEY" }}
      logging:
        log_statistics: true
        log_payloads: false
      model:
        provider: azure
        name: "$(uri_captures.azure_instance)"
        options:
          azure_instance: my-openai-instance
          azure_deployment_id: "$(uri_captures.azure_instance)"
Make sure to replace the following placeholders with your own values:
- consumerGroupName|Id: The id or name of the consumer group the plugin configuration will target.
Make the following request:
curl -i -X POST http://localhost:8001/consumer_groups/{consumerGroupName|Id}/plugins/ \
--header "Accept: application/json" \
--header "Content-Type: application/json" \
--data '
{
"name": "ai-proxy-advanced",
"config": {
"targets": [
{
"route_type": "llm/v1/chat",
"auth": {
"header_name": "api-key",
"header_value": "'$AZURE_API_KEY'"
},
"logging": {
"log_statistics": true,
"log_payloads": false
},
"model": {
"provider": "azure",
"name": "$(uri_captures.azure_instance)",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "$(uri_captures.azure_instance)"
}
}
}
]
}
}
'
Make sure to replace the following placeholders with your own values:
- consumerGroupName|Id: The id or name of the consumer group the plugin configuration will target.
Make the following request:
curl -X POST https://{region}.api.konghq.com/v2/control-planes/{controlPlaneId}/core-entities/consumer_groups/{consumerGroupId}/plugins/ \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--header "Authorization: Bearer $KONNECT_TOKEN" \
--data '
{
"name": "ai-proxy-advanced",
"config": {
"targets": [
{
"route_type": "llm/v1/chat",
"auth": {
"header_name": "api-key",
"header_value": "'$AZURE_API_KEY'"
},
"logging": {
"log_statistics": true,
"log_payloads": false
},
"model": {
"provider": "azure",
"name": "$(uri_captures.azure_instance)",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "$(uri_captures.azure_instance)"
}
}
}
]
}
}
'
Make sure to replace the following placeholders with your own values:
- region: Geographic region where your Kong Konnect is hosted and operates.
- controlPlaneId: The id of the control plane.
- KONNECT_TOKEN: Your Personal Access Token (PAT) associated with your Konnect account.
- consumerGroupId: The id of the consumer group the plugin configuration will target.
See the Konnect API reference to learn about region-specific URLs and personal access tokens.
echo "
apiVersion: configuration.konghq.com/v1
kind: KongPlugin
metadata:
name: ai-proxy-advanced
namespace: kong
annotations:
kubernetes.io/ingress.class: kong
config:
targets:
- route_type: llm/v1/chat
auth:
header_name: api-key
header_value: '$AZURE_API_KEY'
logging:
log_statistics: true
log_payloads: false
model:
provider: azure
name: '$(uri_captures.azure_instance)'
options:
azure_instance: my-openai-instance
azure_deployment_id: '$(uri_captures.azure_instance)'
plugin: ai-proxy-advanced
" | kubectl apply -f -
Next, apply the KongPlugin
resource by annotating the KongConsumerGroup
resource:
kubectl annotate -n kong kongconsumergroup CONSUMERGROUP_NAME konghq.com/plugins=ai-proxy-advanced
Prerequisite: Configure your Personal Access Token
terraform {
required_providers {
konnect = {
source = "kong/konnect"
}
}
}
provider "konnect" {
personal_access_token = "$KONNECT_TOKEN"
server_url = "https://us.api.konghq.com/"
}
Add the following to your Terraform configuration to create a Konnect Gateway Plugin:
resource "konnect_gateway_plugin_ai_proxy_advanced" "my_ai_proxy_advanced" {
enabled = true
config = {
targets = [
{
route_type = "llm/v1/chat"
auth = {
header_name = "api-key"
header_value = var.azure_api_key
}
logging = {
log_statistics = true
log_payloads = false
}
model = {
provider = "azure"
name = "$(uri_captures.azure_instance)"
options = {
azure_instance = "my-openai-instance"
azure_deployment_id = "$(uri_captures.azure_instance)"
}
}
} ]
}
control_plane_id = konnect_gateway_control_plane.my_konnect_cp.id
consumer_group = {
id = konnect_gateway_consumer_group.my_consumer_group.id
}
}
This example requires the following variables to be added to your manifest. You can specify values at runtime by setting TF_VAR_name=value.
variable "azure_api_key" {
type = string
}