OpenAI SDK: One chat route with dynamic Azure OpenAI deployments (v3.8+)

Configure a dynamic route to target multiple Azure OpenAI model deployments.

This configuration uses a dynamic URI capture to determine the deployment ID based on the incoming request path.

For this plugin to work properly, you need a Gateway Route with the following configuration:

routes:
  - name: azure-chat-model-from-path
    paths:
      - "~/openai/deployments/(?<azure_deployment>[^#?/]+)/chat/completions$"

For example, if your SDK sends requests to http://localhost:8000/openai/deployments/my-gpt-3-5/chat/completions, AI Proxy Advanced automatically uses my-gpt-3-5 as the Azure deployment ID.

This allows a single Route to support multiple Azure model deployments dynamically.

Prerequisites

  • Azure OpenAI Service account

Environment variables

  • AZURE_API_KEY: The API key used to authenticate requests to Azure OpenAI Service.
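Before configuring the plugin, export the variable in the shell you'll run the setup commands from; the value shown is a placeholder:

```shell
# Placeholder value; substitute the key from your Azure OpenAI resource.
export AZURE_API_KEY='<your-azure-openai-api-key>'
```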

Set up the plugin
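As a minimal sketch of what the plugin configuration could look like, the decK-style snippet below attaches AI Proxy Advanced to the Route above and feeds the captured path segment into the model settings. The instance name my-azure-instance and the capture name azure_deployment are assumptions carried over from the Route example; verify field names against the ai-proxy-advanced schema for your Gateway version:

```yaml
plugins:
  - name: ai-proxy-advanced
    route: azure-chat-model-from-path
    config:
      targets:
        - route_type: "llm/v1/chat"
          auth:
            header_name: "api-key"
            header_value: "${AZURE_API_KEY}"   # resolved from the environment variable above
          model:
            provider: "azure"
            # Template substitution: reuse the deployment ID captured from the path
            name: "$(uri_captures.azure_deployment)"
            options:
              azure_instance: "my-azure-instance"
              azure_deployment_id: "$(uri_captures.azure_deployment)"
```

Because the deployment ID is resolved per request from the URI capture, adding a new Azure deployment requires no Gateway configuration change.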
