Chat route with Azure OpenAI Service (v3.6+)

Configure a chat route using Azure OpenAI Service with the GPT-4o model.

To connect to Azure OpenAI, you need three values from your Azure OpenAI resource:

  1. Deployment ID — The unique name of your deployed model.
    • In the Azure AI Foundry Portal sidebar, select a resource and go to: Shared Resources > Deployments > Model deployments, then click the deployment name.
    • You can also see the deployment ID in the Azure OpenAI URL when calling the API, for example: https://{AZURE_INSTANCE_NAME}.openai.azure.com/openai/deployments/{AZURE_DEPLOYMENT_ID}/...
  2. Instance name — The name of your Azure OpenAI resource.
    • This is the prefix in your API endpoint URL, for example: https://{AZURE_INSTANCE_NAME}.openai.azure.com
  3. API Key — The key used to authenticate requests to your Azure OpenAI deployment in Azure AI Foundry.
    • In the Azure AI Foundry Portal sidebar, select a resource and go to: Shared Resources > Deployments > Model deployments, then click the deployment name.
    • The API key is visible in the Endpoint tile.

Environment variables

  • AZURE_OPENAI_API_KEY

  • AZURE_INSTANCE_NAME

  • AZURE_DEPLOYMENT_ID

Set up the plugin
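
As a starting point, the route can be configured with the AI Proxy plugin in declarative (decK) format. This is a minimal sketch assuming the environment variables above are substituted at deploy time; field names follow the AI Proxy plugin schema, so verify them against the reference for your Kong Gateway version:

```yaml
# Minimal AI Proxy plugin sketch for an Azure OpenAI chat route.
# ${...} placeholders stand in for the environment variables defined above.
plugins:
  - name: ai-proxy
    config:
      route_type: llm/v1/chat
      auth:
        header_name: api-key          # Azure OpenAI authenticates with the api-key header
        header_value: "${AZURE_OPENAI_API_KEY}"
      model:
        provider: azure
        name: gpt-4o
        options:
          azure_instance: "${AZURE_INSTANCE_NAME}"
          azure_deployment_id: "${AZURE_DEPLOYMENT_ID}"
```

With this in place, chat requests sent to the route are proxied to the GPT-4o deployment on your Azure OpenAI resource.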
