OpenAI SDK: Use an unsupported LLM model (v3.6+)

Kong Gateway can attempt to proxy models that have no pre-configured format transformers or that are untested.

For this plugin to work properly, you need a Gateway Route with the following configuration:

routes:
  - name: openai-any
    paths:
      - "~/openai/(?<op_path>[^#?]+)"
    methods:
      - POST

Caution: The following use cases are unsupported but may work depending on your setup. Use at your own discretion.

When setting up for unsupported models, you must set the plugin's route_type to preserve. In this mode, the request and response bodies are passed through to the upstream provider without any transformation.

For example, using the following multipart/form-data request, you can POST a file for transcription to the whisper-2 transcription model:

  curl --location 'http://localhost:8000/openai/v1/audio/transcriptions' \
      --form 'model=whisper-2' \
      --form 'file=@"me_saying_hello.m4a"'

The response comes back unaltered:

  {
    "text": "Hello!"
  }

Prerequisites

  • OpenAI account

Environment variables

  • OPENAI_API_KEY: The API key used to authenticate requests to OpenAI.

Set up the plugin
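As a sketch of the setup described above, a decK-style AI Proxy plugin entry attached to the openai-any route might look like the following. Verify the field names against the AI Proxy plugin schema for your Gateway version; the header value is a placeholder showing where your OPENAI_API_KEY belongs, not literal syntax.

```yaml
plugins:
  - name: ai-proxy
    route: openai-any
    config:
      route_type: preserve        # pass request and response through untransformed
      auth:
        header_name: Authorization
        header_value: "Bearer <OPENAI_API_KEY>"   # replace with your real key
      model:
        provider: openai          # in preserve mode, the model name comes from the request body
```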
