Related Documentation
Made by
Kong Inc.
Supported Gateway Topologies
hybrid db-less traditional
Supported Konnect Deployments
hybrid cloud-gateways serverless
Compatible Protocols
grpc grpcs http https
Minimum Version
Kong Gateway - 3.6
Tags
#ai

The AI Prompt Template plugin lets you provide tuned AI prompts to users. Users only need to fill in the blanks with variable placeholders in the following format: {{variable}}.

This lets admins set up templates, which can then be used by anyone in the organization. It also allows admins to present an LLM as an API in its own right - for example, a bot that can provide software class examples and/or suggestions.

This plugin also sanitizes string inputs to ensure that JSON control characters are escaped, preventing arbitrary prompt injection.

This plugin extends the functionality of the AI Proxy plugin, and requires either AI Proxy or AI Proxy Advanced to be configured first. To set up AI Proxy quickly, see Get started with AI Gateway.

How it works

When activated, the template restricts LLM usage to the predefined templates. They are defined in the following format:

- name: sample-template
  template: |-
    {
      "messages": [
        {
          "role": "user",
          "content": "Explain to me what {{thing}} is."
        }
      ]
    }

When calling a template, replace the content of messages (llm/v1/chat) or prompt (llm/v1/completions) with a template reference, using the following format:

{
  "message": "{template://sample-template}",
  "properties": {
    "thing": "gravity"
  }
}

By default, requests that don’t use a template are still be passed to the LLM. However, this can be configured using the config.allow_untemplated_requests parameter. If this parameter is set to false, requests that don’t use a template will return a 400 Bad Request response.

Something wrong?

Help us make these docs great!

Kong Developer docs are open source. If you find these useful and want to make them better, contribute today!