Set up with Mistral (v3.8+)

Enable AI Semantic Caching with Mistral as your LLM and a Redis vector database.

Environment variables

  • MISTRAL_API_KEY: Your Mistral API key
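For example, you can export the key in your shell before applying the configuration (the value shown is a placeholder for your own key):

```sh
# Make the Mistral API key available to your configuration tooling
export MISTRAL_API_KEY='<your-mistral-api-key>'
```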

Set up the plugin
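Below is a minimal sketch of what the AI Semantic Cache plugin configuration might look like in declarative YAML. The `mistral-embed` model, the 1024-dimension vector size, the 0.1 distance threshold, and the Redis host and port are illustrative assumptions; check the AI Semantic Cache plugin reference for the exact schema and defaults before using it.

```yaml
# Declarative snippet enabling AI Semantic Caching with Mistral embeddings
# and Redis as the vector database (values are placeholders).
plugins:
  - name: ai-semantic-cache
    config:
      embeddings:
        auth:
          header_name: Authorization
          header_value: Bearer <MISTRAL_API_KEY>    # substitute your Mistral API key
        model:
          provider: mistral
          name: mistral-embed                       # assumed embedding model
          options:
            upstream_url: https://api.mistral.ai/v1/embeddings
      vectordb:
        strategy: redis
        dimensions: 1024            # mistral-embed returns 1024-dimensional vectors
        distance_metric: cosine
        threshold: 0.1              # maximum distance for a semantic cache hit
        redis:
          host: redis-stack-server  # placeholder Redis host
          port: 6379
```

Once applied, requests to the configured LLM route are embedded, compared against previously cached responses in Redis, and served from the cache when a semantically similar prompt is found.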
