# Supported model providers

The supported model providers are listed below, grouped into cloud-based and local options, with links to each provider's official documentation:

## Cloud-based providers

{% hint style="warning" %}
Using these services involves transmitting data to their servers. Please review each provider’s privacy policy to understand how your data is handled.
{% endhint %}

* Anthropic — <https://docs.anthropic.com/en/home>
* Google AI Gemini — <https://ai.google.dev/gemini-api/docs>
* Mistral AI — <https://docs.mistral.ai/>
* OpenAI — <https://platform.openai.com/docs/>

## Local providers

{% hint style="info" %}
For `Local` or `Custom` models, communication uses the [standard OpenAI API schema](https://platform.openai.com/docs/api-reference/chat).
{% endhint %}

* Ollama — <https://github.com/ollama/ollama/tree/main/docs>
* Local / Custom
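
Because `Local` and `Custom` models speak the standard OpenAI chat-completions schema, a request body has the same shape regardless of which local server you run. The sketch below only builds and serializes such a body; the base URL and model name are placeholder assumptions, not values this page prescribes (for example, Ollama exposes an OpenAI-compatible endpoint under `/v1`):

```python
import json

# Hypothetical local endpoint -- substitute whatever your server exposes.
base_url = "http://localhost:11434/v1"

# Minimal chat-completion request body per the standard OpenAI API schema.
# "llama3" is a placeholder model name, not a requirement of this page.
payload = {
    "model": "llama3",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
}

body = json.dumps(payload)
```

You would POST `body` to `{base_url}/chat/completions` with a `Content-Type: application/json` header; consult your local server's documentation for authentication and supported parameters.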
