LLM Data Retention & Privacy by Provider 2026

Vendor-verified policy summaries for every AI provider Meetily can connect to via BYOK (bring your own key), plus the local Ollama path that keeps summaries on-device.

Each page documents the vendor's default retention, zero-data-retention (ZDR) availability, training defaults, and how Meetily routes traffic through that provider.

Providers

Local · May 11, 2026

Ollama

Ollama runs LLMs entirely on your local machine, so there is no remote retention period, no training on your prompts, and no policy URL beyond the open-source r...

Read the Ollama summary →
Cloud · May 11, 2026

Anthropic Claude

Anthropic's commercial terms commit that prompts and outputs from the Claude API are not used to train models by default. Verified policy summary plus how to ro...

Read the Anthropic Claude summary →
Cloud · May 11, 2026

Azure OpenAI

Azure OpenAI commits that prompts, completions, and training data are not used to train OpenAI's or Microsoft's models, with a 30-day abuse-monitoring window an...

Read the Azure OpenAI summary →
Cloud · May 11, 2026

Google Gemini

Google's paid Gemini API and Vertex AI commit not to train on customer prompts and responses. Free-tier Google AI Studio operates under different terms. Verifie...

Read the Google Gemini summary →
Cloud · May 11, 2026

Groq Cloud

Groq Cloud's Data Processing Addendum commits not to train on customer information and processes API data only to provide the service. Verified policy summary p...

Read the Groq Cloud summary →
Cloud · May 11, 2026

Mistral La Plateforme

Mistral La Plateforme operates from EU infrastructure and offers contractual no-training guarantees through its Data Processing Agreement. Verified policy summa...

Read the Mistral La Plateforme summary →
Cloud · May 11, 2026

OpenAI

OpenAI's API does not train on customer data by default and retains inputs and outputs for a short abuse-monitoring window. Zero data retention is available for...

Read the OpenAI summary →
Cloud · May 11, 2026

OpenRouter

OpenRouter is a routing layer, so retention and training behavior depend on the downstream provider you route to. Verified policy summary plus how to route Open...

Read the OpenRouter summary →

Use any of these providers in Meetily via BYOK

Meetily transcription is always 100% local. For summaries, bring your own key for any cloud provider above, or run summaries locally with Ollama and keep the entire pipeline on-device.
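To make the fully on-device path concrete, here is a minimal sketch of a local summarization call against Ollama's HTTP API, which listens on `localhost:11434` by default so no transcript data leaves the machine. This is an illustrative sketch, not Meetily's actual integration code; the model name `llama3.1` and the prompt wording are assumptions you would adjust to your setup.

```python
import json
from urllib import request

# Ollama's default local endpoint: requests never leave this machine.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_summary_request(transcript: str, model: str = "llama3.1") -> bytes:
    """Build the JSON body for Ollama's /api/generate endpoint.

    The model name is illustrative; use whichever model you have pulled.
    """
    payload = {
        "model": model,
        "prompt": f"Summarize this meeting transcript:\n\n{transcript}",
        "stream": False,  # return one complete JSON response
    }
    return json.dumps(payload).encode("utf-8")


def summarize_locally(transcript: str) -> str:
    """POST the transcript to the local Ollama server and return the summary."""
    req = request.Request(
        OLLAMA_URL,
        data=build_summary_request(transcript),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Swapping in a cloud provider via BYOK changes only the endpoint, headers, and request shape; the transcript itself is then subject to that vendor's retention policy summarized above.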