Ollama is a local runtime, not a service
Ollama is an open-source runtime that downloads pre-trained large language models and serves them through an HTTP endpoint on localhost (port 11434 by default). There is no Ollama-operated inference cloud, so the "data retention" framing that applies to OpenAI, Anthropic, or Google does not apply in the same way.
When an application like Meetily calls Ollama for a summary, the request goes to a process on the same machine. No prompts or completions cross the network boundary unless the surrounding application explicitly forwards them.
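To make that concrete, here is a minimal sketch of calling Ollama's generate endpoint over loopback from Python. The model name and prompt are illustrative, and the request assumes a default install listening on port 11434; nothing in it reaches beyond the local machine.

```python
import json
import urllib.request

# Ollama's default local endpoint; the request never leaves the machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama3.2",   # illustrative; assumes this model is already pulled
    "prompt": "Summarize: the meeting covered Q3 hiring and budget.",
    "stream": False,       # ask for a single JSON response instead of a stream
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
    print(body["response"])  # the generated completion text
```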
Retention, training, and ZDR
- Retention: None by default. Ollama itself does not persist prompts or completions to disk beyond the running context.
- Training: Ollama does not train models. It serves models that were trained elsewhere by their respective publishers (Meta, Mistral AI, Alibaba, etc.).
- Zero data retention: Guaranteed by construction rather than by policy. Inference is local, so off-device transmission is zero.
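The "by construction" guarantee holds only as long as the server stays on loopback. Ollama binds to 127.0.0.1:11434 by default, but the OLLAMA_HOST environment variable can rebind it to other interfaces, so a deployment worth auditing can be sanity-checked with a sketch like this one (the port and timeout are assumptions):

```python
import socket

def is_listening(host: str, port: int = 11434, timeout: float = 1.0) -> bool:
    """Return True if something accepts TCP connections at host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Expected for a default install: reachable on loopback only. If the
# endpoint also answers on a LAN address, OLLAMA_HOST was likely set to
# 0.0.0.0 and the "nothing leaves the device" claim needs a second look.
print("loopback:", is_listening("127.0.0.1"))
```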
What still leaves your machine
Running Ollama does not automatically prevent every cloud round-trip. Things that may still touch the network include:
- The initial model download (`ollama pull <model>`), which fetches weights from `registry.ollama.ai` over HTTPS (see the sketch after this list).
- Application-level telemetry from whatever calls Ollama (your responsibility to audit).
- Any external integrations the surrounding app makes (cloud storage backups, CRM sync, etc.).
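As a rough illustration of the first item, the sketch below drives the one-time pull through Ollama's HTTP API and then lists what is already on disk. The model name is illustrative; everything after the pull completes works with the network unplugged.

```python
import json
import urllib.request

BASE = "http://localhost:11434"

def pull(model: str) -> None:
    """One-time download: the only step here that touches the network,
    fetching weights from registry.ollama.ai over HTTPS."""
    req = urllib.request.Request(
        f"{BASE}/api/pull",
        data=json.dumps({"model": model, "stream": False}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp).get("status"))  # "success" when complete

def local_models() -> list[str]:
    """Everything listed here is already on disk; no network needed."""
    with urllib.request.urlopen(f"{BASE}/api/tags") as resp:
        return [m["name"] for m in json.load(resp)["models"]]

pull("llama3.2")       # network: once
print(local_models())  # offline-safe from here on
```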
Why this is the preferred Meetily path
Meetily's transcription path is local by default. Pairing it with Ollama for summaries keeps the entire pipeline (audio capture, transcription, summarization) on the device. This is the simplest answer to "does the summarization step have a retention policy I need to read?" because the answer becomes "no, there is no remote service in the loop."
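A hypothetical glue function shows what that pairing could look like: a finished transcript handed to a local Ollama instance through its chat API. The function name, model choice, and system prompt are all assumptions for illustration, not Meetily's actual code.

```python
import json
import urllib.request

def summarize(transcript: str, model: str = "llama3.2") -> str:
    """Send a completed transcript to local Ollama and return the summary.
    Hypothetical glue code; model name and prompt are illustrative."""
    payload = {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "Summarize the meeting transcript in five bullet points."},
            {"role": "user", "content": transcript},
        ],
        "stream": False,  # single JSON response
    }
    req = urllib.request.Request(
        "http://localhost:11434/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]

print(summarize("Alice: let's move the launch to May. Bob: agreed."))
```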
For organizations subject to data-residency or processor-disclosure obligations, this path removes a class of compliance questions entirely. For everyone else, it is the lowest-friction way to get a private summary without managing API keys or reading vendor policies.