Cloud · Last verified May 11, 2026

Azure OpenAI data retention policy

Azure OpenAI commits that prompts, completions, and training data are not used to train OpenAI's or Microsoft's models, with a 30-day abuse-monitoring window and modified-abuse-monitoring eligibility for sensitive workloads. Verified policy summary plus how to route Azure OpenAI through Meetily as a BYOK summary provider.

Quick policy snapshot

  • Default retention: 30 days (abuse-monitoring window)
  • Zero data retention available: Yes (via modified abuse monitoring)
  • Trains on API customer data by default: No

Azure OpenAI's training default

Microsoft's Azure OpenAI data privacy documentation is unusually clear-cut for a cloud LLM service. Customer prompts, completions, embeddings, and uploaded training data are:

  • Not available to other Azure customers.
  • Not available to OpenAI or other Azure Direct Model providers.
  • Not used by model providers to improve their models or services.
  • Not used to train any generative AI foundation models without your permission or instruction.
  • Not used to improve Microsoft or third-party products or services without your permission or instruction.

Fine-tuned Azure OpenAI models are available exclusively for the customer that created them. The inference models are stateless, so prompts and completions are not stored in the model.

When Meetily users select an Azure OpenAI deployment as their summary provider via BYOK, the request hits your Azure-hosted endpoint under your Azure subscription. The defaults on this page apply.
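The BYOK flow above can be sketched in a few lines. This is a minimal, illustrative sketch, not Meetily's actual implementation: the resource name, deployment name, and API version below are placeholders you would replace with your own, and the request shape follows Azure OpenAI's standard chat-completions REST endpoint.

```python
import json
import os
import urllib.request

# Placeholder values -- substitute your own resource, deployment, and key.
RESOURCE = "my-meetily-summaries"   # your Azure OpenAI resource name (assumption)
DEPLOYMENT = "gpt-4o-summaries"     # your model deployment name (assumption)
API_VERSION = "2024-06-01"          # an API version your resource supports

def build_request(transcript: str, api_key: str) -> urllib.request.Request:
    """Build a chat-completions request against your own Azure-hosted endpoint.

    Only transcript text is sent over TLS; audio never leaves the device.
    """
    url = (
        f"https://{RESOURCE}.openai.azure.com/openai/deployments/"
        f"{DEPLOYMENT}/chat/completions?api-version={API_VERSION}"
    )
    body = {
        "messages": [
            {"role": "system", "content": "Summarize this meeting transcript."},
            {"role": "user", "content": transcript},
        ]
    }
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={"api-key": api_key, "Content-Type": "application/json"},
        method="POST",
    )

req = build_request(
    "Alice: shipping slips to Friday. Bob: agreed.",
    os.environ.get("AZURE_OPENAI_KEY", "<key>"),
)
# The request targets your subscription's endpoint, so your deployment type,
# region, and abuse-monitoring configuration govern the call.
```

Because the endpoint lives under your subscription, everything on this page about retention, residency, and abuse monitoring applies to the resource you point this at.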

Retention

Azure OpenAI runs an abuse-monitoring system that may sample a subset of prompts and completions for review when its automated systems flag potentially abusive content. Sampled data is stored in a per-resource abuse-monitoring data store, isolated by customer, with a 30-day retention window. Human reviewers (authorized Microsoft employees, located in the EEA for EEA-deployed resources) access this data only under strict request-ID-based queries with just-in-time approval.

Stateful features (Responses API, Assistants Threads, Stored completions, file uploads, fine-tuning) carry their own retention semantics that you opt into when you use those features. Data stored for those features lives at rest in the Foundry resource in your Azure tenant, in the same geography as the resource, encrypted with AES-256 and optionally with a customer-managed key.

Zero data retention (modified abuse monitoring)

Azure OpenAI offers a documented path to disable abuse-monitoring data storage entirely, called modified abuse monitoring. Eligible customers apply through a Microsoft form, and approval is granted for sensitive use cases where the 30-day monitoring store would be incompatible with the customer's compliance posture.

After approval:

  • Prompts and completions are not stored for human review.
  • Automated review may still run at request time without storing the data.
  • You can verify the off state via the Azure portal JSON view or Azure CLI: the ContentLogging capability appears as false only when monitoring storage is disabled.
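The verification step in the last bullet can be scripted. A hedged sketch, assuming the resource JSON exposes a `properties.capabilities` array containing a `ContentLogging` entry (the shape shown in the portal's JSON view); the resource and group names passed to the Azure CLI are placeholders:

```python
import json
import subprocess

def content_logging_disabled(account_json: dict) -> bool:
    """Return True when the ContentLogging capability reports "false",
    i.e. abuse-monitoring storage is off for this resource."""
    caps = account_json.get("properties", {}).get("capabilities", [])
    return any(
        c.get("name") == "ContentLogging" and str(c.get("value")).lower() == "false"
        for c in caps
    )

def check_resource(name: str, resource_group: str) -> bool:
    """Fetch the resource JSON via the Azure CLI and inspect it.
    Requires a prior `az login`; arguments are placeholders."""
    raw = subprocess.check_output(
        ["az", "cognitiveservices", "account", "show",
         "-n", name, "-g", resource_group],
    )
    return content_logging_disabled(json.loads(raw))

# Illustrative JSON shape after modified abuse monitoring is approved:
sample = {"properties": {"capabilities": [{"name": "ContentLogging", "value": "false"}]}}
print(content_logging_disabled(sample))  # True -> monitoring storage is disabled
```

A check like this can run in CI or a compliance audit script, so the off state is re-verified continuously rather than once at approval time.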

This is one of the most explicit and customer-verifiable ZDR paths in the cloud LLM market.

Geographic processing

Azure OpenAI offers three deployment types with distinct processing-location semantics:

  • Standard: Prompts and completions processed within the customer-specified geography.
  • DataZone (US or EU): Processed within the named data zone; data at rest stays in the customer-designated geography.
  • Global: Processed in any geography where the model is deployed; data at rest stays in the customer-designated geography.

For EU customers needing strict EU residency, Standard or DataZone EU deployments keep both processing and storage within Europe, and reviewers for any abuse-monitoring traffic are EEA-based.
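The three deployment types can be summarized as a small lookup for residency planning. This is an illustrative sketch of the semantics described above, not an Azure SDK structure; the SKU names are assumptions based on the values the Azure CLI accepts for `--sku-name`:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DeploymentType:
    sku_name: str          # value passed as --sku-name to the Azure CLI (assumption)
    processing_scope: str  # where prompts/completions may be processed
    at_rest_scope: str     # where stored data stays

DEPLOYMENT_TYPES = {
    "standard": DeploymentType(
        "Standard", "customer-specified geography", "customer-designated geography"),
    "datazone-eu": DeploymentType(
        "DataZoneStandard", "EU data zone", "customer-designated geography"),
    "global": DeploymentType(
        "GlobalStandard", "any geography hosting the model", "customer-designated geography"),
}

def pick_for_strict_eu_residency() -> list[str]:
    """Deployment types whose processing stays within an EU-compatible scope."""
    return [k for k, v in DEPLOYMENT_TYPES.items()
            if v.processing_scope in ("customer-specified geography", "EU data zone")]

print(pick_for_strict_eu_residency())  # ['standard', 'datazone-eu']
```

Note that data at rest stays in the customer-designated geography for all three types; only the processing scope widens for Global deployments.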

How Meetily uses Azure OpenAI

Meetily routes Azure OpenAI traffic through your own API key against your Azure-hosted endpoint. Your subscription's deployment type, region, and abuse-monitoring configuration apply. The transcript text is sent over TLS to your Azure OpenAI endpoint for summarization, and the response is returned to Meetily and stored locally on your device. Audio is never transmitted to Azure at any point.

For Meetily users in regulated industries (healthcare, legal, financial services), the combination of Azure OpenAI's modified abuse monitoring plus Meetily's local transcription is one of the strongest cloud-summary paths available without going fully local. For zero retention by construction, switch to local Ollama and keep the entire pipeline on-device.

Last verified: May 11, 2026. Policy source: Azure OpenAI policy

Frequently asked questions

Does Azure OpenAI use my prompts or completions to train models?
No. Microsoft's data privacy documentation for Azure OpenAI (Azure Direct Models in Microsoft Foundry) commits that prompts, completions, embeddings, and training data are not available to OpenAI or other model providers and are not used to train any foundation models without your explicit permission or instruction. Customer Data is also not used to improve Microsoft or third-party products without your permission.
What is the default retention for Azure OpenAI traffic?
Prompts and completions are not stored by the inference model itself (the models are stateless). For abuse monitoring, Microsoft retains a sampled subset of prompts and completions for up to 30 days in an abuse-monitoring data store, accessible only by authorized Microsoft reviewers under strict access controls. Customers approved for modified abuse monitoring do not have data stored in this monitoring path.
Is zero data retention available for Azure OpenAI?
Yes, via the modified abuse monitoring program. Eligible customers can apply to disable abuse-monitoring data storage entirely, which removes the 30-day retention window. Approval is gated by Microsoft's review process for sensitive use cases. Customers can verify the off state via the Azure portal JSON view or Azure CLI, looking for ContentLogging set to false.
Where is Azure OpenAI data stored geographically?
Azure OpenAI processes data in the customer-specified geography by default for Standard deployments. Global and DataZone deployment types broaden the processing region (Global: any geography where the model is deployed; DataZone US or EU: any geography within that data zone). Data stored at rest, including the abuse-monitoring store, remains in the customer-designated geography even for Global and DataZone deployments. EU-deployed resources keep human reviewers within the European Economic Area.
Does Azure OpenAI support HIPAA, SOC 2, and ISO 27001?
Yes. Azure OpenAI falls under the broader Microsoft Azure compliance umbrella. HIPAA BAA, SOC 2 Type 2, ISO 27001, FedRAMP, and other certifications are available; consult learn.microsoft.com/azure/compliance for the current scope and any service-specific exclusions.
How do I request modified abuse monitoring (ZDR)?
Microsoft publishes an application form linked from the abuse monitoring documentation. Approval is required for use cases involving sensitive or regulated data where the abuse-monitoring data store would be incompatible with your compliance posture. After approval, you can verify the off state via the Azure portal.
How does Meetily handle Azure OpenAI when I pick it as my summary provider?
Meetily transcription is always 100% local. When you select an Azure OpenAI deployment as your summary provider via BYOK, Meetily routes transcript text (not audio) to your Azure OpenAI endpoint using your own API key. The retention and training defaults on this page apply to your Azure subscription.
Why is Azure OpenAI a strong cloud option for enterprise Meetily users?
Three reasons: (1) explicit no-training default backed by Microsoft's Data Protection Addendum, (2) region pinning for data residency, (3) modified abuse monitoring to disable the 30-day retention window for approved sensitive workloads. Pair with Meetily's local transcription for a local-then-cloud pipeline where only the summary step touches Azure.

Use Azure OpenAI with Meetily, on your terms

Meetily transcription stays 100% local. For summaries, bring your own Azure OpenAI key (BYOK) so the data path matches the policy you just read, or pick a local model if you want zero retention by construction.