Are AI Meeting Assistants Safe? Privacy Risks with Cloud Transcription


TL;DR

Are AI meeting assistants safe? Not all of them. In 2025, Otter.ai faced a class action lawsuit for recording without consent, Fireflies.ai was sued for biometric data collection, and Chapman University banned Read AI over security concerns. Cloud meeting tools pose real privacy risks: data stored on third-party servers, AI training on your conversations, and recordings without proper consent. For truly safe meeting transcription, local processing (like Meetily) keeps your data on your device.

Over 62% of organizations now use AI meeting assistants. The market has exploded to $3.5 billion in 2025 and is projected to reach $34 billion by 2035.

But as adoption grows, so do the lawsuits, university bans, and privacy scandals.

If you're using cloud-based meeting AI, you need to understand the risks. And if you're evaluating options, this guide will help you separate the safe tools from the problematic ones.

The Wake-Up Call: 2025's AI Meeting Assistant Lawsuits

Otter.ai Class Action (August 2025)

In August 2025, Justin Brewer filed a class action lawsuit against Otter.ai in California federal court. The allegations are serious:

"The company's AI-powered transcription service... by default does not ask meeting attendees for permission to record and fails to alert participants that recordings are shared with Otter to improve its artificial intelligence systems."

In short, the complaint alleges that Otter recorded conversations without consent from all parties and used those recordings to train its AI without permission.

The case (Brewer v. Otter.ai Inc., Case No. 5:25-cv-06911) is still pending, with Otter's response deadline extended to November 2025.

California Is a Two-Party Consent State

California law requires consent from ALL parties before recording a conversation. Otter's approach of seeking consent only from the meeting host, not from all participants, directly conflicts with this requirement.

Fireflies.ai Lawsuit (December 2025)

Just months later, Fireflies.ai faced its own legal challenge. Illinois resident Katelin Cruz filed a complaint alleging:

"Fireflies.AI's meeting assistant records, analyzes, transcribes, and stores the unique vocal characteristics (i.e., 'voiceprints') of every meeting participant... including people who never created a Fireflies account, never agreed to its terms of service, and never gave written consent."

The lawsuit alleges violations of Illinois' Biometric Information Privacy Act (BIPA), which requires written consent before collecting voiceprints and a published retention and destruction policy for biometric data.

Chapman University Bans Read AI (August 2025)

Academic institutions are taking action too. In August 2025, Chapman University's Information Systems department issued a security notice:

"After investigation, the use of Read AI is prohibited due to security, privacy, and institutional data risks."

The university cited concerns about:

  • Data leaving institutional control
  • Lack of adequate data protection
  • Security vulnerabilities in free AI tools

This wasn't an isolated decision. Organizations across healthcare, education, and government are increasingly restricting cloud-based meeting AI.

Why Cloud Meeting AI Poses Privacy Risks

The lawsuits above aren't random; they reflect systemic problems with how cloud meeting tools handle your data.

1. Your Conversations Train Their AI

Most cloud meeting tools use your transcripts to improve their models. From Otter.ai's own documentation:

"Otter uses a proprietary method to de-identify user data before training our models."

Even "de-identified" data creates risks. The Otter.ai lawsuit alleges(opens in new tab) that participants weren't informed their conversations would train AI-a significant consent violation.

2. Data Stored on Third-Party Servers

When you use cloud meeting AI, your recordings live on someone else's servers:

| Provider | Data Location | Your Control |
|-----------|---------------|--------------|
| Otter.ai | AWS (US) | Limited |
| Fireflies | Cloud (US) | Limited |
| Read AI | Cloud | Limited |
| Meetily | Your device | Complete |

This matters because a breach of the provider's servers can expose your meetings, the provider's retention policies (not yours) determine how long recordings persist, and you must rely on policy promises rather than technical guarantees.

3. Bots That Auto-Join Meetings

A particularly concerning pattern: cloud meeting tools that automatically join meetings without explicit permission from all participants.

From user complaints about Fireflies:

"By default, Fireflies joins every calendar event and shares meeting notes with all attendees... Trying to shut it up/switch it off is like trying to remove a deer tick from your leg."

This isn't just annoying; it potentially violates wiretapping laws in the 12 states that require all-party consent (a minimal programmatic check follows the list):

  • California
  • Florida
  • Illinois
  • Maryland
  • Massachusetts
  • Michigan
  • Montana
  • New Hampshire
  • Pennsylvania
  • Washington
  • Connecticut (civil liability)
  • Delaware
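
If you build automated recording workflows, you might encode this list directly. Below is a minimal Python sketch, not legal advice (statutes change, and Connecticut's all-party rule is a civil provision), that gates auto-recording on the strictest applicable rule:

```python
# Minimal sketch: gate auto-recording on all-party-consent states.
# NOT legal advice; this set mirrors the list above and may change.
ALL_PARTY_CONSENT_STATES = {
    "CA", "FL", "IL", "MD", "MA", "MI",
    "MT", "NH", "PA", "WA", "CT", "DE",  # CT: civil liability
}

def requires_all_party_consent(state_code: str) -> bool:
    """True if every participant must consent before recording."""
    return state_code.strip().upper() in ALL_PARTY_CONSENT_STATES

# Example: refuse to auto-record unless everyone has opted in.
if requires_all_party_consent("CA"):
    print("All-party consent required: prompt every participant first.")
```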

4. Shadow AI Is Exploding

Gartner predicts 40% of data breaches will involve "shadow AI" by 2027: unauthorized AI tools used by employees without IT approval.

Meeting AI is particularly vulnerable because:

  • Employees install browser extensions without oversight
  • Calendar integrations automatically activate
  • IT may not know which meetings are being recorded

40% of organizations have already experienced an AI-related privacy incident, and 97% of breached organizations lacked adequate AI access controls.

Cloud vs. Local: A Security Comparison

| Feature | What It Means | Local AI (Meetily) | Cloud AI (Otter, Fireflies) |
|---------|---------------|--------------------|------------------------------|
| Data Location | Where recordings are stored | Your device only | Third-party servers |
| AI Training | Your data used to improve models | Never | Yes (de-identified) |
| Consent Required | Who must approve recording | You decide | Host only (often) |
| Breach Exposure | Risk if provider is hacked | None (no cloud) | High |
| Offline Capability | Works without internet | Full | None |
| Compliance Control | HIPAA/GDPR by design | Complete | Limited |
| Audit Transparency | Can verify what happens to data | Open source | Proprietary |

Who Should Avoid Cloud Meeting AI?

Based on the legal landscape and compliance requirements, these organizations should think twice about cloud-based meeting transcription:

Healthcare Organizations (HIPAA)

Meeting recordings often contain Protected Health Information (PHI). While Otter.ai achieved HIPAA compliance in July 2025 and Fireflies launched "Fireflies for Healthcare", these require:

  • Enterprise plans ($25-35+/user/month)
  • Business Associate Agreements (BAAs)
  • Trust in third-party data handling

Local processing eliminates third-party risk entirely: no BAA needed when PHI never leaves your device.

Law Firms (Attorney-Client Privilege)

Legal experts warn that allowing meeting AI vendors to access transcripts could waive attorney-client privilege:

"Allowing note-taker vendors to access or use transcripts for their own purposes could provide grounds for a waiver of privilege, which is problematic in the event of a legal dispute."

Financial Services (SOX, PCI, SEC)

Financial regulations require strict data controls and audit trails. Cloud tools create audit complexity that local processing avoids.

Education (FERPA, Institutional Policies)

Following Chapman University's lead, many institutions are implementing AI tool review processes. Cloud meeting AI often fails these reviews.

European Organizations (GDPR)

Fireflies processes data on US servers, creating GDPR transfer issues. The GDPR requires adequate protection for data leaving the EU; US cloud services face ongoing legal uncertainty here.

GDPR Tip: Consent Isn't Always Enough

Even with consent, GDPR's data minimization principle may conflict with meeting AI that records, stores, and analyzes everything. Local processing with user-controlled retention is the safest approach.
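
One practical consequence: retention should be enforced by you, not a vendor. Here is a minimal Python sketch of user-controlled retention; the transcript folder and 30-day window are hypothetical, so point it at wherever your tool actually stores files:

```python
# Hypothetical user-controlled retention: delete local transcripts
# older than RETENTION_DAYS. Adjust the folder to your tool's storage.
import time
from pathlib import Path

RETENTION_DAYS = 30
cutoff = time.time() - RETENTION_DAYS * 86_400  # seconds per day

for f in Path.home().joinpath("meeting-transcripts").glob("*.txt"):
    if f.stat().st_mtime < cutoff:
        f.unlink()  # permanent local deletion, no vendor involved
        print(f"Deleted {f.name} (older than {RETENTION_DAYS} days)")
```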

How to Evaluate Meeting AI Security

Before choosing a meeting assistant, ask these questions:

1. Where Does Data Go?

  • Best: Stays on your device (local processing)
  • Acceptable: Your company's cloud (self-hosted)
  • Risky: Vendor's cloud servers
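
You can also verify this empirically instead of trusting documentation. A rough sketch using psutil (pip install psutil) lists a desktop app's outbound connections; the process name below is hypothetical, and Process.connections() was renamed net_connections() in psutil 6.0:

```python
# Rough check: is a desktop meeting app opening outbound connections?
# Run while the app is active; requires psutil (pip install psutil).
import psutil

APP_NAME = "example-meeting-app"  # hypothetical; use your app's process name

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == APP_NAME:
        for conn in proc.connections(kind="inet"):  # net_connections() in psutil >= 6.0
            if conn.raddr:  # a remote address means traffic is leaving your device
                print(f"{APP_NAME} -> {conn.raddr.ip}:{conn.raddr.port}")
```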

2. Is It Bot-Free?

Visible meeting bots create friction and often trigger the consent issues at the heart of current lawsuits. Bot-free options:

  • Meetily - Captures system audio locally
  • Granola - Mac-native, bot-free (but cloud processing)
  • Platform-native - Zoom AI, Microsoft Copilot (no external bot)

3. What's the Consent Model?

  • Does it require consent from ALL participants?
  • Can it auto-join without explicit approval?
  • What happens when someone says "don't record"?

4. Is the Code Auditable?

Open source tools let you verify exactly what happens to your data. Proprietary tools require trust.

| Tool | Open Source | Auditable |
|------|-------------|-----------|
| Meetily | Yes (MIT) | Full |
| Otter.ai | No | No |
| Fireflies | No | No |
| Whisper (model) | Yes | Full |

5. What's the Data Retention Policy?

The Fireflies lawsuit specifically cites lack of published retention/destruction policies as a BIPA violation.

The Local-First Alternative

If the risks above concern you, local AI processing solves them architecturally: not through policy promises, but through technical design.

How local meeting AI works:

  1. Audio captured from your device's system audio
  2. Transcription runs on your CPU/GPU using models like OpenAI Whisper
  3. Summaries generated locally (or with your chosen API)
  4. All data stays on your machine

What this means for privacy:

  • No server breach can expose your meetings (data isn't there)
  • No consent ambiguity (you control the recording)
  • No AI training on your conversations (data never leaves)
  • Complete GDPR/HIPAA compliance (no third-party processors)
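
To make step 2 concrete, here is a minimal local transcription sketch using the open-source openai-whisper package (pip install openai-whisper, with ffmpeg on your PATH); the filenames are illustrative, and nothing leaves your machine:

```python
# Minimal local-only transcription with open-source Whisper.
# Requires: pip install openai-whisper, plus ffmpeg installed locally.
import whisper

model = whisper.load_model("base")        # downloaded once, then runs locally
result = model.transcribe("meeting.wav")  # inference on your own CPU/GPU
print(result["text"])                     # transcript stays in local memory

# Persist locally; retention and deletion remain under your control.
with open("meeting-transcript.txt", "w") as f:
    f.write(result["text"])
```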

Try Local Meeting AI

Meetily is free, open source, and processes everything on your device. No cloud, no bots, no compromise.

Download Free

Best Practices for Any Meeting AI

Whether you choose cloud or local, follow these guidelines:

For Organizations

  1. Create an approved tools list - Don't let employees install random meeting AI
  2. Review privacy policies - Especially data training and retention clauses
  3. Establish consent protocols - How do you notify participants?
  4. Consider jurisdiction - 12 states require all-party consent
  5. Audit regularly - What AI tools have calendar access?

For Individuals

  1. Read the privacy policy - Especially the AI training section
  2. Check auto-join settings - Disable if possible
  3. Announce recordings - Even if the tool doesn't
  4. Review data retention - Delete recordings you don't need
  5. Consider local options - For sensitive conversations

For Meetings with External Parties

  1. Always disclose AI recording - Legal and ethical requirement
  2. Offer to disable for sensitive topics - Build trust
  3. Don't rely on bot detection - Some tools don't make bots obvious
  4. Get explicit consent - Verbal isn't enough in some states

Frequently Asked Questions

Are AI meeting assistants safe?

It depends on the tool. Cloud-based AI meeting assistants like Otter.ai, Fireflies, and Read AI have faced lawsuits and institutional bans over privacy concerns. Local AI tools like Meetily that process data on your device are architecturally safer because your data never leaves your control.

Is Otter.ai safe to use?

Otter.ai is currently facing a class action lawsuit (Brewer v. Otter.ai, August 2025) alleging recording without consent and using data to train AI without permission. Otter has SOC 2 certification and achieved HIPAA compliance in July 2025, but its consent model, where the bot auto-joins and seeks host consent only, remains legally contested in all-party consent states.

Is Fireflies.ai safe to use?

Fireflies.ai faces a December 2025 lawsuit alleging illegal biometric data collection under Illinois BIPA. Users have reported the bot continuing to join meetings even after account deletion. The tool processes all data on cloud servers, creating exposure if Fireflies is breached.

Why did Chapman University ban Read AI?

Chapman University banned Read AI in August 2025, citing security, privacy, and institutional data risks. The tool had concerning behaviors like auto-joining meetings and following users across platforms. Universities must protect student data under FERPA and institutional policies that cloud AI tools often violate.

What is the safest AI meeting assistant?

The safest AI meeting assistants use local processing, meaning data never leaves your device. Meetily is the leading option: it's open source (MIT license), bot-free, works offline, and keeps all data on your machine. For cloud tools, platform-native options (Microsoft Copilot, Zoom AI) avoid third-party data exposure but still use cloud processing.

Can an AI meeting assistant violate recording laws?

Potentially. 12 US states require all-party consent before recording: California, Florida, Illinois, Maryland, Massachusetts, Michigan, Montana, New Hampshire, Pennsylvania, Washington, Delaware, and Connecticut. Cloud meeting tools that auto-join and seek only host consent may violate these laws, which is exactly what the Otter.ai lawsuit alleges.

Can my employer record meetings with AI?

It depends on state law and company policy. In one-party consent states, employers can record if they participate. In two-party consent states, all participants must be informed. Either way, most workplace policies require disclosure. AI meeting tools should announce their presence, but not all do so clearly.

How can I protect my privacy in AI-recorded meetings?

Ask if the meeting is being recorded and by what tool. Request that recording be turned off for sensitive topics. In sensitive industries, advocate for local AI tools that don't upload data to third parties. Review your organization's approved tools list; shadow AI is a major breach risk.

Key Takeaways

  1. Cloud meeting AI faces serious legal challenges: Otter.ai and Fireflies.ai both face 2025 lawsuits over consent and data practices
  2. Chapman University banned Read AI over privacy concerns; more institutional bans are likely
  3. 12 US states require all-party consent for recording, creating legal risk for auto-join meeting bots
  4. 40% of organizations have experienced AI-related privacy incidents; 97% lacked adequate controls
  5. Local AI processing (like Meetily) eliminates third-party data exposure by keeping everything on your device
  6. For HIPAA, GDPR, and legal compliance, local processing is architecturally safer than cloud promises
  7. Always disclose AI recording to meeting participants; it's both legally and ethically required

Conclusion

The AI meeting assistant market is at an inflection point. Rapid adoption (62% of organizations) has outpaced privacy protections, leading to the lawsuits and bans of 2025.

The pattern is clear:

  • Cloud tools that auto-join meetings are legally vulnerable
  • Data used for AI training without explicit consent violates privacy expectations
  • Institutions are starting to ban tools that can't prove adequate data protection

If your meetings contain sensitive information (client discussions, health data, legal strategy, financial details), cloud meeting AI creates unnecessary risk.

Local processing isn't just a privacy preference; it's becoming a compliance requirement.

Tools like Meetily that keep data on your device solve these problems architecturally. No cloud breach can expose your meetings. No ambiguous consent model. No third-party AI training on your conversations.

The lawsuits will be decided in court. But you don't have to wait to protect your organization's most sensitive conversations.


For more on meeting privacy, see our guides on GDPR-compliant meeting transcription and HIPAA meeting requirements.

Privacy-First Meeting AI

Meetily is free, open source, and keeps your meeting data where it belongs: on your device.

Download Free

About the Author


Meetily Security Team

We build privacy-first AI tools for professionals who need meeting intelligence without compromising data sovereignty. Meetily is open source, HIPAA/GDPR compliant by design, and trusted by privacy-conscious organizations worldwide.
