Another class action lawsuit—Cruz v. Fireflies.AI Corp.—puts a spotlight on potential legal risks associated with AI meeting assistants. The complaint alleges that the Fireflies tool records, analyzes, transcribes, and stores voices of meeting participants, including voices of those who are not Fireflies users, without the notice, written consent, and retention safeguards required by Illinois’ Biometric Information Privacy Act (“BIPA”). It points to Fireflies’ “speaker recognition” functionality and contends the product creates and retains voiceprints—covered under BIPA—while lacking a publicly available retention and destruction policy and failing to inform meeting participants about biometric collection. This case is a useful preview of issues regulators and plaintiffs may raise around AI transcription and summarization. We have previously covered other lawsuits involving these issues here.

Why it matters: AI note‑takers can deliver productivity gains, but they also introduce a host of potential legal and operational risks. For example:

  • More data to turn over in litigation: recordings and summaries can become discoverable;
  • Sensitive or privileged conversations may be saved and shared when they otherwise would not be;
  • AI can make mistakes (e.g., misidentify speakers or over‑simplify what was said);
  • Notice and consent rules vary, especially for outside guests or cross‑border participants;
  • Retention must align with legal holds so routine deletion is not viewed as destroying evidence;
  • If the AI tool provider uses recordings for its own purposes without proper notice and consent, it may violate wiretapping statutes and privacy laws such as the California Consumer Privacy Act (CCPA).

Putting it into Practice: This case is a reminder that deploying AI transcription and summarization tools requires more than turning on a feature. Think beyond generic disclosures. For example, conduct vendor due diligence, avoid using the tool in sensitive meetings, clearly notify participants and obtain consent at the start, consider human review and access restrictions where appropriate, and set clear retention periods aligned with legal holds. To help ensure your employees are aware of the range of issues with AI note-takers, it is important to address this in your AI policy. For more information on these issues, see "'Listen Up' if Your AI Policy Does Not Cover AI Recording Issues – Another Class Action Lawsuit Filed Over Third Party AI Recording Service."

If you are evaluating the use of an AI note-taker or considering refining your AI policy, reach out to your Sheppard Mullin contact to discuss practical guardrails that fit your organization.

Brittany Walter

Brittany Walter is an associate in the Intellectual Property Practice Group in the firm’s San Francisco office.

James Gatto

Jim Gatto is a partner in the Intellectual Property Practice Group in the firm’s Washington, D.C. office. He is Leader of the Blockchain Technology and Digital Assets Team and Social Media and Games Team. He is also Leader of the firm’s Open Source Team.