Lawsuit Alleges Fireflies.AI Corp. Illegally Collects Biometric Data from Virtual Meetings

By Kathryn Rattigan on December 24, 2025

A new lawsuit filed in Illinois federal court is shining a spotlight on the legal risks tied to AI-powered meeting assistants that offer transcription and speaker identification services on platforms like Zoom and Microsoft Teams. The complaint, brought by Illinois resident Katelin Cruz, alleges that California-based tech company Fireflies.AI Corp. is illegally harvesting and storing individuals’ biometric voice data without their knowledge or consent.

According to Cruz’s complaint, Fireflies.AI’s meeting assistant allegedly records, analyzes, transcribes, and stores the unique vocal characteristics (i.e., “voiceprints”) of every meeting participant. This includes people who never created a Fireflies account, never agreed to its terms of service, and never gave written consent for their biometric data to be collected.

Cruz argues that voiceprints are sensitive biometric identifiers, often used to authenticate access to personal or financial information. If compromised, they pose a significant risk for identity theft and fraud.

Fireflies.AI promotes its software as having “speaker recognition” capabilities; in other words, the tech can distinguish between different people speaking in a meeting by analyzing each participant’s unique voice. The AI assistant often joins meetings automatically when enabled by the host, running in the background and attributing statements to each speaker.

According to the complaint:

  • Fireflies’ privacy policy allows it to collect and process “meeting data” and “derivatives,” and store meeting recordings in its systems.
  • The company says it will delete personal information only when it deems the data “no longer needed” or when a user makes a specific request, but it does not provide a clear retention timeline.

Cruz recounts that during a November virtual meeting, her voice was recorded and analyzed even though she never consented, created an account, or otherwise interacted with Fireflies.AI in any way. Yet the software still built a profile based on her voice and attributed statements to her in the meeting transcript.

She also points out that Fireflies’ terms of service theoretically apply only to registered users or those who click “I Agree.” Unregistered meeting participants are never asked for written consent, nor are they notified that their voiceprints might be collected.

Additionally, Cruz alleges that Fireflies.AI does not publish a policy stating how long it retains biometric data or when it destroys old data, even though such a policy is an express requirement under Illinois’ Biometric Information Privacy Act (BIPA).

Cruz is seeking to bring her case as a class action, potentially representing anyone whose biometric speaker data was collected by Fireflies.AI without consent. She is asking the court for statutory damages, injunctive relief, and costs, highlighting the broader impact on consumer privacy rights in the age of AI-driven services.

Illinois’ BIPA is among the strictest biometric privacy laws in the United States, requiring written notification and explicit consent before any company can collect, store, or use biometric data such as fingerprints, retinal scans, or voiceprints. It also mandates public policies on data retention and destruction.

This case could have far-reaching consequences not only for Fireflies.AI, but for any provider of AI meeting tools that records and processes personal voice data, especially if it does so without clear disclosure and user consent.

As remote work and virtual meetings remain the norm, companies offering smart meeting assistants will face growing scrutiny over privacy compliance. The outcome of this lawsuit may set an important precedent for protecting biometric privacy in digital workspaces. For those concerned about their own privacy during virtual meetings, it’s wise to ask meeting organizers about any third-party tools in use, and whether your data is truly protected.


Kathryn Rattigan is a member of the Business Litigation Group and the Data Privacy and Security Team. She concentrates her practice on privacy and security compliance under both state and federal regulations and advises clients on website and mobile app privacy and security compliance. Kathryn helps clients review, revise, and implement necessary policies and procedures under the Health Insurance Portability and Accountability Act (HIPAA). She also provides clients with the information needed to effectively and efficiently handle potential and confirmed data breaches, including insight into federal notification requirements and assessments under state breach notification laws. Prior to joining the firm, Kathryn was an associate at Nixon Peabody. She earned her J.D., cum laude, from Roger Williams University School of Law and her B.A., magna cum laude, from Stonehill College. She is admitted to practice law in Massachusetts and Rhode Island. Read her full rc.com bio here.

  • Posted in:
    Intellectual Property
  • Blog:
    Data Privacy + Cybersecurity Insider
  • Organization:
    Robinson & Cole LLP
