A new lawsuit filed in Illinois federal court is shining a spotlight on the legal risks tied to AI-powered meeting assistants that offer transcription and speaker identification services on platforms like Zoom and Microsoft Teams. The complaint, brought by Illinois resident Katelin Cruz, alleges that California-based tech company Fireflies.AI Corp. is illegally harvesting and storing individuals’ biometric voice data without their knowledge or consent.
According to Cruz’s complaint, Fireflies.AI’s meeting assistant allegedly records, analyzes, transcribes, and stores the unique vocal characteristics (i.e., “voiceprints”) of every meeting participant. This includes people who never created a Fireflies account, never agreed to its terms of service, and never gave written consent for their biometric data to be collected.
Cruz argues that voiceprints are sensitive biometric identifiers, often used to authenticate access to personal or financial information. If compromised, they pose a significant risk for identity theft and fraud.
Fireflies.AI promotes its software as having “speaker recognition” capabilities; in other words, the tech can distinguish between different people speaking in a meeting by analyzing each participant’s unique voice. The AI assistant often joins meetings automatically when enabled by the host, running in the background and attributing statements to each speaker.
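For readers wondering how this kind of speaker identification generally works: the common approach is to reduce each speaker's audio to a fixed-length numerical "voiceprint" (an embedding), then attribute new speech segments to whichever known voiceprint they most closely resemble. The sketch below is a generic, simplified illustration of that matching step, using made-up stand-in vectors; it is not Fireflies.AI's actual code or algorithm.

```python
# Generic illustration of voiceprint-based speaker attribution.
# In real systems, embeddings come from a speaker-embedding model run on audio;
# the vectors below are hypothetical stand-ins for illustration only.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two voice embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def attribute_segment(segment_embedding: np.ndarray,
                      enrolled: dict[str, np.ndarray],
                      threshold: float = 0.75) -> str:
    """Return the best-matching speaker label, or 'unknown speaker'
    if no enrolled voiceprint is similar enough."""
    best_label, best_score = "unknown speaker", threshold
    for label, voiceprint in enrolled.items():
        score = cosine_similarity(segment_embedding, voiceprint)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Hypothetical voiceprints for two meeting participants.
enrolled = {
    "Speaker A": np.array([0.9, 0.1, 0.3]),
    "Speaker B": np.array([0.2, 0.8, 0.5]),
}
print(attribute_segment(np.array([0.88, 0.12, 0.28]), enrolled))  # -> Speaker A
```

The point of the lawsuit is that building and storing these per-person voiceprints, under Illinois law, may count as collecting biometric identifiers, which triggers notice and consent requirements regardless of how the matching is implemented.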
According to the complaint:
- Fireflies’ privacy policy allows it to collect and process “meeting data” and “derivatives,” and store meeting recordings in its systems.
- The company says it will delete personal information only when it deems the information "no longer needed," or when a user makes a specific deletion request, but it does not provide a clear retention timeline.
Cruz recounts that during a November virtual meeting, her voice was recorded and analyzed even though she never consented, never created an account, and never interacted with Fireflies.AI in any way. The software nonetheless built a profile based on her voice and attributed statements to her in the meeting transcript.
She also points out that Fireflies' terms of service, by their own wording, apply only to registered users or those who click "I Agree." Unregistered meeting participants are never asked for written consent, nor are they notified that their voiceprints might be collected.
Additionally, Cruz alleges that Fireflies.AI does not publish a policy regarding how long it retains biometric data or when it destroys old data, which is a clear requirement under Illinois’ Biometric Information Privacy Act (BIPA).
Cruz is seeking to bring her case as a class action, potentially representing anyone whose biometric speaker data was collected by Fireflies.AI without consent. She’s asking the court to award statutory damages, injunctive relief, and other costs, highlighting the broader impact on consumer privacy rights in the age of AI-driven services.
Illinois’ BIPA is among the strictest biometric privacy laws in the United States, requiring companies to provide written notice and obtain explicit written consent before collecting, storing, or using biometric identifiers such as fingerprints, retinal scans, or voiceprints. It also requires a publicly available policy on data retention and destruction.
This case could have far-reaching consequences not only for Fireflies.AI, but for any provider of AI meeting tools that records and processes personal voice data, especially if it does so without clear disclosure and user consent.
As remote work and virtual meetings remain the norm, companies offering smart meeting assistants will face growing scrutiny over privacy compliance. The outcome of this lawsuit may set an important precedent for protecting biometric privacy in digital workspaces. If you're concerned about your own privacy during virtual meetings, it's worth asking meeting organizers which third-party tools are in use, what data those tools capture, and how long recordings and transcripts are kept.