CFPB Warns Employers Regarding FCRA Rules for AI-Driven Worker Surveillance

By A.J. Dhaliwal, James Gatto, Mehul Madia & Maxwell Earp-Thomas on October 25, 2024

On October 24, the CFPB issued Circular 2024-06, warning companies that use third-party consumer reports, particularly surveillance-based “black box” or AI-generated algorithmic scores, that they must comply with the Fair Credit Reporting Act (FCRA) with respect to their workers’ personal data. The guidance adds to a growing body of law protecting employees from potentially harmful uses of AI.

The CFPB guidance notes employers’ growing use of third-party consumer reports in employment decisions. These reports now go beyond traditional background checks and may include monitoring employee behavior through apps or other data sources, expanding the breadth and depth of worker tracking. 

The Bureau states that background dossiers or reports compiled about consumers from databases of public records, employment history, or collective-bargaining activity, as well as reports that assess a consumer’s risk level or performance or contain other information about a consumer, are “consumer reports” under the FCRA. Accordingly, employers that use such reports, both during hiring and for subsequent employment purposes, must comply with FCRA obligations. These include obtaining a worker’s permission before procuring a consumer report, providing notices both before and upon taking an adverse action, and using consumer reports only for the permissible purposes described in the FCRA.

The Bureau identified examples of the reports offered by background screening companies and consumer reporting agencies. It notes that some employers now use third parties to monitor workers’ sales interactions, track workers’ driving habits, measure the time workers take to complete tasks, record the number of messages workers send, and calculate time spent off-task by documenting web browsing, taking screenshots of computers, and measuring keystroke frequency. In some cases, this information may be sold by “consumer reporting agencies” to prospective or current employers, implicating the FCRA. In addition, some companies may analyze worker data to provide employers with reports containing assessments or scores of worker productivity or risk.

The CFPB circular highlights several key protections FCRA provides regarding the use of third-party consumer reports by employers:

  • Consent: Employers must obtain workers’ consent before purchasing these reports, ensuring that employees know how their personal data will be used.
  • Transparency: Employers must give workers detailed information when a report leads to an adverse action, such as a firing, demotion, or denial of promotion. More specifically, under Section 604(b), employers must provide workers with notice and a copy of the report before taking the adverse action.
  • Disputes: If workers dispute information in a report, companies must correct or delete inaccurate or unverifiable data so that workers are not unfairly penalized.
  • Limits: Employers may use the reports only for permissible purposes and may not repurpose the data for unrelated activities, such as marketing, or sell it.

“Consumer reporting agencies” that provide reports for employment purposes are also subject to additional obligations. For instance, upon request by a worker, consumer reporting agencies must disclose the identity of any party who has used the worker’s consumer report for employment purposes in the two-year period preceding the date the request is made, which is longer than the one-year period used for other purposes.

One of the first laws to protect employees from harmful uses of AI was passed in New York City. NYC’s Automated Employment Decision Tools (AEDT) law took effect on January 1, 2023, and set new standards for employers using AI tools in employment decisions (for more, see here). Since then, other laws have been enacted, and many more are pending, to protect employees from potential AI harms. The EEOC has issued guidance as part of its Artificial Intelligence and Algorithmic Fairness Initiative, including a technical assistance document on these issues, and has pursued enforcement against an entity whose AI hiring tool produced a discriminatory result.

Putting It Into Practice: While only a statement of policy, the Bureau’s circular follows on the heels of its joint hearing with the Department of Labor on worker tracking, as well as the White House’s National Security Memorandum on AI, which addresses some of the technology’s potentially harmful effects. Federal and state regulators have made no secret that protecting consumers from potentially harmful implementations of the technology is a high priority (previously discussed here, here, here, and here). Notably, the Bureau released this circular while we patiently await its revised rulemaking under the Fair Credit Reporting Act (expected this fall).

A.J. Dhaliwal

A.J. is an associate in the Finance and Bankruptcy Practice Group in the firm’s Washington, D.C. office.

James Gatto

Jim Gatto is a partner in the Intellectual Property Practice Group in the firm’s Washington, D.C. office. He is Leader of the Blockchain Technology and Digital Assets Team and Social Media and Games Team. He is also Leader of the firm’s Open Source Team.
Mehul Madia

Mehul Madia, special counsel in the firm’s Washington, D.C. office, provides deep consumer finance and fintech expertise to clients, leveraging more than 15 years of public- and private-sector experience.

Maxwell Earp-Thomas

Maxwell Earp-Thomas is an associate in the Corporate and Real Estate, Energy, Land Use and Environmental Practice Groups in the firm’s Orange County office.

  • Posted in:
    Financial
  • Blog:
    Consumer Finance and Fintech Blog
  • Organization:
    Sheppard, Mullin, Richter & Hampton LLP
