The Northern District of California issued an eagerly awaited decision last month in Mobley v. Workday, Inc., in which a job applicant claims that Workday’s artificial intelligence (AI) job applicant screening tools violate federal and California anti-discrimination laws. Workday moved to dismiss the claims on the basis that it is not a covered employer under any of the applicable anti-discrimination laws, and the Court granted the motion in part and denied it in part. Specifically, the Court dismissed the plaintiff’s intentional discrimination claims and an aiding and abetting claim under California’s anti-discrimination law, but allowed disparate impact discrimination claims to proceed against Workday on the theory that Workday was an “agent” of the employers to which the job applicant applied.
Below is a summary of the case, the legal questions at issue, the Court’s analysis, and implications for employers and vendors using automated employment decision tools.
What is the case about?
Plaintiff Derek Mobley alleges that algorithm-based applicant screening tools developed by Workday—a cloud-based software vendor that provides global human resources services to employers—discriminated against him and other job applicants based on race, age, and disability. According to his complaint, the tools screen job applicants for Workday’s customers using AI, machine learning, and pymetrics to make decisions on whether candidates move forward in the application and hiring process. Mobley, an African American man over the age of forty with anxiety and depression, claims that he applied to over 100 jobs with companies that use Workday’s screening tools and that he did not receive a single job offer.
Mobley asserted federal law claims under Title VII, the Civil Rights Act of 1866 (“Section 1981”), the Age Discrimination in Employment Act of 1967 (“ADEA”), and the Americans with Disabilities Act (“ADA”) for disparate impact and intentional discrimination. He also asserted a claim for aiding and abetting race, disability, and age discrimination under California’s Fair Employment and Housing Act (“FEHA”).
What legal issues were involved?
The foremost issue raised in the motion to dismiss was whether Workday, as a third-party software vendor for numerous employers, is subject to the anti-discrimination statutes with respect to its customers’ job applicants and employees. Title VII, the ADEA, and the ADA prohibit discrimination only by “employers” and “employment agencies.” The definition of “employers,” however, has also been extended to include “indirect employers” and “agents of employers.”
In January, the Court granted Workday’s first motion to dismiss Mobley’s initial complaint, which was premised on an “employment agency” theory of liability. The Court found that the initial complaint failed to show that Workday qualified as an “employment agency” subject to the anti-discrimination laws, but allowed Mobley to amend his complaint to correct the deficiency.
After Mobley’s initial complaint was dismissed with leave to amend, he filed an amended complaint alleging that Workday was liable for discrimination under three theories: as (1) an employment agency, (2) an agent of employers, and (3) an indirect employer. In its second motion to dismiss, Workday challenged each of these theories, arguing again that, as a software vendor, it is not a covered entity under the anti-discrimination statutes except with respect to its own employees. Workday further argued that Mobley’s amended complaint failed to plausibly allege the prima facie elements of his disparate impact and intentional discrimination claims.
What did the Court rule?
The Court partially denied Workday’s motion to dismiss, finding that Mobley’s claims could proceed against Workday under an “agency” theory of liability and that the amended complaint plausibly alleged the necessary elements for Mobley’s disparate impact discrimination claims.
The Court partially granted Workday’s motion insofar as Mobley’s claims were based on an “employment agency” theory of liability (which differs from the “agency” theory of liability), and dismissed Mobley’s intentional discrimination claims and his aiding and abetting claim under the FEHA.
What is the difference between “employment agency” and “agency” theories of liability under Title VII, the ADEA, and the ADA?
If you’re confused by the language referenced above in the Court’s ruling, you are not alone.
Employment Agency Theory
An “employment agency” refers to a traditional staffing company that procures job opportunities for employees to work with a third-party company.
The Court agreed with Workday that Mobley’s claims could not proceed under the “employment agency” theory of liability, finding that the amended complaint did not plausibly allege that Workday is an “employment agency” as that term is defined in the anti-discrimination statutes at issue. Specifically, Mobley had not alleged that Workday “procured” job opportunities for employees or that Workday brought job listings to the attention of job applicants seeking employment. To the contrary, Mobley alleged that he found job postings on his own and did not begin interacting with Workday until after he applied to a specific posting. According to the Court, Mobley’s allegation that Workday “screens” applicants using discriminatory algorithmic tools was not sufficient to allege that Workday “procures” employees (i.e., finds candidates for employers) within the meaning of the “employment agency” definition. The Court therefore dismissed Mobley’s claims to the extent they were premised on Workday’s liability as an “employment agency.”
Agency Theory
An “agent” of an employer can be anyone who participates in the decision-making process that forms the basis of a discriminatory action – including third-party entities like Workday.
The Court held that Mobley’s claims against Workday plausibly allege an “agency theory” of liability. In its reasoning, the Court cited prior caselaw holding an employer’s agent independently liable when performing traditional employment functions (like hiring) delegated to it by the employer. Workday countered at oral argument that such caselaw had been rejected in the Ninth Circuit, but because this argument was not raised in Workday’s briefing, the Court deemed it waived.
Even if the argument had not been waived, the Court stated that Workday’s position was contrary to the plain language of the anti-discrimination laws at issue, which clearly apply to “any agent” of an employer. On this point, the Court rejected Workday’s argument that Mobley’s agency theory of liability rendered the “employment agency” provisions superfluous. According to the Court, liability between an employment agency and an agent is not “coextensive” because “employment agencies face a different set of restrictions than employers: employment agencies are liable when they ‘fail or refuse to refer’ individuals for consideration by employers on prohibited bases, but they are not subject to the prohibitions applicable to employers in carrying out their traditional functions, such as hiring, discharging, compensating, or promoting employees.” The Court also rejected Workday’s argument that the agency theory of liability had been foreclosed by Ninth Circuit precedent.
In sum, the Court held that “a third-party agent may be liable as an employer where the agent has been delegated functions traditionally exercised by an employer.” The Court then analyzed the factual allegations in Mobley’s amended complaint and found it to plausibly allege that Workday’s customers had delegated traditional hiring functions (like rejecting applicants) to Workday’s algorithmic decision-making tool, making Workday an “agent” of those employers. With respect to the involvement of AI in Workday’s tool, the Court noted that nothing in the anti-discrimination statutes or relevant caselaw makes a distinction between delegating functions to an automated agent versus a human. “To the contrary, courts applying the agency exception have uniformly focused on the ‘function’ that the principal has delegated to the agent, not the manner in which the agent carries out the delegated function.” Because the Court concluded that the “agency theory” of liability applied to Mobley’s claims, it did not address Mobley’s alternative argument that Workday was liable under an “indirect employer theory.”
How can an AI vendor be liable for discrimination without intent?
When someone thinks of employment discrimination, they typically think of intentional discrimination on the basis of a protected characteristic (age, race, gender, disability, etc.). But AI vendors can be liable for discrimination – whether intentional or not – as this Mobley decision shows.
Disparate Impact Claims
One way to establish liability under federal and state employment discrimination laws is by showing that the company’s decision-making disproportionately impacts individuals in a particular demographic. Intent is not a relevant factor to this analysis.
The Court allowed Mobley’s disparate impact discrimination claims to proceed, finding that the amended complaint pled a prima facie case. Taking Mobley’s allegations as true, the Court held that he plausibly alleged the necessary elements of disparate impact discrimination: (1) a significant disparate impact on a protected class or group; (2) the specific employment practices or selection criteria at issue; and (3) a causal relationship between the challenged practices or criteria and the disparate impact. Workday challenged all three elements in its motion, but the Court rejected each challenge.
On the specific-employment-practice element, the Court found that Mobley had adequately pled one by alleging that Workday’s algorithmic screening tools relied on biased training data as well as pymetrics and personality tests on which applicants with mental health and cognitive disorders performed more poorly. While these screening tools varied based on customer hiring preferences, the Court held that this variation was not enough to dismiss Mobley’s claim because he had identified a common component that allegedly discriminated against applicants based on protected characteristics.
On the disparity element, the Court found a reasonable inference of disparity based on Mobley’s allegation that he had applied to over 100 positions and had been rejected from every single one because of Workday’s allegedly biased screening tools. Although Mobley did not allege who was hired instead of him for these positions, the Court stated that this omission was not fatal to his claims at the pleading stage given the nature of his allegations.
On the causation element, the Court held that Mobley plausibly alleged that the disparity was caused by Workday’s algorithmic screening tools based, again, on the “sheer number of rejections” he received, combined with allegations that Workday’s AI tool relied on biased training data. For the Court, these allegations supported a plausible inference that “Workday’s screening algorithms were automatically rejecting Mobley’s applications based on a factor other than his qualifications, such as a protected trait.” For these reasons, Mobley’s disparate impact discrimination claims were allowed to proceed in the litigation.
Intentional Discrimination Claims
Turning to Mobley’s intentional discrimination claims, the Court granted Workday’s motion to dismiss them. While the Court found that Mobley had shown he was qualified for the jobs from which he was rejected and had disclosed his protected characteristics, it concluded that he failed to adequately allege that Workday actually intended its screening tools to be discriminatory. According to the Court, it was not enough for Mobley to allege that Workday was merely aware of its applicant screening tools’ discriminatory effects; he was required to show that Workday intended those effects, which his amended complaint failed to do. Therefore, the Court dismissed Mobley’s intentional discrimination claims under Title VII, the ADEA, and Section 1981.
California FEHA Aiding and Abetting Claim
Finally, the Court also granted Workday’s motion to dismiss Mobley’s aiding and abetting claim under California’s FEHA. However, the Court allowed Mobley to further amend his complaint to reassert the claim.
To plead an aiding and abetting claim under the FEHA, Mobley was required to allege that (1) his employer subjected him to discrimination; (2) the alleged aider and abettor (Workday) knew that the employer’s conduct violated the FEHA; and (3) Workday gave the employer substantial assistance or encouragement to violate the FEHA. The Court found that Mobley failed to satisfy the first two elements because he did not allege that any of the companies to which he applied discriminated against him, or that Workday knew that these employers’ conduct was discriminatory. Instead, the amended complaint focused on Workday’s alleged conduct, but it was unclear from the allegations how Workday aided and abetted another company’s discrimination.
The Court therefore dismissed Mobley’s FEHA claim but granted him leave to reassert the claim in another amended complaint. The Court directed that any reassertion of this claim “must identify the companies/employers that allegedly discriminated against Mobley, and allege specifically what the employers’ discriminatory conduct was, how Workday knew of the discriminatory conduct, and how Workday either assisted or encouraged the discriminatory conduct.”
What are the implications for employers and vendors?
Employers and vendors using automated employment decision tools should take note of this decision, as it provides a blueprint for plaintiffs’ attorneys to file similar claims. While the decision may be appealed and courts in other jurisdictions may resolve these issues differently, it highlights the legal risks for both employers and their vendors who use or provide AI-based tools to screen job applicants and make other employment-related decisions. Indeed, the decision opens the door to potential lawsuits based on nothing more than a job applicant receiving a rejection notice from an employer that uses an AI-based applicant screening tool.
The Equal Employment Opportunity Commission (“EEOC”) is also likely to increase investigations into AI-based discrimination claims, similar to those raised by Mobley, and has released guidance to employers on how to use AI in the employment-decision-making process. While this Mobley decision does not address whether an employer can be liable under anti-discrimination laws for the algorithmic decision-making tools used by their vendors (or “agents”), the EEOC guidance makes clear that employers can be liable. The EEOC states that employers can violate Title VII if such tools result in adverse impacts on protected classes, even where those tools are designed or administered by third parties.
Given the legal risks developing around these new AI-powered tools, employers should assess the tools they currently use, along with the contract terms under which those tools are provided. The tools should undergo regular audits or impact assessments to test for algorithmic discrimination; such audits and assessments are already required in New York City and will soon be required in Colorado. Any contracts or terms of service entered into by employers with their vendors should also be closely scrutinized, as they are likely to contain indemnification or liability-shifting provisions that may become crucial when defending against algorithmic discrimination claims.
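To make the audit recommendation concrete, below is a minimal illustrative sketch, in Python, of the kind of selection-rate comparison a bias audit or impact assessment typically involves: computing each demographic group’s selection rate and its impact ratio relative to the most-selected group, with the EEOC’s longstanding “four-fifths” guideline used as a rough flagging threshold. The records, group labels, and threshold here are hypothetical, the sketch is not drawn from the Mobley decision or any particular vendor’s tool, and a real audit requires far more rigorous statistical and legal analysis.

```python
# Illustrative sketch only: selection rates and impact ratios by demographic
# group for a screening tool's outcomes. All data and group labels are
# hypothetical; the 0.8 cutoff reflects the EEOC's "four-fifths" rule of thumb.
from collections import defaultdict

# Hypothetical audit records: (demographic_group, advanced_past_screen)
records = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

# Tally total applicants and advancements per group.
totals, advanced = defaultdict(int), defaultdict(int)
for group, passed in records:
    totals[group] += 1
    advanced[group] += int(passed)

# Selection rate = share of each group's applicants who advanced past the screen.
selection_rates = {g: advanced[g] / totals[g] for g in totals}

# Impact ratio = each group's selection rate divided by the highest group's rate.
best_rate = max(selection_rates.values())
for group, rate in sorted(selection_rates.items()):
    impact_ratio = rate / best_rate
    flag = "REVIEW" if impact_ratio < 0.8 else "ok"  # four-fifths guideline
    print(f"{group}: selection rate {rate:.2f}, impact ratio {impact_ratio:.2f} [{flag}]")
```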