While federal regulatory agencies retreat from enforcing the disparate impact theory of discrimination, at least one state agency has stepped forward. On July 10, 2025, Massachusetts Attorney General Andrea Joy Campbell announced a settlement with a student loan company, resolving allegations that the company’s artificial intelligence (“AI”) underwriting models resulted in unlawful disparate impact based on race and immigration status.
The disparate impact theory of discrimination in the lending context has been controversial. Ten years have passed since the Supreme Court held in Inclusive Communities that disparate impact claims are cognizable under the Fair Housing Act, provided the plaintiff identifies a specific policy of the defendant that caused the disparity. In the fair lending context, then, disparate impact applies to mortgage loans. For other types of consumer credit, however, such as auto loans or student loans, a plaintiff or government enforcer claiming discrimination would need to rely on the Equal Credit Opportunity Act (“ECOA”). ECOA prohibits discrimination against an applicant with respect to any aspect of a credit transaction, but there has been much debate over whether it reaches discrimination in the form of disparate impact.

The federal government has for years relied heavily on ECOA to bring credit discrimination actions, and the Biden Administration pursued a vigorous redlining initiative against mortgage lenders. The government used the vast amount of data obtained under the Home Mortgage Disclosure Act (“HMDA”) to compare the activities of various lenders within a geographic area and determine whether a lender was significantly lagging its peers in making loans to certain protected groups. The government then looked to the lender’s branch locations, advertising strategies, the racial and ethnic make-up of its loan officers, and other factors to assert that the lender had discouraged loan applicants from protected classes. Through that redlining initiative, the government settled dozens of cases, resulting in well over $100 million in payments.
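As a rough illustration of the kind of peer comparison described above, the sketch below screens lenders whose share of loans to protected-class applicants falls well below the area-wide benchmark. The data, column names, and the “significantly lagging” threshold are hypothetical assumptions for illustration only, not the government’s actual methodology.

```python
# Hypothetical sketch of an HMDA-style peer comparison (illustrative only).
import pandas as pd

# Toy HMDA-like data: one row per originated loan.
loans = pd.DataFrame({
    "lender":          ["A", "A", "A", "A", "B", "B", "B", "B", "C", "C", "C", "C"],
    "msa":             ["Boston"] * 12,                        # geographic area
    "protected_group": [0, 0, 0, 0, 1, 0, 1, 1, 1, 0, 1, 0],   # 1 = protected class
})

# Share of each lender's originations going to protected-class applicants.
lender_share = loans.groupby("lender")["protected_group"].mean()

# Benchmark: the area-wide share across all lenders in the same area.
area_share = loans["protected_group"].mean()

# Flag lenders whose share is well below the benchmark (threshold is assumed).
lagging = lender_share[lender_share < 0.5 * area_share]
print(lender_share.round(2).to_dict(), "| area benchmark:", round(area_share, 2))
print("Possible redlining screen hits:", list(lagging.index))
```

In an actual investigation, a screen like this would be only a starting point; as noted above, the government also examined branch locations, advertising, and staffing before asserting discouragement of protected-class applicants.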
HMDA provides extensive, if imperfect, demographic data on mortgage lending activity and has been key to building claims of lending discrimination, particularly disparate impact claims. That level of data, however, is not generally available for other types of lending, such as student loans. Without such data, the Office of the Massachusetts Attorney General (“OAG”) in this case reviewed the lender’s algorithmic rules, its use of judgmental discretion in the loan approval process, and internal communications that the Attorney General described as exhibiting bias.
Disparate Impact Based on Race, National Origin
In that review, the OAG looked back to the scoring model the lender used prior to 2017, which relied in part on a Cohort Default Rate, the average rate of loan defaults associated with a specific higher education institution. The OAG asserted that the use of that factor in the lender’s underwriting model resulted in a disparate impact in approval rates and loan terms, disfavoring Black and Hispanic applicants in violation of ECOA and the state’s prohibition against unfair or deceptive acts or practices (“UDAP”). The public settlement order did not disclose the magnitude of the statistical disparities. In addition, the OAG asserted that, until 2023, the lender also included immigration status in its algorithm, automatically denying applicants who lacked a green card. According to the OAG, that factor “created a risk of a disparate outcome against applicants on the basis of national origin” and as such violated ECOA and UDAP. The settlement order prohibits the lender from using the Cohort Default Rate or the green-card knock-out rule (although it appears the lender had discontinued both practices years ago).
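To make the disparate impact concept concrete, here is a minimal, hypothetical sketch of a screen on approval rates using the common “four-fifths” adverse impact ratio benchmark. The figures and group labels are invented assumptions; as noted above, the settlement order does not disclose the actual disparities the OAG found, and nothing here reflects the lender’s data or the OAG’s analysis.

```python
# Hypothetical adverse impact ratio check on approval rates (illustrative only).

# Approval outcomes by group: (approved, total applications) -- invented figures.
outcomes = {
    "White":    (850, 1000),
    "Black":    (620, 1000),
    "Hispanic": (640, 1000),
}

# Approval rate for each group.
rates = {group: approved / total for group, (approved, total) in outcomes.items()}

# Use the highest-approved group as the comparator.
benchmark = max(rates.values())

# Adverse impact ratio: each group's rate relative to the benchmark;
# ratios below 0.80 are commonly treated as evidence of possible disparate impact.
for group, rate in rates.items():
    air = rate / benchmark
    flag = "below 0.80 threshold" if air < 0.80 else "ok"
    print(f"{group}: approval {rate:.0%}, impact ratio {air:.2f} ({flag})")
```

A screen like this identifies a disparity; under the framework discussed above, a claimant would still need to tie that disparity to a specific policy, such as the Cohort Default Rate factor, and the lender could respond with a legitimate business justification.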