A pair of Black women who claim they were unfairly denied housing in Massachusetts due to an algorithm used to screen prospective tenants are engaged in a lawsuit challenging both the company responsible for the formula and what they deemed a “discriminatory” practice that keeps Black and Hispanic applicants from securing homes.
With the case well underway in U.S. District Court in Boston, the Justice Department on Monday filed an amicus brief noting its interest in the case and arguing that the federal Fair Housing Act applies to the companies whose algorithms are used to determine if tenants qualify for housing.
“Algorithms are written by people. As such, they are susceptible to all of the biases, implicit or explicit, of the people that create them,” U.S. Attorney Rachael Rollins said in a statement. “Today’s filing recognizes that our 20th-century civil rights laws apply to 21st-century innovations.”
Plaintiffs Mary Louis, of Malden, Monica Douglas, of Canton, and a Somerville nonprofit filed the lawsuit in May 2022 against SafeRent Solutions. Both women pay more than half their rent using housing vouchers, according to court documents.
They claimed to have been denied rental housing due to their "SafeRent Scores," ratings calculated by SafeRent's algorithm-based screening system. The lawsuit claimed the algorithm relies on factors that place Black and Hispanic applicants at an unfair disadvantage, including credit history and debts unrelated to past housing. The algorithm does not factor in housing vouchers, the plaintiffs said.
“SafeRent assigns disproportionately lower SafeRent Scores to Black and Hispanic rental applicants compared to white rental applicants,” their lawsuit said. It added: “No business necessity can justify that disparate impact on individuals whose source of income includes a housing voucher.”
On Monday, Rollins’ office filed a Statement of Interest in the case, explaining that it believed the Fair Housing Act applied to companies providing residential screening services, such as SafeRent. With SafeRent moving to dismiss the case, the Justice Department said it disagreed with the company’s interpretation of the Fair Housing Act.
“The United States has a strong interest in ensuring the correct interpretation and application of the FHA’s pleading standard for disparate impact claims, including where the use of algorithms may perpetuate housing discrimination,” the Justice Department filing read.
“Tenant screening policies are not exempt from the Fair Housing Act’s protections just because decisions are made by algorithm,” Damon Smith, general counsel for the federal Department of Housing and Urban Development, said in a statement. “Housing providers and tenant screening companies must ensure that all policies that exclude people from housing opportunities, whether based on algorithm or otherwise, do not have an unjustified disparate impact because of race, national origin or another protected characteristic.”