5 Takeaways from the Privacy Center’s Community Teach-in on Algorithmic Housing Discrimination

Troy and Monique Murphy, a Black couple, resided in the Navy Yard neighborhood of DC for 7 years. After passing the annual recertification required to live in their affordable housing unit for 6 years, they were suddenly denied without explanation, despite there having been no meaningful changes in their lives. A computer algorithm, operating in a discriminatory manner, may have been responsible. Tenants throughout DC may be experiencing similar algorithmic discrimination as you read this.

On December 15, 2023, the Center on Privacy & Technology at Georgetown Law hosted a virtual teach-in on algorithmic housing discrimination in DC. The event gathered residents, researchers, advocates, and community leaders to learn together about the issue. It also featured a panel, moderated by the Privacy Center’s Senior Associate Cynthia Khoo, that included the Murphys; Upturn’s Project Director Natasha Duarte; Equal Rights Center’s (ERC’s) Senior Fair Housing Rights Manager Susan (“Susie”) McClannahan; and Georgetown University Law Center 3L Wanqian (“Wan”) Zhang.

Here are 5 key takeaways from the event.

1. Algorithms Used for Housing Decisions Can Reinforce Historical Inequities at Scale.

At the most basic level, an algorithm is a set of instructions. It is like a recipe, where the outcome changes depending on what you put in. When baking a cake, instructions can be helpful. But “when the ingredients are things like systemic racism and historical redlining,” Khoo explained at the event, the result is what is known as algorithmic discrimination.

Today, algorithms are used throughout the rental housing industry. For example, landlords use algorithmic tools marketed by third-party companies to decide whether to accept or reject a rental application. These computer algorithms often rely on information that reflects historical inequities, such as applicants’ rental and eviction histories, criminal records, and credit reports. Because Black and low-income individuals are disproportionately targeted for wrongful eviction filings and police surveillance, and because historical redlining has left their communities among the most unbanked and underbanked in the US, these applicants are more likely to be flagged as unfavorable tenants and denied housing opportunities. Computer algorithms thus have the potential to replicate those inequities at an enormous scale.
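To make the “recipe” idea concrete, here is a deliberately simplified, hypothetical scoring sketch in Python. It is not any actual vendor’s product, and the point weights and threshold are invented for illustration. It shows how a screening rule that mechanically consumes biased inputs (eviction filings, criminal records, credit history) reproduces those inequities in its output, and applies the same rule to every application it processes.

```python
# Hypothetical illustration only: a simplified, made-up screening "recipe."
# This is NOT any real vendor's algorithm; the weights and threshold are invented.
# The point is that biased inputs flow straight through to biased outputs.

from dataclasses import dataclass


@dataclass
class Applicant:
    eviction_filings: int   # includes wrongful or dismissed filings that still appear in records
    criminal_records: int   # reflects disparities in policing and prosecution
    credit_score: int       # 0 means no score, common for unbanked or underbanked applicants


def screening_decision(applicant: Applicant, threshold: int = 600) -> str:
    """Return 'accept' or 'deny' based on a simple points recipe."""
    score = applicant.credit_score
    score -= 150 * applicant.eviction_filings   # any filing counts, even if the tenant won
    score -= 100 * applicant.criminal_records   # any record counts, regardless of relevance
    return "accept" if score >= threshold else "deny"


# Two applicants equally able to pay rent: the one whose record reflects
# over-policing, a wrongful eviction filing, and no credit history is
# automatically denied, and this rule is applied to every application at scale.
print(screening_decision(Applicant(eviction_filings=0, criminal_records=0, credit_score=700)))  # accept
print(screening_decision(Applicant(eviction_filings=1, criminal_records=1, credit_score=0)))    # deny
```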

2. Third-Party Computer Algorithms “Remove the Face” of the Entity Making Critical Decisions About Housing.

When the Murphys were denied recertification, they did not know who handled the recertification process. It was not until November 1, 2023, that they learned about the DC Attorney General’s lawsuit against RealPage, Inc., a multinational corporation that sells its algorithmic tools to landlords throughout the DC Metropolitan area. The Murphys later discovered that RealPage was the third-party company that processed their recertification the year they were denied.

Had the lawsuit never happened, the Murphys likely would not have known about RealPage, the true entity deciding their recertification. As Troy said, the use of third-party computer algorithms in housing decisions “remov[es] the face” of the decision-maker.

3. Outsourcing Housing Decisions to Third-Party Computer Algorithms Raises Data Privacy Concerns.

As part of their annual recertification, the Murphys were required to submit various documents containing sensitive personal information, such as their Social Security numbers, to the recertification portal. Because they did not know about RealPage at the time, they ended up turning over data about their family to the company without their knowledge or meaningful consent. The Murphys expressed fear that unauthorized access to their data could negatively impact their children’s ability to access important life opportunities in the future.

4. Third-Party Computer Algorithms Allow Landlords to Evade Responsibility and Avoid Accountability.

Panelist Susie McClannahan shared how landlords’ use of algorithms makes it more difficult to ensure fairness in renting. The ERC is a civil rights organization that identifies and seeks to eliminate unlawful and unfair discrimination in housing, employment, and public accommodations through civil rights testing. In the past year and a half, McClannahan shared at the event, the ERC has received more than 100 requests for assistance involving claims of illegal discrimination by tenant screening algorithms. Clients whose rental applications were denied received notices stating only that their “Rental history does not meet property requirements” or “Credit record did not meet requirements.” This extremely vague language made it difficult to determine what leasing companies were screening for and why an application was denied.

Landlords that use third-party algorithms to make these denials are able to avoid responsibility for meaningfully reviewing rental applications. When ERC’s civil rights testers contacted apartment buildings to ask what the third-party algorithms were screening for, leasing agents said they did not know how the decisions were made. This makes it harder for a person who believes they have been wrongfully denied housing to build a legal case.

5. Existing Legal Protections to Address Algorithmic Housing Discrimination Are Limited.

Federal Law

There are some legal tools at the federal level to address algorithmic housing discrimination. Panelist Natasha Duarte shared that some federal agencies, like the Federal Trade Commission (FTC) and the Consumer Financial Protection Bureau (CFPB), have been investigating, writing reports, and taking enforcement action on tenant screening and other forms of algorithmic discrimination. If you believe that a company may be engaging in discrimination, you can file a complaint with the FTC. If you see something wrong with your tenant screening report (which may include inaccurate credit report data), you can file a complaint with the CFPB.

Still, federal law leaves much to be desired in addressing this problem.

State Law

In the absence of comprehensive federal regulation, advocates have turned to state and local law to fill the gaps. A few existing state laws address algorithmic decision-making, but few specifically tackle algorithmic discrimination. However, there is a growing effort to enact new laws that do just that. Panelist Wan Zhang outlined four key features of these proposed bills:

a. SCOPE. Some bills cover only one industry, like housing or employment, while others are broader, covering multiple sectors where people may be subjected to unfair algorithms. Examples include a bill from New York that seeks to address algorithmic discrimination in housing and another from New Jersey that targets algorithmic discrimination in the employment context. A bill currently pending before the DC Council, the Stop Discrimination by Algorithms Act (SDAA), more comprehensively targets the use of biased algorithmic decision-making across sectors: not only employment and housing, but also education, credit access, and other key areas of life.

b. TESTING. Many bills require large landlords and corporate employers to regularly test their algorithms for unfair results for historically marginalized groups. Some of them, like the SDAA and one in California, exempt small landlords and business owners from these testing requirements.

c. TRANSPARENCY. Some bills require that the test results be made publicly available or otherwise reported. The purpose of such a provision is to inform the public about the use of algorithms and their impact on consumers.

d. ENFORCEMENT. Some bills authorize the government to investigate algorithmic discrimination and bring legal action. The above-mentioned New York bill and the SDAA also give everyday people the right to sue.

To be clear, these characteristics have been seen in proposed bills, not existing law. If you’re interested in seeing any of these bills become the law in your state, contact your local representative. For example, DC residents can contact the DC Council to share their thoughts on the SDAA.
