Poverty Lawgorithms

Six Principles to Enhance the Digital Literacy of Civil Legal Services Lawyers

Michele Gilman
Data & Society: Points
5 min read · Sep 2, 2020


This blog post draws from Michele Gilman’s Data & Society report Poverty Lawgorithms, which familiarizes fellow poverty and civil legal services lawyers with the ins and outs of data-centric and automated decision-making systems, so that they can clearly understand the sources of the problems their clients are facing and effectively advocate on their behalf.

Illustration by Yichi Liu: file folders in different colors, labeled 1–6

The pandemic is revealing and magnifying long-standing inequalities in American society. For civil legal services lawyers, or poverty lawyers, this means our clients are facing an even greater cascade of challenges, including lost employment, precarious housing, hunger, and poor health. As we work to meet our clients’ urgent needs, it is essential to recognize and combat the data-centric technologies that exacerbate our clients’ suffering — technologies that will extend the pandemic’s harms to our clients long after the public health emergency ends.

As we work to meet our clients’ urgent needs, it is essential to recognize and combat the data-centric technologies that exacerbate our clients’ suffering…

Consider the situation facing Brenda, a hypothetical client. Brenda lost her job as a restaurant cook. Months later, she is still waiting for unemployment benefits due to her state’s antiquated processing system. The federal and state eviction moratoriums will eventually expire, and without further government action, the accumulated rent will become due. Brenda lacks the money to pay those mounting bills. Her landlord has already threatened eviction.

Once that eviction case gets filed, it will live forever as a negative mark against Brenda, regardless of the outcome. (Perhaps she will prevail, or perhaps she will lose when she is unable to participate in the court’s remote hearing due to her lack of internet access.) Future landlords will easily find the eviction filing in online court records or within tenant screening reports, compiled by companies that profit from digitally profiling potential tenants and selling the data — often erroneous — to landlords. In turn, an eviction case record will reduce Brenda’s ability to secure future housing.

When the landlord secures a court judgment for unpaid rent, Brenda’s credit report will be damaged, impacting her ability to secure future loans to weather the economic crisis. The judgment will also be digitally embedded in employment reports that employers use to sort through job applicants.

Brenda’s eviction is just one of many data points — including her employment and credit histories, social media usage, online browsing history, shopping patterns, and public records — that feed her digital profiles. In turn, these digital profiles increasingly serve as gatekeepers to housing, employment, education, and other life necessities; and allow predatory financial and educational institutions to target low-income people, leading to a downward financial spiral of ever-increasing debt.

we must…understand the data-centric technologies shaping the lives and opportunities of our clients.

Poverty lawyers are educated in the law. Yet, to meet our ethical duty of competency, we must also understand the data-centric technologies shaping the lives and opportunities of our clients. Here are six overarching principles to enhance the digital literacy of poverty lawyers:

First, it is important to recognize that digital technologies are impacting legal services clients whether or not we are aware of them. Many automated decision-making systems operate invisibly and are adopted without opportunities for public input, hindering algorithmic accountability. Thus, poverty lawyers may need to take affirmative steps to identify and understand these systems, which both government agencies and commercial entities deploy, sometimes in concert. While the legal community is increasingly acquainted with algorithmic risk assessments in the criminal justice system, there is less awareness among civil legal services lawyers that similar tools are spreading to domains where we regularly practice, such as housing, consumer, family, education, workplace, and public benefits law.

Second, lawyers need to understand that computers are not magic. It is tempting to put faith in computers given the biases and errors associated with human decision-making, especially where marginalized people are concerned. In contrast to humans, computers appear objective and accurate. However, the reality is that people program computers, and in the process, import human biases and errors into computer models — thus, the computer science saying: “garbage in, garbage out.”

Mistakes can arise when programmers inaccurately translate regulatory requirements into source code — an outcome that will be of no surprise to lawyers who regularly wrestle with complex and ambiguous regulations in fields such as public benefits or tax law. Further, data is not neutral. Data sets reflect structural inequities. Baked-in biases along class, race, and gender lines are endemic in the data that gets fed into algorithmic systems, thereby yielding unequal outcomes.

Third, despite the fallibility of algorithms, lawyers may confront judges, juries, and other decision-makers afflicted by automation bias. This is a psychological phenomenon in which people defer to computer-generated outputs as more reliable than their own judgment. Poverty lawyers need the tools to understand why algorithms can be incorrect, to access and interrogate algorithmic models, and to explain to judges and other decision-makers why they should not blindly trust an algorithmic determination.

Fourth, in challenging an algorithm in adversarial proceedings, poverty lawyers are likely to face claims that the algorithm is proprietary. Private businesses claim trade secrecy protection for their algorithms. Further, many government agencies that purchase algorithmic software from private vendors do so under contractual non-disclosure agreements. Nevertheless, in the civil context, courts have ruled that procedural due process interests prevail over trade secrecy claims. This is an evolving area of the law with opportunities for creative lawyers to enhance transparency through litigation under freedom of information laws, attention to government procurement processes, and advocacy for algorithmic accountability statutes.

Fifth, poverty lawyers do not need a technical background or the ability to code software in order to effectively advocate in cases involving data-centric technologies. These issues are no more complex than other interdisciplinary areas that poverty lawyers regularly have to master to effectively represent clients. There are many available resources (including those linked in the published report) and experienced lawyers who have already challenged algorithmic systems. Moreover, by gaining the skills to issue-spot digital privacy problems, lawyers are well situated to connect with technical experts and data justice advocates for analysis and support.

Sixth, and finally, we cannot forget the many benefits that technology is bringing to low-income people, legal services practices, and the justice system. Internet access gives low-income people the ability to apply with ease for jobs or social services and to connect with social justice movements. Anti-poverty advocates have developed tech tools to empower low-income communities, such as smartphone apps that keep track of hours worked for purposes of wage claims or that automatically complete expungement petitions. They have also analyzed open data, such as eviction data, to understand outcomes and advocate for systemic change. Continued innovations may well assist in combating the harms associated with digital profiling and surveillance tools.

By enhancing our understandings of the ways in which data-centric technologies impact marginalized communities, poverty lawyers can advance our clients’ goals and work toward digital justice for all.

Michele Gilman is a poverty lawyer, Venable Professor of Law at the University of Baltimore School of Law, and 2019–2020 Faculty Fellow at Data & Society.
