Algorithms and Inequality — An Interview with Virginia Eubanks

In this episode of the Masters of Data podcast, I sit down with the author, educator, and activist Virginia Eubanks for an interesting discussion on the current climate of the US social justice system, the data that feeds it, the inequality it unfortunately fosters, and most importantly, how to effectively combat it. Virginia has been on a long crusade to raise awareness of how our digital tools are continuing and exacerbating the problems we already have around poverty and inequality. She is an Associate Professor of Political Science at the University at Albany, SUNY, and the author of the book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. Drawing on her front-line experience with these issues, she shows how the technology we entrust ourselves to every day can be used not only to combat inequality but also to propagate and prolong it.

Virginia begins the conversation by sharing her background and how she first got involved in this kind of work. An activist for over 20 years in both community media and community technology centers, she describes how growing up in the Bay Area in the mid-'90s grounded her in the tradition of thinking through what it means that access to high-tech tools is unevenly distributed. From a young age she helped build community tools in collaboration with the communities that would be using them, yet she soon discovered that many in those communities challenged the idea that they lacked access to technology. As she and I discuss, people told her that the notion that they lack technology is simply not right. Rather, they have plenty of technology in their lives every day; they just interact with it in the low-wage workplace, in the public assistance system, and in the criminal justice system. This reality shook her and started her down a path of understanding how the tools and systems built to better people's lives, including those of the underprivileged, can in reality degrade their quality of life by breeding and prolonging inequality.

Beyond touching on the reality of this problem, we also discuss specific ways the lives and opportunities of so many Americans are being impacted by data and technology. As she shares, “The folks who are the real experts are the folks who are facing sort of the least controlled technologies in the most direct ways. Like in ways that it really is impacting the quality of their everyday lives, that it’s really impacting their ability to meet their basic human needs for things like shelter and safety and food and family integrity.” Seeing this reality, as well as its heartbreaking outcomes, she began to ask which policymakers, analysts, and economists are making these systems and technologies, and what conversations need to take place with them to keep people from remaining the targets of tools created to assist the underprivileged. What motivated her? Well, as she shares, “Not only because it’s the right things to do, it’s like the morally right thing to do, but because they have better information.” The information being received and the information being used can often be disconnected, so part of the issue is a lack of consistency in the information pulled into and relied on in these processes. Additionally, many Americans have the choice (and the right) to keep their information private or concealed, something their social or economic status makes easier; yet for many people technology is now directly impacting their lives in very profound ways, in ways that they can’t control, which is the motivation for her recent book. She notes, “The reason I framed the book the way I did is that I just think the kinds of voices, the folks who see themselves as targets of these systems have largely been ignored both in the ways we define the problems and in the ways we imagine solutions. I think that that’s a mistake. Like not only again, like it’s not just immoral. It also is just empirically wrong.”

So what can be done to raise awareness of the problem and address the issue? Well, as Virginia and I discuss, there are multiple things that must happen. First, she wants to make sure that the voices of the folks most impacted by these systems are centered in the conversation; inaccurate stories circulate about poverty in the United States, and the people being impacted need a voice to share their experiences honestly. As she also shares, we not only have a tendency to talk about technology as if the problems are all going to happen in the future, but we also have a tendency to talk about it as if it came from nowhere. In reality, technology is built by humans. It carries with it human assumptions and preoccupations. It is a deeply social product, and it then loops back to affect the culture it emerges from. Until we recognize and grapple with this, little can be done to address or alleviate the issues at hand. Sadly, as Virginia notes, these new technologies represent a moment in our history when we decided that the primary goal of social service systems should be to judge whether or not people are deserving enough to get help, so that the systems act as moral thermometers rather than as universal floors that support us all, an approach somewhat foreign to much of the world around us. Additionally, the technologies we're integrating into these systems are actually making political decisions for us, but because we think of them as administrative tools, we don't recognize that they have important political consequences, even though they embody all of these really important decisions. The reality is that until the real impact these systems are having, and the way the data they compile is used, becomes more widely understood, there is little hope of improvement or change, and countless people will continue to fall prey to the data-driven inequality epidemic we are facing today.

Outbound Links & Resources Mentioned

Masters of Data Episode

https://sumolo.gs/2CxwH2J

Learn more about Virginia and her work:

https://virginia-eubanks.com/

Read up on Virginia’s blog:

https://virginia-eubanks.com/blog/

Follow Virginia on Twitter @PopTechWorks

Email Virginia at: me@virginia-eubanks.com

Purchase Virginia’s book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor:

https://us.macmillan.com/books/9781250074317

Key Takeaways

  • This idea that we lack technology is not right. In fact, we have a ton of technology in our lives every day.
  • We interact with it in the low-wage workplace and in the public assistance system and in the criminal justice system and in our homes if we live in public housing and in our neighborhoods.
  • So it’s not really true to say we don’t have interaction with these tools, but the reality is most of our interaction with them is actually pretty negative, particularly in the welfare system.
  • Technology is impacting the quality of people’s everyday lives; it’s really impacting their ability to meet their basic human needs for things like shelter and safety and food and family integrity.
  • We need to be talking to policymakers and to data analysts and to the economists who are building these models, but we also have to be talking to people who see themselves as the targets of these systems.
  • There’s a tendency for the middle class or upper-middle class to think technology is kind of benign, just part of a choice we have, but technology is now directly impacting people’s lives in very profound ways, in ways that they can’t control.
  • The folks who see themselves as targets of these systems have largely been ignored both in the ways we define the problems and in the ways we imagine solutions.
  • We not only have a tendency to talk about technology as if the problems are all going to happen in the future, but we also have a tendency to talk about it as if it came from nowhere.
  • Technology is built by humans. It carries with it human assumptions and preoccupations. It is a deeply social product and it then sort of loops back to affect the culture that it emerges from.
  • These new technologies represent this moment in our history where we decided that the primary goal of social service systems should be to decide whether or not people are deserving enough to get help, so that they act as moral thermometers rather than as universal floors that support us all.
  • The technologies that we’re integrating into these systems are actually making political decisions for us, but because we often think of them as administrative tools, we don’t think of them as having important political consequences, yet they embody all of these really important decisions.
  • We tend to bake these political decisions into these technological systems and then we pretend they’re not political decisions.
  • Spending our time, our resources, and our smarts addressing a 4% problem rather than a 350% problem is an issue. It’s like having a solution and then going looking for a problem, which isn’t necessarily the real problem.
  • We can choose to do things differently in the US; in tons of places around the world, things like not having enough food to feed your family, living in a tent on the street for a decade, or losing your child to foster care because you can’t afford a medical prescription are seen as human rights violations.
  • The fact that we increasingly see them in the United States as systems engineering problems really should make us very concerned about the state of our commitment to caring for each other and to working as a political community to provide a just basic minimum, a line below which nobody is allowed to fall.