Data Assets and Repugnant Markets

Elizabeth Eagen
Aug 15, 2019


The acquisition and management of data assets is an enormous driver of social change. Recently I’ve been thinking about the barriers (beyond sheer resource constraints) that prevent nonprofits from making more out of their data. There are many data scientists interested in applying their skills to social justice organizations, who could look at where that data might have a meaningful second or third life. There are also plenty of proposals out there for organizing revenue streams around the very real frontline expertise of nongovernmental organizations. And organizations know that even though they don’t hold the kind of “big data” that might appeal to massive systems analysis, they do hold high-value data: frontline, crucial knowledge of what’s happening that can illuminate or change the course of a policy in the making. So while money, time, staff, and opportunity might all be factors, I don’t think they’re the whole picture.

I think it’s this: More data seems more useful — but we don’t know when to stop. Or if we do stop, we’re told we don’t understand how the benefits outweigh our concerns. How do you design a meaningful stop, when the possibilities seem so great?

This week Pew Research retweeted its 2018 finding that majorities of Americans consider it “unacceptable to use algorithms to make decisions with real-world consequences for humans.” The use cases it explored were all real: criminal risk assessment, automated job screening, and personal finance scores.

Inevitably, in conversations around the use of civil-society-generated data, we come to a point where activists instinctively want to say no: that something is problematic to collect or share. There are many ways a dataset can be problematic, especially in light of data breaches and the fragile state of digital hygiene around some very sensitive information. Many of those problems have technical answers and fixes. But for right now, social justice advocates’ instinct to recoil from certain data uses is an important thing to listen to. We should pay attention to our instinctive distaste at some transactions around data.

I believe this distaste, and our discussions about it, offer guidance for understanding future-facing technology and social justice: machine learning, artificial intelligence, and the way we allocate resources. It’s especially important to listen to and understand the distaste of non-technical partners in a data project. I see their concerns swept under the rug because they “don’t understand the technology” or don’t trust what we’re going to do with it. I think that’s wrong.

What we “know,” in multiple and non-homogeneous ways, is that some things just should not be for sale, or reused, or resold. How we implement that is crucial and cultural. The economist Alvin Roth has examined this knowing as “repugnance as a constraint on markets”: a transaction is repugnant when some people want to engage in it, but others believe it should never be allowed.

When can we classify a market as repugnant? Roth did not tie his definition to an emotional reaction; the objection is the judgment that the trade itself should not be allowed. For example: you can’t sell your kidney in the US, but that’s not the case everywhere. Repugnant markets and transactions change over time. Our concerns make their way into our laws and policies, and they tell us how new capabilities affect, change, and challenge our old rules. But regardless of this mutability, bringing up repugnance in rooms where decisions are made is a critical part of making human-centered choices about data and technology.

Looking at the question of repugnance gets me somewhere in thinking about important uses for data and technology advances. Examining repugnance can help us shape a data and technology offering to reduce extractive methodologies, reduce coercion, and take privacy seriously. One of the most important parts of the work technologists and civil society activists have done together over the past few years has been to bridge the ways in which traditional, first-generation civil society organizations have worked and the ways data and technology change that environment. The additive, transformative, and sometimes explosive power of emerging technology and data strategies harnessed to social justice issues has taken over our thinking about what it means to do the work of a civil society organization, and what we do with the information we collect as civil society actors.

I don’t see that work stopping anytime soon. Paying attention to our instincts and respecting them, listening to the instincts of others, and carefully turning them over is part of how we include social justice constraints (gut feeling, wrongness, unfairness) in our evaluation and choices. These constraints have an important place in the conversation around ethics, justice, equity, and “rightness,” and should provide meaningful reflection points in the design process.
