What We Mean When We Say #AbolishBigData2019

Despite clear evidence that algorithms, models, and data schema serve as vehicles for bias, data-intensive technologies mediate more and more of our individual and collective lives (Noble, 2018). Datafication, the reliance on digital data and prediction to perform important societal functions, names both the reality of technological change and a dangerous ideological assumption about the nature of that change: the view that technology is inevitable, beneficial, and scientific (van Dijck, 2014). In the provision of financial services by industry, in the assessment of risk by the criminal justice system, in the training of computer vision for autonomous vehicles, and in many other sites of civic import, data-intensive technologies automate and exacerbate inequality in racialized, minoritized, and precariatized communities (Eubanks, 2017). That is to say, in our communities.

Motivated by the work of community-based organizations and researchers including Data for Black Lives, The Bronx Defenders, Stop LAPD Spying Coalition, Measure, Our Data Bodies, Urban Institute, IRISE, and others, we propose that the consequences of datafication demand new approaches to research and collective action. In March of 2019, researchers and activists gathered for “Datafication and Community Activism: Redrawing the Boundaries of Research,” a two-day workshop in the Department of Informatics at the University of California, Irvine. We gathered to listen to voices frequently marginalized by our institutions, to write about the relationship between knowledge and action, and to think together about the state of the art in community-based research. We committed to respond to a challenge laid down by Yeshimabeit Milner, Executive Director of Data for Black Lives: to make bold demands to powerful organizations and institutions and to hold ourselves accountable for a better collective future.

Our workshop, partially captured using the hashtag #AbolishBigData2019, drew out both incommensurability and agreement on a number of critical questions. Some insist that cutting-edge datafication draws its power from a long history of state violence. Others have used community knowledge and digital data to mobilize resources for pressing collective problems. Our work does not depend on unanimity. We present here a brief summary and synthesis of our thinking together using four themes: future work, harm, self-defense/empowerment, and abolition.

Future Work

We imagine the workshop as a starting point, a way to bring fellow travelers into conversation. Before such conversation can begin, we must acknowledge that our civic and intellectual institutions are themselves deeply implicated in systemic racism, classism, sexism, and other forms of oppression related to abledness, nation, and/or size. We cannot achieve the complexity of thought needed for a powerful response to datafication until our institutions and organizations welcome every kind of person. In particular, we demand space in the broader tech scene and the academy for the leadership of Black women. In whatever form it might take, we orient our future work toward the commitment that data must be used to measure systems, not people.

Harm

Marginalization, gentrification, and police violence accompany the delegation of decision-making to technology, but we are often forced into frames of reference that center a putative social good. Instead, we argue for the centrality of harm. Harm is multivalent: it directs attention to individual and collective injury and to responses to threat and wrongdoing. Harm can be the sacrifice of one human need to serve another, equally vital one: a demand, for example, that we surrender autonomy in exchange for medicine or shelter. Powerful systems distribute harm to differently valued bodies according to known historical trajectories of violence, exploitation, and profit: we are always already enmeshed in non-innocent infrastructures (Murphy, 2015). Recognizing the potential of both action and inaction to do harm, we call for technological humility and for an understanding attuned to scale, from the global scale of capitalism to the intimate registers of our familial bonds. A tool that seems obviously fair, accurate, or transparent in one setting might cause terrible harm when viewed from the perspective of the person or persons it injures. While some in our communities insist rightly that data-intensive technologies intentionally do harm and may never be liberatory, others argue equally rightly that the specificity of our social location calls on us not just to engage with these technologies, but to actively develop them. In our understanding and in our action, we seek wherever possible to reduce harm. From this perspective, data are relational, drawing us into intricate and complex mutual responsibility.

Self-Defense Is Empowerment

Self-defense and empowerment are different stages in a process of long-term movement building. Recognizing a tension in the use of the terms self-defense and empowerment, we insist that understanding of risk and action happens at the individual, organizational, and societal levels, and at all places in between. Data modulate differential capacities for obfuscation, flattening, erasing, and undoing personhood. These same capacities elsewhere afford visibility and connection. Ultimately, datafication points to a way of knowing that elevates the trappings of objectivity and subordinates the embodied experiences of people, their lives, their stories, their humanity. We consider it urgent to trouble the prevailing notion that digital traces of people are somehow equivalent to people themselves. The tools we use to protect ourselves and our communities from datafication differ from instance to instance, but we will always deploy tactics and strategies according to our own values. Data can be used to tell stories, but our stories are not data.

Abolition

Finally, thinking of responses to datafication in our communities as part of the historical tradition of abolition puts in mind a centuries-long history of resistance to chattel slavery, to the still incomplete project of emancipation. Datafication gives extractive capitalism the aura of instantaneity and the power of speed. Abolition is a process of slowing things down in order to end them, to decelerate and take notice. Talk of artificial intelligence and algorithmic efficiency occludes the continuing presence of life. What kind of life will it be? We insist on an abolition that nurtures imagination and creativity instead of merely negating oppression. We demand futures steeped in daily practices of healing and joy. Data are but one face of a tactic of finding new ways to live.

We invite scholars, journalists, graduate students, artists, adjunct lecturers, para-academics, public scholars, community activists, data/information professionals, and anyone else interested in thinking through what it might mean to remake the relationship between activism and research to join our mailing list.


References

Eubanks, V. (2017). Automating inequality: How high-tech tools profile, police, and punish the poor. New York, NY: St. Martin’s Press.

Murphy, M. (2015). Unsettling care: Troubling transnational itineraries of care in feminist health practices. Social Studies of Science, 45(5), 717–737. https://doi.org/10.1177/0306312715589136

Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York, NY: New York University Press.

Stop LAPD Spying Coalition. (2018). Before the bullet hits the body: Dismantling predictive policing in Los Angeles. Retrieved from https://stoplapdspying.org/before-the-bullet-hits-the-body-dismantling-predictive-policing-in-los-angeles/

van Dijck, J. (2014). Datafication, dataism and dataveillance: Big Data between scientific paradigm and ideology. Surveillance & Society, 12(2), 197–208. https://doi.org/10.24908/ss.v12i2.4776

This essay is a collaborative, original intellectual work of all Datafication and Community Activism Workshop Participants:

Myntha Anthym, IRISE
Elvia Arroyo-Ramirez
Kaylan Baxter, USC
Paulette Blanc, Measure Austin
B.B. Buchanan, UC Davis
Matthew Bui, USC
Sumandro Chattapadhyay, Center for Internet and Society
Marika Cifor, Indiana Bloomington
Roderic Crooks, UC Irvine
Taylor Cruz, CSU Fullerton
Mia Dawson, UC Davis
Jay Dev, MIT
Joan Donovan, Shorenstein Center at Harvard Kennedy School
Jamie Garcia, Stop LAPD Spying
Julia Gelfand, UC Irvine
Chris Gilliard, Macomb Community College
Ben Green, Harvard
Harry Hvdson, USC
Lilly Irani, UCSD
Hamid Khan, Stop LAPD Spying
Nadia Khan, Stop LAPD Spying
Jenny Korn, Berkman Center
Lisa Martinez, IRISE
Bill Maurer, UCI
Carolina Mayes, UCSD
Amanda Meng, Georgia Tech
Mélanie Millette, Université du Québec à Montréal
Sara Milkes, UC Irvine
Florence Millerand, Université du Québec à Montréal
Yeshimabeit Milner, Data 4 Black Lives
Akua Nkansah-Amankra, USC
Benedict Salazar Olgado, UC Irvine
Aitanna Parker, UCSC
Lucy Pei, UC Irvine
Kylie Peppler, UC Irvine
Winifred Poster, Washington University in St. Louis
Stevie Rea, UC Irvine
Allissa Richardson, USC
Charmaine Runes, Urban Institute
Mariella Saba, Our Data Bodies
Zithri Saleem, U Washington
Gwen Shaffer, CSULB
Yvonne Sherwood, UCSC
Janine Slaker, Michigan State University
Toby Smith, UC Davis
Meme Styles, Measure Austin
Tonia Sutherland, U Hawaii
Lolita Tabron, IRISE
Rebecca Widom, The Bronx Defenders
Stacy Wood, Pitt