Data 4 Black Lives: In Review
Data as protest. Data as accountability. Data as collective action.
This January, I had the opportunity and privilege to convene at the esteemed campus of MIT for the second annual Data for Black Lives Conference. Data for Black Lives is a group of activists, organizers, and mathematicians committed to the mission of “using data science to create concrete and measurable change in the lives of Black people.”
The goal of this collective is not only to highlight areas in which data analysis and predictive analytics can benefit Black people, but also to call out areas in which digital technology and algorithmic decision-making can do immense harm.
There were a few break-out sessions and presenters that stood out to me:
We Are the Leaders We Have Been Looking For: Organizing Algorithmic Accountability
Moderated by: Yeshi Milner
As algorithms and automated decisions continue to impact every aspect of our lives, in Black communities nationwide these decision-making systems animate already existing structures of oppression and inequality. From risk assessments to school enrollment algorithms to child predictive analytics and the management of healthcare and social services, the role of algorithms is set only to increase. As the impact of these systems expands beyond Black and low income communities, our heroes will be the people who have been fighting and winning against these systems all along. In this opening panel we will be discussing how leaders nationwide have been organizing to reduce the harm of these biased black box systems and discuss how activists and scientists can enlist algorithms in the fight for racial justice.
My Key Takeaways
- We must hold algorithmic decision makers accountable for the outputs of their algorithms and make them provide visibility into how those decisions are being made. One great example of this is creating public data trusts or algorithmic ‘nutrition labels.’ Public data trusts have been explored in the Open Data community and take the concept of a legal trust and apply it to data. In a data trust, the trustors may include individuals and organizations that hold data and grant some of their rights to control the data to a set of trustees. This is particularly impactful for the Black community, which over-indexes on the use of social media but does not have clear insight into how its data is used by social media companies.
- Uplift and leverage tools that have already worked in one locality. A featured project that exemplified this is the Our Data Bodies Project, which has just released a Digital Defense Playbook “focused on data, surveillance, and community safety to co-create and share knowledge, analytics, and tools for data justice and data access for equity.” This project is three years in the making and contains a blueprint for community organizers looking to advocate for community safety without having to create materials from scratch.
- Digital literacy extends to concepts like privacy, surveillance, and informed consent. We must not assume that people are digitally literate, especially inner city youth who are disproportionately impacted by digital security and surveillance tactics.
- AI is far from perfect, and we must be cautious about how much trust we place in it. Projects like the Algorithmic Justice League & Not Flawless AI show the clear bias against dark-skinned Black women, who are frequently misclassified or unidentifiable in mainstream facial recognition software. Imagine the ramifications for automation technologies that are unable to distinguish Black faces, and Black female faces in particular.
- If corporate / political interests won’t create equitable algorithms, we must create our own alternatives. This is demonstrated by R3 Score, an alternative to standard background checks that makes it easier for banking professionals to assess the riskiness and financial capacity of customers with criminal records.
- When designing these alternatives, we should “start with visioning and define the material conditions that are necessary to manifest [your goals].” — Tamika Lewis. This means starting with the target audience in mind (in this case, the Black community), focusing specifically on what is needed to address the inequities, and not limiting ourselves to tools or solutions that already exist.
Going Beyond the Symptoms
One of my largest critiques of this conference, as of any (historically white and/or perceptibly elitist) institutionally backed attempt at real restorative justice and social equity, is that genuine reform runs counter to the institution’s own interests: it undermines the existing power structure. It is therefore never in these institutions’ best interest to truly embrace radical reform, for it may ultimately mean their undoing.
Many of the proposed solutions were remedies to symptoms of systemic inequality and white supremacy. And as with many social ills we see today, they are just indicators of more entrenched problems. Exacerbating the problem, the established corporate, political, and economic majority interests are driven by a growth mentality. Very rarely do they step back and say no: some innovations and opportunities aren’t worth pursuing because of the social externalities. And even more rarely do they relinquish their autonomy and dedicate meaningful resources (i.e., proportionate to the level of social externalities or relative benefit) to more restorative, compensatory efforts.
During the closing session, the panelists did touch on some revolutionary ideas such as a guaranteed minimum income. I believe that in order to create a more equitable society, we must examine these and other wealth (and knowledge) transfer mechanisms in addition to addressing the symptoms of historical inequity.
I found the conference to be an immensely enlightening and inspiring meeting of minds. I appreciated that this forum took a multi-disciplinary approach, bringing together data science professionals, researchers, policy makers, non-profits, grassroots activists, public officials, and corporations.
What I appreciated most about this event was the ability to network with people actively working on projects who are in need of data services. Many of the off-shoot conversations centered on how data can be used to address topics that weren’t explicitly covered in the scheduled programming, including inequities in the cannabis industry, human trafficking, immigration, and wellness.
Overall, my biggest takeaway is to start where you are, with those around you. As coined in the opening panel: “We are the leaders we have been looking for.”
Fortunately, we are at a pivotal moment in which we can stand up and question the narrative we have been sold about AI. It is not too late to question its intended benefits and more closely qualify its externalities.