5 Questions on Data and Context with Desmond Patton

Feb 21

By Catherine D’Ignazio with editing by Isabel Carter

Image courtesy of Desmond Patton

Desmond Patton is a Public Interest Technologist trained in social work. He currently works at the Columbia School of Social Work, where he founded and runs SAFElab, a groundbreaking research center that examines how youth of color navigate violence online and offline. Combining computational methods with social work foundations, Patton’s work at SAFElab seeks to clarify methods for preventing and intervening in violence. During the course of this work, Patton and doctoral student William Frey developed a community-facing, collaborative approach to data collection and analysis, which they describe in a 2018 paper titled “Artificial Intelligence and Inclusion”. (Highly recommend checking it out!)

In our book Data Feminism, Lauren Klein and I discuss the importance of considering the context of data — especially when those data concern people — in the chapter “The Numbers Don’t Speak for Themselves.” Too often, researchers and analysts collect data from a distance, with little meaningful interaction with the people whose data they are collecting. But Patton’s approach at SAFElab offers an important alternative. I caught up with him to discuss it further. What follows is an edited transcript of that conversation.

One thing that really stands out about your work is your collaboration with “domain experts” to help contextualize the data you’re working with. How did the idea for that role come about?

Well, I’d been studying young people and violence for over 12 years, and I had stayed in touch with a group of young men to understand how they were continuing to navigate violence while staying connected to school. I did a set of qualitative studies with these young men and other young men in Chicago to better understand the role of social media in violence, and in particular, gang violence.

At the time, there was little to no research looking at that intersection, and what became clear to me as I moved from the interviews to actual comments on social media was that I didn’t know what young people were saying, period, because the language was so localized. So it became really clear to me that we needed to create a new approach to social media data that could really grasp culture, context, and nuance, for the primary reason of not misinterpreting what’s being said.

What we have learned is that there are a lot of groups and organizations surveilling folks on social media and making decisions about human behavior based on social media interactions, with little to no context. And they’re not trying to get context. Their approach has just been null and void, and we don’t want to be a part of that process.

So for about four or five years now, with any kind of social media study that we’re doing, we employ young people from the community from which the data comes. We meet and partner with community-based organizations to identify young people who are currently or formerly gang involved, who are willing to participate, who want to give back to their community, and who want to prevent violence themselves, and we employ these young people in our lab. We train them in how to use our annotation system, and they look through a portion of our social media data. Then the algorithms get trained on [their notes].

The goal is for an algorithm to think like a 17-year-old African American man on the South Side of Chicago, so that it is able to discern whether a Facebook post is a quoted lyric or a potential threat, that kind of thing. This process is extremely important, and as you can imagine, it’s not easy.
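
To make that last step concrete, here is a minimal sketch of what “training the algorithms on [their notes]” can look like in the simplest case. It is not SAFElab’s actual pipeline: the posts, labels, and scikit-learn components below are all illustrative assumptions. The point is that the labels come from people with lived expertise in the community, which is the part no off-the-shelf model supplies.

```python
# A minimal, illustrative sketch (NOT SAFElab's actual system) of the general
# idea: posts labeled by community domain experts become training data for a
# simple text classifier. All posts, labels, and library choices below are
# hypothetical placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical expert annotations: each label reflects a reading by someone
# who knows the local language and context, not a keyword match.
posts = [
    "post quoting a well-known local drill lyric",
    "post an outsider might read as a threat, but experts flag as a lyric",
    "post that experts read as a credible threat",
    "post about school, family, and everyday life",
]
labels = ["quoted_lyric", "quoted_lyric", "potential_threat", "other"]

# Fit a simple bag-of-words classifier to the expert-provided labels.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(posts, labels)

# New posts are scored by the expert-informed model; in practice, outputs
# would still be reviewed with community context rather than acted on alone.
print(model.predict(["another unseen post"]))
```

The modeling step itself is the easy part; the expert annotation is what carries the culture, context, and nuance Patton describes.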

What recommendations do you have for others who are hoping to do similar work that could alleviate some of that difficulty?

I think people should think very critically about working with communities before they do it. What we’ve learned is that life is still happening for those communities and people, and a lot of the time, things would come up. We had one young man whose mom went to prison. He had to take over the family and make more money, so he had to balance the work he was doing with us against the need for real, solid money. We had to work with those challenges, and it happened a lot.

People should think very carefully about how to best support those domain experts. That’s something we are growing into. For example, as social workers we are realizing that there has to be a lot of upfront support. We only worked with these experts remotely for the first year; the next year we hired a post-doc who lived in Chicago, and that made all the difference. Because they had someone who could directly support them on a day-to-day basis, that changed the relationship. [It affected whether] they made it to work at all.

How do those domain experts feel about this work?

They thought about it quite differently than we did, but I think some of the things that came up were the importance of being able to give back to the community and being able to help prevent violence with technology. They became aware of the impact of social media communications and of how easily they could be misinterpreted. That kind of awareness emerged during the process.

They were not aware that they should be seen as experts, and they were not aware that they could leverage their expertise in this academic way. And there are people who work with this kind of annotated data all the time; they are the ones who are making decisions on what to take down. Why couldn’t these young people have that type of role as well?

How do we work with people in the community — people who may be coming out of gangs, people who are looking for employment opportunities — so that they can leverage their life experiences as expertise and work at places like Facebook or Google? What I’m really excited about is that at Facebook, Google, and Apple, I think you don’t have to have a college degree. What might employment look like in this space, where you can leverage your expertise in a way that really helps the platforms centralize and understand how their technology operates in communities of color as well?

You were very transparent in the “Artificial Intelligence and Inclusion” paper about who you will collaborate with. How did you make the decision to collaborate with violence prevention organizations and not with state police or law enforcement?

We’re social workers. We have a code of ethics, and we lead with that code of ethics. One of its principles is to do no harm, and we don’t have good examples of police leveraging social media for good. So the code dictates who we collaborate with and connect with.

We get questions all the time about whether we work with the police, and at this point, I say that we have not. At some point, we’ll have to, because there are times when the data is threatening and needs police intervention. That’s the reality of the work. But we’ve only leveraged retrospective data, so typically it’s not an issue. As we move forward in the work, though, we’re moving into more real-time analysis with some of our organizations, and of course, we’ll have to figure that out.

In your ideal world, what would be the relationship between social work and data science?

I’m so glad you asked that. There’s a program at Columbia called the Collaboratory Program that is about connecting data science to social science and the humanities. A colleague from social work, a data science colleague, and I just got a grant to create a new course on data science for social good. So we’re creating a semester-long course that is part of our new Emerging Media, Technology and Society program at Columbia Social Work.

It’s about data literacy, and about social workers being at the table. In the first semester, students would take a course on coding and data literacy. Then, in the second semester, they would do a project-based course with a data science or computer science student, working together on a project where they can leverage their social work expertise to help people think about an issue or a problem. So, for example, how do you know what to best optimize for in an algorithm? That’s something a social worker would know really well, in terms of the tech interventions that have been done, who all should be included, and what’s missing.

With my colleague Dr. Courtney Cogburn, I created a new track at Columbia Social Work that’s specifically about technology and social work. We want social workers working at tech companies. We want social workers at the table assessing tech policy. And we want social workers at tech startups that are community-centered. That would be the goal for me: for every social work school to have a tech track. I think Columbia is the first to do it.

You can follow Dr. Patton on Twitter @DrDesmondPatton and check out his TED Talk, “They are Children: How Posts on Social Media Lead to Gang Violence.”
