Cyberbanging: how violence escalates through emojis, hashtags and 280 characters or less
On Saturday, September 23rd, storytellers, designers, makers, hackers, educators, caregivers, community organizers, activists and creatives working in all fields will gather for a special one-day event at the Columbia University School of the Arts entitled Story I/O. Together they will design and prototype simulations to de-escalate violence and provide empathic care.
Lance Weiler, founding director of the Columbia University School of the Arts’ Digital Storytelling Lab, sat down with Dr. Desmond U. Patton, director of SAFE Lab, for a conversation about trauma, violence, grief interrupted and the challenges of cyberbanging in the digital age.
Lance Weiler: Can you give us a sense of your background and what led you to the Columbia University School of Social Work?
Dr. Desmond Patton: I’m a social worker. I focus on families and community social systems and have a very big interest in young people, particularly young people of color who live in urban areas and how they navigate violence. I have a PhD in social service administration from the University of Chicago and I was initially interested in understanding how young people transition to high school. I was sitting in high schools in Chicago and most of the kids I was following were not coming to school because of gang violence and family members or friends being shot or killed. And so I changed direction and began to study the extent to which violence impacts educational achievement among young black men. From there I became aware of how young people began to navigate violence through social network platforms and the extent to which communication over social media can be an early warning sign or an early indicator of violence in the community. And so from Chicago I went to the University of Michigan, where I started to research what I call ‘cyberbanging’, which is a play on ‘gangbanging’, and mirrors or mimics gang violence that happens on the internet. I spent three years at the University of Michigan writing papers and doing discursive analysis of Twitter data from gang-involved youth. I then came to Columbia in 2015 and partnered with Kathy McKeown, who was then the director of the Data Science Institute, to advance the research by applying natural language processing (NLP) techniques to analyze large amounts of data and identify content that may be problematic on various social media platforms.
LW: Can you talk a little bit more about that moment that you decided to change directions and how you recognized this in your research?
DP: Yeah, it was just kind of an ah-ha moment, really. When young people talk about social media, it’s a part of how they navigate violence and how they observe violence. And with social media becoming a bigger topic in larger media regarding violence, the two topics just kind of collided and became an ah-ha moment for me in terms of the work. Because what became clearer to me was that young people are spending more time on social media, an enormous amount of time on social media, as opposed to physical spaces — and if we are just focusing on physical space, we are missing out on conversations and interactions and engagements that impact how young people navigate and occupy space in the physical world. So that became really clear to me, also through conversations about how these things were discussed in popular media.
LW: So that led to the formation of SAFE Lab. Can you share the mission of the lab and the type of work you’re doing?
DP: SAFE Lab is a research initiative where social work students and colleagues in social science and data science come together to work on critical social problems that have a tech component. We’re particularly interested in issues around violence, community violence, gang violence, and gun violence in urban settings. We leverage and support resources within this space to produce research that can identify additional risk factors that may predict violence, identify solutions to intervene in and prevent violence, and leverage the strengths of communities as well. So our mission is to use research to support communities in intervening in and preventing gun violence. We do that by using social media, by doing qualitative work where we do interviews, and also infographic work. We also work directly with data scientists who can create algorithms to detect and predict content that may be problematic in online spaces.
LW: With that work would you say that there’s a gap in the use of these technologies in the communities? I know that you’ve talked to me in the past saying there’s a lot of organizations that work on the ground and they’re “gapped.”
DP: I would say there’s a gap in the use of technology in community-based organizations. There’s a gap in leveraging social media in social work practice as well, and so it’s not the idea that these organizations aren’t aware of what’s happening on social media. Oftentimes they just lack the capacity and the resources to be able to leverage these platforms. And so what we want to do is provide them with access to tech resources and tech research in either a free or an affordable way so that they can use what’s happening in online spaces to intervene and prevent violence in real time. And this is not to replace the work they do currently on the ground but to augment or to enhance and advance the work they’re doing in these spaces.
“Instead of the story being solely about violence and who’s committing and who’s engaged in violence, we’re getting stories about trauma, and stress, and grief, and love, and happiness, and romance, and fun, and enjoyment, and money.” — Desmond Patton
LW: Storytelling is at the heart of what you’re doing. Can you give a sense of why you feel it’s important?
DP: I think what’s really interesting about social media is that it allows for the democratization of storytelling and sharing, particularly for youth who live in under-resourced or marginalized communities. What we clearly began to understand is that when you’re looking at social media posts you’re really getting bits and pieces of stories. And when you look at individuals you get a really in-depth and nuanced understanding of someone’s story. The story we’re getting is what life is like in violent communities. And that’s a really complex story. The stories that we see are different from the stories that we are told about young people in violent communities via popular media. So instead of the story being solely about violence and who’s committing and who’s engaged in violence, we’re getting stories about trauma, and stress, and grief, and love, and happiness, and romance, and fun, and enjoyment, and money. All of those pieces help us understand some of the root causes of violence and the places for intervention and prevention. And I think because we are not engaged in social media in a way that allows us to get those stories, the popular media have been missing these more nuanced, robust and more culturally appropriate stories about young people of color in marginalized spaces.
LW: Can you speak to some of the stories that you’ve discovered or some of the things that maybe even surprised you?
DP: Yeah, I think what’s really surprising is the amount of trauma and grief that young people experience and how willing they are to talk about that. Oftentimes the narrative from sociological literature is that young people are vulnerable and they engage in behaviors and modes of behavior to protect their vulnerability. So, there’s a code that dictates how people behave: they need to engage in these tough-walk-and-tough-talk mannerisms to protect themselves. And yet, contrary to that is what we see on social media, where people are being highly vulnerable and talking about deep pain, and deep trauma, and deep psychological states that really put them at risk for future violence or injury. And yet they feel comfortable and willing to share those stories with people in their network. I think that’s been most shocking and surprising. In addition to that, what’s really clear is that young people do not spend most of their time talking about violence. What’s shocking, or what isn’t known or understood, is that most of the story is about joy and fun and happiness and things that teens talk about, girls and guys, that kind of thing. I would say about 4% of the communication we see is aggressive or threatening. I think people should really understand and know that.
LW: You’ve told me about this notion that you’ve come to realize is ‘grief interrupted’, can you talk about that?
DP: Yeah. So, oftentimes people are using the platform to express conflict, trauma and grief, and yet because they’re on open platforms or on public networks or in ‘network society’, anyone who has access to that person’s comments can make a negative comment about that grief. And so oftentimes what we see in our data is that young people are using the platform to express grief and someone from an opposing crew or clique or gang is making a derogatory comment on that grief or disrespecting that grief. Or, because content is not deleted, it’s always there to be shared and reshared and retweeted. I mean, you can experience a death six months ago and make a comment, and someone can find that comment and reshare it again. So you never get a chance to really get away from it. You have to always confront it. Not that there’s an idea that people can really get away from grief, but people can develop skills to process and manage grief. And yet with social media, in some cases those hard and tough and raw emotions that you express in the bowels of grief can be resurfaced and come back to haunt you and be thrown in your face — and you have to figure out how to deal with it and process it again. So it doesn’t give people who already live in traumatic spaces time to really separate themselves from that space.
LW: If we could go under the hood a little bit and talk about the way in which you’re working with NLP (natural language processing) to gain these insights. Can you talk about the process?
DP: So, SAFE Lab has a team of social work students, researchers and computer scientists. The social work side of the house annotates text and images, and what that means is that we’re looking for offline characteristics in texts and images; we’re looking for connections between users, who they’re connected with on these platforms; we’re looking at the emojis and the hashtags for different meanings; we’re trying to unpack the language in context and nuance. We hire community experts, people from the communities the data comes from, to unpack and interpret text. Then we finally label the data, so each text or image is labelled based on what we deemed to have emerged from what’s happening in that text or image. So for example we may label a tweet as aggressive based on our understanding of it, or we label a picture as grief based on what’s being unpacked in that particular picture. After everything is labelled we then send those data sets to the data science side of the team, and they use NLP techniques, including machine learning and neural networks, to detect and predict the content in those texts and images, with the goal of being able to alert community-based organizations. They can then use the content to either prevent future interactions, understand the trauma and grief that happens earlier and feeds aggression, or intervene in a particular event before it becomes aggressive. So when you see highly aggressive posts you are able to chat with those individuals before it becomes problematic. That’s pretty much the inner workings of SAFE Lab.
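The annotate-then-classify pipeline Patton describes can be illustrated with a toy sketch. This is not SAFE Lab’s actual system (which pairs community-expert annotation with machine learning and neural networks); it is a minimal Naive Bayes text classifier over an invented, hand-labeled corpus, purely to show how labeled examples become a model that flags the label of a new post:

```python
from collections import Counter, defaultdict
import math

# Invented hand-labeled examples standing in for an annotated data set.
# Labels mirror the categories mentioned in the interview (aggression, grief).
LABELED = [
    ("aggression", "u better watch ur back"),
    ("aggression", "pull up we waiting"),
    ("grief", "rip lil bro miss you every day"),
    ("grief", "cant believe you gone miss you"),
    ("other", "good game tonight lol"),
    ("other", "new kicks just came in"),
]

def tokenize(text):
    return text.lower().split()

def train(examples):
    """Count word frequencies per label (multinomial Naive Bayes)."""
    word_counts = defaultdict(Counter)
    label_counts = Counter()
    vocab = set()
    for label, text in examples:
        label_counts[label] += 1
        for tok in tokenize(text):
            word_counts[label][tok] += 1
            vocab.add(tok)
    return word_counts, label_counts, vocab

def classify(text, word_counts, label_counts, vocab):
    """Return the most probable label, with add-one smoothing for unseen words."""
    total = sum(label_counts.values())
    best, best_lp = None, float("-inf")
    for label in label_counts:
        lp = math.log(label_counts[label] / total)  # class prior
        denom = sum(word_counts[label].values()) + len(vocab)
        for tok in tokenize(text):
            lp += math.log((word_counts[label][tok] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

wc, lc, vocab = train(LABELED)
print(classify("rip bro miss you", wc, lc, vocab))  # → grief
```

A real detector of this kind lives or dies on the annotation step the interview emphasizes: slang, emojis and hashtags shift meaning by community and context, which is why SAFE Lab brings in community experts before any labels reach the data scientists.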
LW: And so when you’re going through and you’re trying to find meaning within these tweets, I imagine there’s a lot of language and terminology that needs people who are directly familiar with that communication — that’s why you brought in your summer fellows. Can you talk about the fellows? Could you also talk about collaborating with younger people in and around SAFE Lab, maybe give a sense of how that’s leading towards the development of these digital scholars labs and what the goal is there?
DP: So we received funding from a Magic Grant, which is sponsored by the Brown Institute at the Columbia Journalism School, to develop a digital scholars lab. The purpose of the digital scholars lab was twofold: to educate young people of color in at-risk communities about the consequences associated with their digital footprint on social media; and to provide them with access and training in social media research so that they can be a part of making solutions for the things that are happening in their own communities. So in the summer we ran the pilot lab, where we had four young people from Brooklyn, all African American young men and women, be a part of our lab for six weeks, and they worked on three projects: they were either analyzing texts and tweets, analyzing images, or working to develop or adapt an emotional intelligence chatbot. The goal was to give them experience in this research space to hopefully encourage them to think about technology and social work as potential career fields. And when we asked them to reflect, all of our students said they learned a tremendous amount from their research, but more importantly they thought about their own digital footprint and ways they can change it. They’re thinking about changing their careers too; we had one young woman who wanted to be a beautician when she came in, and when she left she said she was now interested in engineering. That’s really the overarching goal of the program, and we want to develop this into a more concrete program, if you will, where they’re learning not only how to do research but to produce new knowledge and to gain new skills in this space as well.
LW: And where will those digital scholars labs live?
DP: Well, hopefully they’ll be in cities across the country. Of course we’re working in New York City, but I think the goal is to have digital scholars labs across the country, where a community organization could develop a digital scholars lab as a tool for prevention and intervention as well.
LW: So let me ask you, because we’re collaborating in and around this notion of a de-escalation room, what excites you about the de-escalation project?
DP: I think number one it’s a different way of approaching the research. We are analyzing stories but I think what’s important here is to get outside of the research. To think about the stories in very concrete ways and very complex ways. And one of the things that we have found in our research is some of the interesting patterns around de-escalation where young people are generally de-escalating content before it becomes a problem. But what do you do with that? And how can we help young people experience that moment in actionable and tangible ways? I think what’s exciting is to get outside of their comfort zones, you know, their screens, their iPhones and their computers, to really kind of manipulate space and manipulate these experiences and to think in very tangible ways about what escalation and de-escalation look like and what can we do about it. So I’m very interested in being able to bring people into a space and be very creative with research. Often there’s a particular way in which you have to do it and I think in this particular setting it introduces a new landscape and new opportunities. It throws out traditional research rules and allows us to be very creative and innovative in this space.
LW: To close, and also to kind of build upon that, could you just give a reflection on what it’s been like to collaborate with the Digital Storytelling Lab?
DP: It’s obviously really exciting to collaborate with you and the School of the Arts, in particular our time at Microsoft really showcased to me what the possibilities are. During that time we worked with employees at Microsoft in a hackathon to kind of ideate what a de-escalation room could look like. And I think what really hit home for me was the ability to touch and feel and to play and to see and observe. Even with all the things that we have seen happen on social media, we feel a step removed from it. But to actually play or act out a scene, or experience touching and creating things can help people experience a moment and is really a new and creative way of doing research that we haven’t tapped into in social work. But it makes sense. In social work we’re all about the human experience and leveraging it to create change and create solutions. This allows us to actually touch and feel and look around with all of our senses and that’s a really new and creative way of approaching complex social problems, which I think will be an excellent innovative space for social workers.
Help support SAFE Lab’s efforts by donating to their “Intervene in Gang Violence” campaign.
The Columbia University School of the Arts’ Digital Storytelling Lab (aka Columbia DSL) designs stories for the 21st Century. We build on a diverse range of creative and research practices originating in fields from the arts, humanities and technology. But we never lose sight of the power of a good story. Technology, as a creative partner, has always shaped the ways in which stories are found and told. In the 21st Century, for example, the mass democratization of creative tools — code, data and algorithms — have changed the relationship between creator and audience. The Columbia DSL, therefore, is a place of speculation, of creativity, and of collaboration between students and faculty from across the University. New stories are told here in new and unexpected ways.
Join Columbia faculty and industry innovators as we explore the current and future landscape of digital storytelling.
For more information on upcoming Columbia DSL programs, prototypes and labs, make sure to sign up for our newsletter. Plus, if you’re interested in connecting with other storytellers, game designers, hackers, makers, educators and fans of emerging technology, we’ve started a Columbia DSL community. Finally, if you’d like to partner with us, we’re always up for a good collaboration!