On Humane Tech: Gaming and Intersecting Technologies with Dr. Kishonna Gray

Lincoln Center for Applied Ethics
10 min read · May 4, 2023

--

The Lincoln Center for Applied Ethics hosts a bi-weekly conversation “On Humane Tech,” highlighting relevant news in a conversational format with our team. Each week the topic changes, but one thing stays the same — we want to hear from you. Respond to our conversation below.

This week’s topic: Gaming and Intersecting Technologies with Dr. Kishonna Gray.

Dr. Kishonna L. Gray is an Associate Professor in Writing, Rhetoric, & Digital Studies and Africana Studies at the University of Kentucky. She is an interdisciplinary, intersectional, digital media scholar whose areas of research include identity, performance and online environments, embodied deviance, cultural production, video games, and Black Cyberfeminism. Dr. Gray is the author of Intersectional Tech: Black Users in Digital Gaming (LSU Press, 2020). She is also the author of Race, Gender, & Deviance in Xbox Live (Routledge, 2014), and the co-editor of two volumes on culture and gaming: Feminism in Play (Palgrave-Macmillan, 2018) and Woke Gaming (University of Washington Press, 2018).

Karina Fitzgerald: To start, for our readers, would you share a bit of your background, and how you came to where you are now?

Dr. Kishonna Gray: So when I started graduate school in August of 2005, Hurricane Katrina had just hit, and a lot of us were immersed in what was happening and what we were seeing on the news. I wanted to try to make sense of the mediated narrative of who’s a deserving and undeserving victim, but I was even more fascinated with how people were using the comment sections of news reports. People were trying to connect with family members who were displaced, because, if you recall, hurricane refugees had been sent to all different parts of the country, and there wasn’t really a coordinated effort at the national level. Nowadays, we would call it social media analysis, but the term “social media” hadn’t really emerged yet.

I went on to do my PhD in the ASU School of Social Transformation, and by that time the conversation around Hurricane Katrina had really changed; there was this heavy realization that people would not be returning to the gulf, and that New Orleans was going to look different. There was almost this “Disneyfication” of New Orleans, in which corporate actors were coming in and changing the landscape. And I even remember the commentary from legislators essentially being, “we couldn’t clean up New Orleans, but Hurricane Katrina did.” The overall response and language surrounding the issue was heavily racially-coded.

As a baby graduate student, a baby researcher, it was tough on my soul to be so immersed in these stories as they were happening in real time, and to be sorting through all these narratives around Hurricane Katrina. I was really thinking about the ethics of my research and it felt exploitative; there were a lot of conversations that I couldn’t contend with.

And so my advisor finally suggested that I explore something else, and he asked what my other interests were and what I did for fun. I was scared to tell him I was playing video games, because there I was, a graduate student wanting to be seen as a serious scholar; I wanted to say that I was reading books in the library in the middle of the night. But I decided to be honest and said I was playing Halo and Call of Duty. His response is really what I try to mirror and model in my mentorship now, because he looked at me and said, “That’s really cool. I’d love to know more about that.” I felt so affirmed in knowing that my interest and curiosity were justification enough to do my research in gaming. One of my other professors was Lisa Anderson, who became the chair of my dissertation committee, and she argued that I could apply the same research questions to my new area of interest. And she was right, because I was able to ask those same questions to look at things like digital cultures and online media in relation to social inequalities, oppression, and domination.

As I transitioned into gaming studies, I wanted to see if other people were experiencing what I was experiencing when I went to play games online. When other people online could hear that I sound like a woman, they would engage in a lot of sexism — I can’t tell you how many sandwiches I’ve made because people said, “Kishonna, go make me a sandwich.” And then, when people would eventually learn that I’m Black, which took longer because I use Standard American English, the racialized or racist nature of those conversations was very interesting. So I started on this qualitative project, this ethnography, and that culminated in 2020 when I published Intersectional Tech. It was a 10-year ethnography where I was able to tell this beautiful story that’s not all doom and gloom, not all oppression and domination. There are beautiful communities that thrive inside these online spaces, that connect and find one another, that support and affirm one another.

A panel of gamers in headsets look at gaming monitors while intently playing, with their gameplay projected onto large screens behind them.

Dr. Erica O’Neil: Your research has shifted over time from a discussion of how women and people of color are often marginalized in digital spaces to interrogating the affordances that tech can provide to Black communities. So I was hoping that you could expand on that a little bit.

Gray: Thank you for asking that question. Once I got into this research, it was more than just video games. The act of playing a video game actually became the third thing that was significant and important in that space. The first became the ability to build these communities based on what the technology provided. In the early days when I started this research, we didn’t have the option to create parties and chats; basically, when you go into a game, you’re in a public lobby. Anybody in that space can hear and interact with you, and those spaces were really toxic. And what would happen is minority populations in these spaces would mute themselves to avoid harassment. We’re paying all this money for these technologies, but we’re not able to use them and experience them in a way that everybody else can. But then, game developers started implementing these party chats, where you could create a group of specific players and not have to experience the hostile behavior that we often see in gaming spaces. So that was a really cool, you know, technological affordance.

Then, and I talk about this in Intersectional Tech, I started observing how all the innovations of technology really impacted and significantly changed the communities inside the space; sometimes in good ways, but sometimes in bad ways. For instance, with chat rooms and parties, on one hand it was a good thing, but on the other hand it allowed the toxic spaces to continue to grow even more toxic. We gave them an out, because we displaced ourselves, and suddenly no one was complaining about sexism or racism anymore.

At first, gaming companies didn’t address these issues — but then streaming technologies started to come into play. All of us suddenly had a camera on our Xbox, and now I can’t hide what I look like. For many of the Millennials, or even Zillennials, there was a sudden shift from private party chats to this exposure of a gaming culture that was so racist and sexist and homophobic. But Millennials and Gen Z utilize social media in different ways than Gen X. When I used to experience hostility in these online spaces, I just turned my Xbox off. Gen Z will record it and upload it to social media to bring visibility to these issues. And so these companies have had to pay attention to online harassment. That’s one of those beautiful affordances of technology; it makes us subject to more harm, because we’re hyper-visible and vulnerable, but it also brings more attention to the issues and opens conversations with people in the industry.

O’Neil: I have one more for you, and it’s more of a high-level question about what sort of design aspirations or interventions can make digital spaces safer and beneficial to affinity groups?

Gray: I love this question. One of the things that I’ve always interrogated has been reporting systems. Reporting systems for inappropriate conduct are relatively new, and what those systems look like reveals a lot about a company, their values, and who they value as their stakeholders. So, for instance, all these games have a way for me to report cheating, so that lets me know that these platforms care about cheating and that cheating is an issue. But gender-based harassment doesn’t exist on most of these platforms. Race-based harassment doesn’t exist in the reporting systems either, and to me that says they do not care. It wasn’t until after Gamergate, in which the harassment of women and queer people was really highlighted in the community, that some of the platforms did implement methods for reporting harassment, but even still that’s not across the board. For me, that’s a pretty significant indicator of the values within a space; who do we care about, and are we designing for them?

I think another design concern, if we were to look at representation in the games themselves, is what kinds of experiences are you creating for people. Can I pick up any game and play a person that looks like me? Can I play somebody that looks like Karina? Can I play somebody that looks like you, Erica? And mostly the answer is no. And often when we do have more diverse narratives, they’re very stereotypical. Women are still highly sexualized, and Black and brown characters in particular fall into a few categories — Black and brown men are drug dealers, Black and brown women, they’re usually the help or in a domesticated role. We also see a lot of people of color taking the role of the sidekick, where they’re helping the main character along in the game but hold no significance of their own, or the comedic relief character. These are trends that we see in film and cinema, and they trickle down inside the gaming space as well.

And we’re talking about platforms that can have massive, incredible spaces that are diverse and inclusive, and reflect the beauty of the world around us. And yet, we have such narrow guard rails. In a world where anything is possible, why do we have the same old thing? And oftentimes this isn’t necessarily on the developer; they want to create these beautiful, intricate worlds and give these beautiful experiences, but they also exist and reside inside of capitalist structures where the bottom line is important.

A game called Forspoken recently came out, and it’s a game that has a Black woman as the lead character, and it flopped. It didn’t have any kind of commercial success, but they use that one game as the reason for why we can’t have a Black or a brown woman protagonist. But, how many games do we know of with a white, cis, heterosexual man that have flopped, yet you’re still investing, you know, in that model, right? We would also have to ask what marketing resources were put into this game, and who was on the team that created this experience. In the film industry there are examples like Crazy Rich Asians and Black Panther, where you have entire casts of Asian and Black people that have commercial success. I think people are looking for that inside the gaming space, and that’s the metric that gets used for determining what games are made.

Fitzgerald: You spoke at the National Humanities Center on responsible AI and I’d like to bridge the conversation on digital spaces and the question — which you have already hinted at — who are we building the world for?

Gray: Absolutely — and I would frame the question more as, “What could we create? What kind of world would we create if anything was possible?” We could build these vast worlds, yet we continue to create a digital manifestation of the real world. And this means the treatment of women, of immigrant populations, and so on is reinforced.

One of the things that I wanted to try to drive home at the National Humanities Center is that gaming has actually become the canary in the coal mine for us. If we pay attention to what is happening in the gaming space, I think it’s a sign of things to come. For instance, one of the first examples that I used was the Kinect, which basically was a camera connected to your Xbox that allowed your body to become the controller. It was a great technology for accessibility, because people who use sign language or maybe have chronic pain wouldn’t have to hold a controller in their hands. But the problem that became very apparent early on was that the Kinect could not read Black bodies. If you have darker skin, it would not render your body into the gaming experience.

This happened back in 2010; and then, in 2020, we start seeing reports about Project Greenlight, which basically is a surveillance technology system where cameras can be connected to a police database and utilize facial-recognition technology to track potential criminals. However, the system cannot tell Black and brown people apart; it can’t tell Jerome from Jamal, or Muhammad from Abdul. When we’re thinking about the ethical practices of these technologies and the extent to which they are used, not only are they violating privacy and data privacy, but they are also getting it wrong.

Gamergate is another example I would use, because it basically created the template and strategies on how to have a successful hate and harassment campaign; the use of social media platforms, bots, and misinformation was the precursor to the 2016 election.

If we pay attention to what is happening in gaming spaces, it often gives us foresight into what is coming in the broader tech space.

--



Written by Lincoln Center for Applied Ethics

This Medium site is an archived repository containing stories from ASU's Lincoln Center for Applied Ethics, including the blog series, "On Humane Tech."
