Tackling Toxicity in Online Gaming Communities

Community Sift
Published Apr 1, 2016 · 3 min read

The gaming industry is making a breakthrough.

For most of its history, internet gaming has been one big free-for-all: a lawless landscape that too often failed users seeking reprieve from the hostility that pervades anonymous environments. Any system left unmaintained develops faults, so it should come as no surprise that many industry leaders are finally ready to stop ignoring the issue and embrace innovative approaches.

As product and game designers, we create social experiences to enrich people’s lives. We believe that social connections can have a profound transformational effect on humanity by giving people the ability to connect with anyone from anywhere. When we take a look at the most popular web products to date — social media, social games, instant messaging — the common denominator becomes apparent: each other. The online world now offers us a whole new way of coming together.

A problem arises, however, when the social environment we are used to operating within is pared down to bare language alone. In the physical world, social conventions and body language guide us through everyday human interaction. Much of our communication happens non-verbally, offering our brains a wider range of data to interpret. Our reactions to potentially misleading messages follow the same logic, driven largely by the rich database of the unconscious mind.

Online, these cues disappear, placing developers who wish to discourage toxic discourse in an awkward position. Should we act quickly and risk misinterpretation, or give users the benefit of the doubt until a moderator can take a closer look? The second option comes with the equally unsavoury proposition of leaving abusive speech unattended for hours at a time, by which point others will have already seen it. With reports showing that users who experience toxicity in an online community are 320% more likely to quit, developers concerned with user retention can no longer afford to look the other way. So what are our options?

Methods for tackling toxicity in community management generally fall into one of two categories: penalty or reward. Typical responses to bad behaviour include warning messages, partial restrictions on game features and, as a final measure, temporary or permanent bans. On the flipside, rewards for exemplary behaviour seem to offer more room for creativity. The multiplayer online battle arena game Defense of the Ancients has a commendation system whereby users can give out up to 6 commendations per week across four options: Friendly, Forgiving, Teaching, or Leadership. Commendable users receive no tangible reward beyond prestige.
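To make those mechanics concrete, here is a minimal sketch of how such a weekly commendation budget might be enforced. It assumes only the figures cited above (a budget of 6 per week and four categories); the class and function names are hypothetical, not DotA’s actual implementation.

```python
from collections import defaultdict
from datetime import datetime, timedelta
from enum import Enum

class Commendation(Enum):
    FRIENDLY = "Friendly"
    FORGIVING = "Forgiving"
    TEACHING = "Teaching"
    LEADERSHIP = "Leadership"

WEEKLY_LIMIT = 6  # the per-week budget cited above

class CommendationTracker:
    """Enforces a rolling weekly commendation budget per giver."""

    def __init__(self):
        self._given = defaultdict(list)  # giver_id -> timestamps of commendations given
        self._received = defaultdict(lambda: defaultdict(int))  # receiver_id -> {kind: count}

    def commend(self, giver_id, receiver_id, kind: Commendation, now=None):
        if giver_id == receiver_id:
            return False  # no self-commendation
        now = now or datetime.utcnow()
        week_ago = now - timedelta(days=7)
        # Keep only commendations given within the rolling seven-day window.
        self._given[giver_id] = [t for t in self._given[giver_id] if t > week_ago]
        if len(self._given[giver_id]) >= WEEKLY_LIMIT:
            return False  # weekly budget exhausted
        self._given[giver_id].append(now)
        self._received[receiver_id][kind] += 1
        return True
```

Capping the budget is what gives each commendation its value: scarcity keeps the signal meaningful rather than letting it inflate away.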

“Personally, [DotA’s commendation system] always incentivized me to try and be helpful in future games simply because leaving a game and feeling like you had a positive impact despite losing feels way better than raging at people and having them threaten to report you,” explains one Reddit user in a discussion thread on commendations in online games.

Another notable example is League of Legends’ recent move to give exclusive skins to users with no bans in the past year. Positive reinforcement, it seems, is fast gaining traction in the gaming industry.

Still, a complex problem requires a complex solution, and toxicity persists in both of these communities. With all the work that goes into creating a successful game, few studios have the time or resources left over to build, perfect, and localize intricate systems of penalty and reward.

The first step is acknowledging two inconvenient truths: context is everything, and our words exist in shades of gray. Even foul language can play a positive role in a community depending on the context. An online world for kids has different needs from a social network for adults, so there’s no one-size-fits-all solution.
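As a toy illustration of that point, the same message can be scored once and then judged against each community’s own tolerance. The lexicon, scores, and thresholds below are invented for the example, not any real product’s policy:

```python
# Invented lexicon and thresholds: purely illustrative.
RISK_SCORES = {"damn": 2, "idiot": 4, "kys": 10}

THRESHOLDS = {
    "kids_world": 1,    # almost nothing gets through
    "teen_game": 4,     # mild language tolerated
    "adult_social": 7,  # only severe abuse is blocked
}

def risk(message: str) -> int:
    """Score a message by its most severe matching phrase."""
    text = message.lower()
    return max((s for phrase, s in RISK_SCORES.items() if phrase in text), default=0)

def allowed(message: str, community: str) -> bool:
    """The same message passes or fails depending on the community."""
    return risk(message) <= THRESHOLDS[community]

print(allowed("damn, good game", "adult_social"))  # True
print(allowed("damn, good game", "kids_world"))    # False
```

The point is that the policy, not the message, is what varies: one scoring pass, many community-specific decisions.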

Competing with the ever-expanding database of the human mind is no easy task, and when it comes to distinguishing between subtle shifts in tone and meaning, machines have historically fallen short. The nuances of human communication make the supervision of online communities a notoriously difficult process to automate. Of course, with greater scale comes a greater need for automation — so what’s a Product Manager to do?
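One common pattern for easing the speed-versus-accuracy dilemma raised earlier, sketched here with illustrative names and thresholds rather than any particular vendor’s approach, is to let automation act only on high-confidence cases and route the ambiguous middle band to human moderators:

```python
def triage(message: str, classifier) -> str:
    """Return 'allow', 'block', or 'review' for a chat message.

    `classifier` is any callable returning an estimated probability
    that the message is toxic; the thresholds are illustrative.
    """
    p_toxic = classifier(message)
    if p_toxic >= 0.95:
        return "block"   # high confidence: act immediately
    if p_toxic <= 0.20:
        return "allow"   # clearly benign: let it through
    return "review"      # ambiguous middle band: queue for a human
```

Where the thresholds sit is a product decision: tighten them and more abuse is caught instantly at the cost of false positives; loosen them and the human review queue grows.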

Community Sift

We believe that everyone should have the power to share without fear of harassment or abuse. Our platform helps make that possible for social products.