Your Ally in Community Management

Renee Gittins
Spirit AI
6 min read · Mar 1, 2019

Encountering toxic speech and behavior online is not an uncommon experience. Nearly half of Americans have experienced harassment online and 62% see it as a major problem. The number of people affected by this toxicity is increasing as connectivity and online communities grow.

As a gamer and game developer, I have the utmost respect for those who wrangle the twisting mass of online communities. Without proper community management, a game or platform can quickly earn a bad reputation, tarnishing the public perception of the developer and project.

It is extremely important to me that online communities are approachable and welcoming for all. After recognizing the absence of community management in many virtual reality communities, I wrote a white paper about the best practices for developers to empower and protect their users from harassment. Providing tools for developers to assist their communities is one way that I can help improve the experience of many.

Still, community moderation is a tricky task; the moderator must have a deep understanding of the environment and of the relationships between users to properly identify “bad” behaviors and to recognize benign behavior between friends. If your moderation system only catches harassment and abuse that trips a few core words or phrases, then countless cases will never be flagged to moderators. And encountering toxic behavior causes players to stop participating and leave the community.

When you are handling a community of millions, how can you quickly identify harassment within proper context? We often rely on community self-moderation in the form of reporting to identify subtle harassment, but many players, especially new players, are quick to leave without letting us know in the face of abuse.

I would like to introduce an Ally to help.

Spirit AI’s Ally is a community management tool that provides new ways to quickly assess your community and its members — from harassment to community sentiment. Ally uses cutting-edge artificial intelligence, but, most importantly, it understands your community and its interactions in context and streamlines the work of your community management team.

Contextual Understanding

Ally monitors community interactions contextually. It studies not only the intent of the words users speak, but also their actions and the reactions of those with whom they are communicating.

For example, there is a big difference between a close friend saying “Hey pretty lady!” and a stranger saying “Hey pretty lady!”. Unlike simple profanity detection, Ally takes the context of the situation into account.
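As a rough sketch of that idea (in Python, with hypothetical function and variable names; Ally’s actual models and API are not public), the same phrase can be weighed differently depending on the relationship between sender and recipient:

```python
# Illustrative sketch only: Ally's actual models and API are not public.
# It shows the general idea that the same phrase is weighed differently
# depending on the relationship between sender and recipient.

def is_flirtatious(message: str) -> bool:
    # Stand-in for a real intent classifier.
    return "pretty lady" in message.lower()

def should_flag(message: str, sender_id: str, recipient_id: str, are_friends) -> bool:
    """Flag a flirtatious greeting only when it comes from a stranger."""
    if not is_flirtatious(message):
        return False
    # The same words from a close friend are likely benign banter.
    return not are_friends(sender_id, recipient_id)

# A stranger's "Hey pretty lady!" gets flagged; a close friend's does not.
friendships = {("alice", "bea")}
are_friends = lambda a, b: (a, b) in friendships or (b, a) in friendships
print(should_flag("Hey pretty lady!", "stranger42", "bea", are_friends))  # True
print(should_flag("Hey pretty lady!", "alice", "bea", are_friends))       # False
```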

Deeper Context

While the context of a conversation is very important to understanding interactions between users, so are the users’ actions. Ally integrates into a platform or game to monitor not only the language between users, but also their actions and reactions. It can detect whether players are in the same guild, whether a person leaves the room after an unpleasant comment, and whether someone reacts negatively to a potentially offensive comment.
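To illustrate the kind of behavioral signals a platform might feed into such a system, here is a hedged sketch; the event shapes and field names are assumptions made for this example, not Ally’s actual integration format.

```python
# Hypothetical event payloads a game might send to a moderation tool so that
# actions and reactions can be weighed alongside chat. The class and field
# names here are illustrative assumptions, not Ally's integration format.

from dataclasses import dataclass
from typing import Optional

@dataclass
class ChatEvent:
    sender_id: str
    recipient_id: str
    room_id: str
    text: str
    same_guild: bool            # relationship signal

@dataclass
class ReactionEvent:
    actor_id: str
    room_id: str
    kind: str                   # e.g. "left_room", "blocked_user", "reported"
    seconds_after_message: Optional[float] = None

# A target leaving the room seconds after a message is a strong negative signal.
events = [
    ChatEvent("stranger42", "bea", "lobby-3", "Hey pretty lady!", same_guild=False),
    ReactionEvent("bea", "lobby-3", "left_room", seconds_after_message=4.0),
]
```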

This additional contextual understanding has identified users who harass large numbers of other users with overly flirtatious greetings. These greetings caused their targets to leave the room or chat they were in, with many victims never returning to the host platform again.

These serial harassers often go undetected, chasing countless others away every day they are on the platform. Others reach out to a wide set of users to lead into financial grooming conversations. Because Ally understands the context of these relationships and interactions, these harassers can be identified and handled appropriately by the moderation team.

A Positive Note

Not only does Ally help identify negative interactions and reactions, but it looks at positive interactions as well, assisting not only with community moderation but with overall community management.

Ally can identify players’ praise for features and changes, highlighting the community’s general sentiment toward them. It can also detect the praise players give each other, allowing you to monitor and reward positive behavior.

Streamlined Management

Ally was built to empower community managers and moderators, enabling them to spend more time on building and improving their community. Ally helps your community team work effectively and with reduced mental and emotional costs by streamlining the moderation process. It gives you more time and energy to focus on maximizing the good of your community.

Bad Actors

Community members are truly the subject of moderation, not individual events. Ally creates reports based on the users themselves, termed “actors”. Each actor has their own history, built from the stories of events they participated in and any actions that were taken against them. You can review the timeline of an actor’s behavior before making a judgement based on their most recent interactions.

Each actor in the report also has their own randomly generated name. This generated name reduces bias, but remains consistent so that moderators can recognize and discuss patterns of activity from a particular user more easily.
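Spirit AI has not published how these aliases are produced, but the idea can be sketched roughly as follows, with the naming scheme and report fields below being illustrative assumptions:

```python
# Assumed scheme for illustration: a deterministic alias derived from the real
# user id stays consistent across reports without exposing the actual handle.

import hashlib

ADJECTIVES = ["Amber", "Quiet", "Swift", "Cobalt"]
ANIMALS = ["Falcon", "Otter", "Lynx", "Heron"]

def pseudonym(user_id: str) -> str:
    """Same user id always maps to the same alias, so moderators can
    recognize repeat patterns without seeing the real name."""
    digest = hashlib.sha256(user_id.encode()).digest()
    return ADJECTIVES[digest[0] % len(ADJECTIVES)] + ANIMALS[digest[1] % len(ANIMALS)]

# An actor-centric report: the timeline travels with the actor, not the event.
actor_report = {
    "actor": pseudonym("stranger42"),
    "history": [
        {"incident": "flirtatious greeting; target left the room", "action_taken": None},
        {"incident": "repeat greeting to a new target", "action_taken": "warning"},
    ],
}
```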

Moderation Integration

Ally can hook directly into your own moderation systems, allowing you to institute disciplinary actions, review messages, and track behavior patterns all from a single interface. No more juggling between tickets and database interfaces!
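What that hookup looks like depends on your own tooling, but conceptually it might resemble the sketch below, where a categorized report is mapped to a disciplinary action in a studio’s own backend; the report shape, category strings, and method names are assumptions, not Spirit AI’s actual API.

```python
# Illustrative glue code: how a studio's own moderation backend might consume a
# categorized report. The report shape, category strings, and method names are
# assumptions for this sketch, not Spirit AI's actual API.

def apply_moderation_action(report: dict, backend) -> None:
    """Map a categorized report to a disciplinary action in your own system."""
    actor = report["actor"]
    category = report["category"]
    if category in ("hate_speech", "grooming"):
        backend.suspend(actor, days=30)
    elif category in ("harassment_bullying", "spam_bot"):
        backend.mute(actor, hours=24)
    backend.log_incident(actor, report)  # keep the actor's timeline up to date

class ConsoleBackend:
    """Stand-in for a real ticketing or moderation service."""
    def suspend(self, actor, days): print(f"suspend {actor} for {days} days")
    def mute(self, actor, hours): print(f"mute {actor} for {hours} hours")
    def log_incident(self, actor, report): print(f"logged incident for {actor}")

apply_moderation_action({"actor": "AmberFalcon", "category": "harassment_bullying"}, ConsoleBackend())
```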

Categorization and Highlights

When Ally reviews messages, it detects several categories of abuse, harassment, and other behaviors, and this set can be expanded to watch for other areas of trouble as well.

The base categories that Ally monitors are:

  • Harassment and bullying
  • Hate speech
  • Grooming
  • Spam bots
  • Self-harm threats
  • Community sentiment
  • Unsolicited content

Each incident that falls into one of these categories is logged and raised by Ally for community moderators to review, with the relevant encounters and actions highlighted. New features and categories are being added as those needs are identified, ensuring that Ally is always tracking the biggest community concerns.
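Purely for illustration, the base categories above and a logged incident might be represented along these lines; the structure and field names are assumptions, not Ally’s actual data model.

```python
# For illustration only: one way the base categories and a logged incident
# could be represented. Structure and names are assumptions, not Ally's data model.

from enum import Enum

class Category(Enum):
    HARASSMENT_BULLYING = "harassment_bullying"
    HATE_SPEECH = "hate_speech"
    GROOMING = "grooming"
    SPAM_BOT = "spam_bot"
    SELF_HARM_THREAT = "self_harm_threat"
    COMMUNITY_SENTIMENT = "community_sentiment"
    UNSOLICITED_CONTENT = "unsolicited_content"

# An incident is logged with the relevant encounters highlighted for review.
incident = {
    "actor": "AmberFalcon",
    "category": Category.HARASSMENT_BULLYING,
    "highlights": ["repeated flirtatious greetings", "target left the room"],
    "needs_review": True,
}
```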

Emotional Labor

A piece of advice I commonly hear is “Don’t look at the comments.” Toxicity, even when not aimed directly at you, can quickly wear you down. However, when it’s your job to review the comments for this toxicity, such content is hard to avoid, and it takes its toll. There have even been accounts of content reviewers developing PTSD.

Ally creates stories of each incident that can be reviewed at a high level before, or even without, diving into the offensive content itself. While Ally is not a replacement for community management, it can reduce the amount of offensive content community moderators have to view directly, allowing them to spot and handle the most egregious offenders from their stories alone, then dig in and use personal judgement for more nuanced cases.

Your Ally

We at Spirit AI are passionate about helping communities thrive. We want to empower community managers and moderators to have the best possible effect on their communities.

If you have any recommendations on how to make Ally a more effective tool or want to discuss Ally with us, please reach out! We’re always available at hello@spiritai.com.

Some of our team (myself included) will be at GDC as well! So let me know if you would like to talk more about how we can make online communities safe and accessible for everyone.


Renee Gittins
Spirit AI

Renee is a Solutions Architect at Spirit AI and a passionate advocate and connector for developers and diversity in the game industry.