How We Manage Toxicity for Social Apps and Websites

Community Sift
Published Feb 15, 2016

At Community Sift, we believe that the social internet is a positive place with unlimited potential. We also believe that bullying and toxicity are causing harm to real people and causing irreparable damage to social products.

We work with leading game studios and social platforms to find and manage toxic behaviours in their communities. We do this in real time, processing over 1 billion messages a month.

Some interesting facts about toxicity in online communities:

  • According to the Fiksu Index, the cost of acquiring a loyal user is now $4.23, making user acquisition one of the biggest costs to a game.
  • Research on player behavior in online games published by Riot Games indicates that “players are 320% more likely to quit, the more toxicity they experience”

Toxicity hurts everyone.

  • An estimated 1% of a new community is toxic. If that is ignored, the best community members leave and toxicity can grow as high as 20%.
  • If a studio spends $1 million launching its game and a handful of toxic users send destructive messages, their investment is at risk.
  • Addressing the problem early models what the community stands for and what is expected of future members, reducing moderation costs down the road.
  • Behaviour does change. That’s why we’ve created responsive tools that adapt to changing trends and user behaviours. We believe that people are coachable, and have built our technology with this assumption.
  • Even existing communities see an immediate drop in toxicity with the addition of strong tools.

Here’s a little bit about what the Community Sift tools can do to help:

  • More than a Filter: Unlike products that only look for profanity, we have over 1 million human-validated rules and multiple AI systems to seek out bullying, toxicity, racism, fraud, and more.
  • Emphasis on Reputation: Every user has a bad day. The real problem is users who are consistently damaging the community.
  • Reusable Common Sense: Instead of simple regex or blacklist/whitelist matching, we measure severity on a spectrum from extreme good to extreme bad. You can apply the same rules with a different permission level for group chat vs. private chat, and for one game vs. another.
  • Industry Veterans: Our team has made games with over 200 million users and managed a wide variety of communities across multiple languages. We are live and battle-tested on top titles, processing over 1 billion messages a month.

To integrate Community Sift, your backend servers make one simple API call for each message, and we handle all the complexity in our cloud.
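As a rough illustration of what "one API call per message" could look like from a backend, here is a minimal sketch. The endpoint URL, field names, and auth header are placeholder assumptions for illustration only, not Community Sift's actual API.

```python
import json
import urllib.request

API_URL = "https://api.example.com/v1/messages"  # placeholder endpoint, not the real API


def build_request(text: str, user_id: str, channel: str) -> urllib.request.Request:
    """Package one chat message as a JSON POST to the moderation service.

    The payload fields ("text", "user", "channel") are illustrative
    assumptions about what a per-message call might carry.
    """
    body = json.dumps({"text": text, "user": user_id, "channel": channel})
    return urllib.request.Request(
        API_URL,
        data=body.encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer YOUR_API_KEY",  # placeholder credential
        },
        method="POST",
    )


# Build (but do not send) a request for one group-chat message.
req = build_request("hello world", "player-42", "group_chat")
```

Because the call happens server-side, the client application never needs to know the moderation rules; the backend simply forwards each message and acts on the verdict it gets back.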

When toxic users are found, we can:

  • Mask (####) the negative parts of a message
  • Educate the user
  • Reward positive users who are consistently helping others
  • Automatically trigger a temporary mute for regular offenders
  • Escalate for internal review when certain conditions like “past history of toxicity” are met
  • Group toxic users on a server together to help protect new users
  • Provide daily stats, BI reports and analytics
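The actions above can be thought of as a dispatch on the verdict returned for each message. The sketch below assumes a hypothetical response shape with "severity" and "reputation" fields; the real response schema is not documented here, so these names and thresholds are illustrative only.

```python
def choose_action(verdict: dict) -> str:
    """Map a hypothetical moderation verdict to one of the actions listed above.

    Assumed fields (not the real schema):
      severity   -- 0 (benign) to 10 (extreme) for this message
      reputation -- running score for the sender; negative means a
                    consistent pattern of damaging behaviour
    """
    severity = verdict.get("severity", 0)
    reputation = verdict.get("reputation", 0)

    if severity >= 8 and reputation < -5:
        return "escalate"  # past history of toxicity: send to internal review
    if severity >= 8:
        return "mute"      # trigger a temporary mute for a serious message
    if severity >= 5:
        return "mask"      # #### out the negative parts and deliver the rest
    return "allow"         # pass the message through unchanged
```

Keying the harsher actions on reputation rather than on a single message reflects the emphasis above: everyone has a bad day, and the real problem is users who are consistently damaging the community.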

We’d love to show you how we can help protect your social product. Feel free to book a demo with us anytime at communitysift.com.

Originally published at communitysift.com.

We believe that everyone should have the power to share without fear of harassment or abuse. Our platform helps make that possible for social products.