The State of Platform Trust & Safety Standards

Carrie Melissa Jones
Community Building and Strategy
8 min read · Sep 13, 2018

By 2018, the world’s largest technology platforms faced scrutiny over their trust and safety policies. It’s no surprise: in the last few years, these platforms have been home to widespread misinformation campaigns, hate speech, data breaches, and beyond.

In 2018 alone, Mark Zuckerberg has testified before US Senators about Facebook’s data privacy and safety policies, Twitter still has a Nazi problem, and Twitch has proactively revamped its Community Guidelines.

Platforms of any size require a complicated mix of human and automated moderation, escalation processes, and enforcement partnerships, and these systems must be created proactively. Any platform focused on bringing people together must pay close attention to the state of trust and safety today.

So what do the trust and safety standards of major platforms currently have in common?

I’ve been seeking to understand this question deeply over the last four months. Earlier this year, alongside a platform client with over 100M monthly active users that is revamping its policies, I started digging into dozens of tech platforms’ Trust & Safety (T&S) policies. I sifted through and analyzed the public policies of 15 companies: Airbnb, Amazon, DIY.org, Etsy, Facebook, Flickr, Instagram, Kickstarter, Lyft, OfferUp, Quora, Rover, StackExchange, Twitch, and Yahoo! Answers. My criteria were that each platform needed to have existed for more than six years (so it was mature enough to have learned some hard lessons), have at least 500,000 users, and contain either a set of Community Guidelines or clearly labeled “Trust & Safety” policies. Taken together, I wanted the set to cover platforms serving a diverse array of ages (from age 6 upward), industries, and use cases.

Once I’d finished analyzing the common components and approaches of these platforms’ policies, I started to see clear patterns for today’s T&S policies. I wanted to open-source this knowledge for others for three reasons:

  1. To give community builders a proven framework for their own policies.
  2. To allow anyone to see how platform companies are presenting their T&S policies as a whole.
  3. To encourage anyone to rally for more proactive trust-building and safety policies by the platforms that define so much of modern online life.

Key Terms

Before we dive into the nature of today’s largest platforms’ T&S policies, let’s define the key components and terms. The most common building blocks of today’s Trust & Safety policies are missions or purpose statements, values, guiding principles, guidelines, and rules.

  • Mission or Purpose: The formal statement of the goals or aims of a platform.
  • Values: The standards of behavior that guide all actions on the platform and decisions made by the platform company. These can also be called Guiding Principles, which tend to illuminate further how values should be interpreted on the platform.
  • Community Guidelines: The norms that define proper conduct on a platform.
  • Rules: A listing of what is not allowed on a platform. Unlike guidelines, rules are typically written in the negative rather than the affirmative, to ensure clear guardrails are in place for moderation.

Structure of Standards Overall

Today, most platforms include all of the above components, in this general structure:

  1. Overview of Mission or Purpose of Platform: I’ve discussed this concept in depth before and won’t delve in further here.
  2. Values or Ideal Behaviors: These form the foundation of the rest of the standards.
  3. Some Combination of Community Guidelines and Rules: How best to write these is the least agreed-upon aspect of T&S standards today. Sometimes T&S standards are limited to standalone Community Guidelines, but often, Community Guidelines are just one part of more complex T&S policies.
  4. Resources and Tools: Some transparency into how the platform keeps its users safe.
Airbnb provides a useful blueprint for best practices at https://www.airbnb.com/trust/standards.
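
To make this general structure easier to adapt, here is a minimal sketch of how the four components might be modeled as a simple document skeleton, in Python. The field names and sample values are my own illustrations (loosely echoing the Airbnb example above), not taken from any platform’s actual policy:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Value:
        name: str          # e.g. "Safety"
        description: str   # what this value means on the platform
        guidelines: List[str] = field(default_factory=list)  # norms, written in the affirmative
        rules: List[str] = field(default_factory=list)       # explicit "do not" rules

    @dataclass
    class TrustAndSafetyPolicy:
        mission: str          # overview of the platform's purpose
        values: List[Value]   # typically four to seven values or ideal behaviors
        resources: List[str]  # public tools, help docs, and hotlines

    # Hypothetical example, loosely modeled on the nested structure described above
    policy = TrustAndSafetyPolicy(
        mission="Bring people together safely.",
        values=[
            Value(
                name="Safety",
                description="Members should never fear for their wellbeing.",
                guidelines=["Communicate openly before meeting in person."],
                rules=["No threats of harm.", "No weapons in listings."],
            ),
        ],
        resources=["Help center", "National Suicide Prevention Lifeline"],
    )

Nesting rules under the value they protect, as Airbnb does, keeps every rule traceable back to a reason, which makes moderation decisions easier to explain.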

Values or Ideal Behaviors

Today, most platforms include overarching values of their communities, which form the backbone of all the other decisions they make about what they allow and what they do not allow on their platforms.

When these platforms do include explicit values or behaviors, they tend to limit them to between four and seven, with an average of 4.8.

Yahoo! Answers includes four ideal behaviors inside their Community Guidelines

To use Airbnb as an example again: Airbnb’s values are safety, security, fairness, authenticity, and reliability. They nest their rules inside of these values, rather than separating them out. Here is their “safety” value and the corresponding rules that fall under upholding this value:

Airbnb’s value of safety then includes corresponding descriptions and guidelines

The examples above are best in class. Platforms like Instagram and Amazon nest their values inside of other rules, in a way that covers their bases but lacks a clear organizational structure.

Amazon’s Community Guidelines, which mix ideal behaviors with explicit rules on one text-only page. The structure is less clear for the casual reader, but then very few people read these policies casually.

Combination of Community Guidelines & Rules

Many (but not most) of today’s large platforms have Community Guidelines written in the affirmative, nested inside of larger Trust & Safety policies. Many two-sided marketplaces split these Guidelines into two separate sets of standards, one for the supply-side and one for the demand-side.

Thus, a trend emerges for large two-sided platforms — either create one set of standards for a community where crossover of member needs from supply into demand is high or create two separate sets of standards if there is little crossover in your membership. The decision about which way to go will depend on the dynamics of your particular platform.

Two-Sided Example: Etsy

Etsy separates out their rules by buyers and sellers.

Etsy has separate “house rules” for buyers and sellers, and it separates these house rules from its trust page, which details what Etsy does for both sides of the market: https://www.etsy.com/trust.

Comprehensive Example: DIY.org

DIY.org’s purpose is to gather young makers together, and there is no two-sided marketplace inherent in this community. One complete set of policies suffices.

DIY.org’s guidelines and safety standards, reinforced by their community team. A detailed write-up about their policies is available here: https://medium.com/berkman-klein-center/building-a-safe-digital-space-for-young-makers-and-learners-the-case-of-diy-org-7c7457b603e9. (And thank you to Becky Margraf, formerly of their team, who helped me understand their T&S policies for younger users!)

Rules: A Necessary Nuisance

Most platforms today include rules as well as overarching guidelines, or instead of guidelines. As a community builder, my preference has always been toward guidelines, supplementing with clear rules where necessary. Yet as annoying as it may be to write and read formal rules on a platform, they’re essential because they provide clarity for users and moderators.

Almost all platforms have at least some section that lists rules for what is NOT allowed (see Etsy, above). These rules are sometimes cascaded into help documentation, as is the case with YouTube’s documentation or with Quizlet’s Guidelines as a whole. They can also be included only in the Terms of Service, though this is not a best practice (adding them to the TOS in addition to other locations, however, is not a bad idea).

Most Common Rules

I drilled down into all the rules of these 15 platforms and found that the most common rules regulate the following types of actions (a rough sketch of how these categories might be encoded follows the list):

  • Copyright or intellectual property violations (most common of all)
  • Illegal activity
  • Any harassment or hateful conduct, including discrimination, hate speech, bullying, or targeted attacks
  • Impersonating other users (less of an overt problem on pseudonymous platforms, but common due to the number of real-identity platforms included in my analysis)
  • Spam and scams
  • Sexual content or nudity
  • Any violent conduct: extreme violence, graphic violence, threats, gore, obscenities
  • Harming others or acting maliciously
  • Sharing of private information (a.k.a. doxxing)
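
If you are encoding rules like these for moderation tooling, a simple taxonomy is often enough to start. The category names below mirror the list above; the severities and default actions are hypothetical placeholders, not any platform’s actual policy:

    # Illustrative rule taxonomy; severity levels and default actions are assumptions, not standards.
    RULE_CATEGORIES = {
        "ip_violation":        {"severity": "medium", "default_action": "remove_content"},
        "illegal_activity":    {"severity": "high",   "default_action": "remove_and_report"},
        "harassment":          {"severity": "high",   "default_action": "remove_and_warn"},
        "impersonation":       {"severity": "medium", "default_action": "suspend_pending_review"},
        "spam_or_scam":        {"severity": "low",    "default_action": "remove_content"},
        "sexual_content":      {"severity": "medium", "default_action": "remove_content"},
        "violence_or_threats": {"severity": "high",   "default_action": "remove_and_escalate"},
        "malicious_harm":      {"severity": "high",   "default_action": "remove_and_escalate"},
        "doxxing":             {"severity": "high",   "default_action": "remove_and_suspend"},
    }

    def default_action(category: str) -> str:
        """Look up the default moderation action for a reported rule category."""
        return RULE_CATEGORIES.get(category, {}).get("default_action", "queue_for_human_review")

Even a flat mapping like this forces a team to decide, in advance, which violations warrant removal versus escalation, which is exactly the clarity that written rules give moderators.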

Tools & Resources

Most platforms are not public about the escalation processes that they use internally to maintain strong trust and safety standards. However, they may provide some general information about how they make decisions. That has become more popular for large platforms like Facebook, which published its internal guidelines this year after they were leaked and criticized.

This is also the case for YouTube:

YouTube explains what happens when your content is moderated

Some platforms also expose the resources and tools that they refer to in their work; again, the example of YouTube is helpful here:

Making these resources public provides useful help for all, both users and non-users of the platform. Instagram does this as well.

In addition, while most internal escalation processes are not public, Taringa! platform founder Gino Cingolani shared theirs, which provides a useful example of a moderation process at work:

From Gino Cingolani of Taringa!

If you are starting a community from scratch, keep in mind that you will need to create these escalation processes sooner rather than later in your journey, and it helps to plan early for automation and computer-assisted moderation.
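
As a rough illustration of what planning for automation can look like, here is a minimal sketch of a report-escalation flow: an automated classifier scores each report, clear-cut cases are handled automatically, and anything ambiguous is routed to a human moderator. The thresholds and the classifier interface are invented for the example:

    # Minimal escalation sketch: automated triage first, humans for everything ambiguous.
    AUTO_REMOVE_THRESHOLD = 0.95    # assumed cutoff; tune for your platform and tolerance for error
    AUTO_DISMISS_THRESHOLD = 0.05

    def triage_report(report, classifier):
        score = classifier.score(report)    # estimated probability that the content violates a rule
        if score >= AUTO_REMOVE_THRESHOLD:
            return "auto_remove"            # high-confidence violation: act immediately
        if score <= AUTO_DISMISS_THRESHOLD:
            return "auto_dismiss"           # high-confidence non-violation: close the report
        return "human_review_queue"         # ambiguous: escalate to a moderator

The point is not the specific thresholds but the shape of the process: automation handles volume, while humans make the judgment calls that written guidelines cannot anticipate.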

We Can Do Better

Trust & Safety standards are becoming more robust and well-defined, but platform creators, trust and safety specialists, and community builders need to take what already exists and build on it to make platforms even safer, more inclusive, and more responsive to conflict. Data science and automated moderation can aid us in our work, but they will not solve everything. As I’ve advocated before, the road to better trust and safety lies not in overarching standards, but in fostering smaller communities.

And, of course, our work isn’t done by copy-catting the work of others, but rather by looking at the needs of our community members, designing for the inclusion of all who share our community’s values (and ensuring our leadership reflects that inclusion), and fighting to make digital communities safer, more transparent, and more accessible to all.

Further Reading and Recommendations:

  1. For details on how platforms’ moderation teams make decisions and the political implications of those decisions, check out Tarleton Gillespie’s Custodians of the Internet
  2. Make it human — include images of humans if possible, as Airbnb does
  3. National Suicide Prevention Lifeline is one of the most popular tools referred to in these T&S standards
  4. The Trevor Project specializes in suicide prevention for LGBT youth and offers a lifeline that people in the US can refer to (many other country-specific resources exist for marginalized communities as well — check out Instagram and YouTube’s T&S pages for a good starting point to learn more)
  5. Proactive community conflict management — a talk I gave at Facebook’s Communities Summit in 2017
  6. Law enforcement and local organizations: partnering with local law enforcement and other partner organizations is a best practice among platforms internally
  7. I didn’t even get into the politics of enforcing these rules and moderation (and the lack of consistency in how platforms interpret them). There is an entire field of academic discipline studying this right now. I highly recommend following the work of Casey Fiesler and her researchers at CU Boulder as well as Kat Lo at UC Irvine.

Want to push platforms to do more? It starts with you. What is one change you can make to your Trust & Safety policy at your organization to make your platform safer for members and proactive in its work to make the world a safer, more ethical place? What is one change you can advocate for on a platform where you spend your time online?

Carrie Melissa Jones
Community Building and Strategy

I research and write about the structures, problems, and positive impacts of online communities.