Moderation: creative deletion
Having established, just two days ago, for pity’s sake, that Mondays were for Social Media (amongst other work stuff) and that Wednesdays were going to be about politics, Something Interesting Has Happened. You are thus saved (for now) from my swivel-eyed socialism, in favour of some gushing praise towards my former employer.
No, not that one.
The British newspaper the Guardian has published an amazing look under the covers of online commenting. They have researched the 70 million comments that have appeared on their website’s articles, including the 1.2 million their moderators deleted, and found that… well, all this engagement stuff comes with a price.
I’ll quickly add a mea culpa here. Some small proportion of those 1.2 million rejected bon mots were down to me. I mean I rejected them when I worked at the Guardian in 2007. I didn’t write them all. What kind of monster do you think I am?
I have also rejected comments on other news sites. And on non-news websites. And on Facebook. I am a moderator, and that is what we moderators do.
Except it isn’t. Except where it is.
From the moment that some bright social-media-savvy bod starts talking about looking for engagement with their customers/audience/fans, the sharks start circling. Haters gotta hate. Trolls will troll. Misogyny, racism, porn, personal abuse, grooming… they are all shuffling into position, waiting to pollute the very idea that we can ever allow people to talk.
“But not on my page”, I hear the cry. “I just want to talk with people who share my love of pizza!”
There was once a campaign, just a nice touchy-feely-eaty send-us-pictures-of-your-pizza piece of marketing fluff. Only it didn’t just attract pictures of pizza, if you hear what I’m sayin’. Two other five-letter words beginning with “p” come to mind.
None of the offending images were published, of course, because my other employers — this time I do mean them — were on the case, and the project was safely pre-moderated. Kudos to the client for being cautious. Not all of them are.
But moderation isn’t just this defensive, block the spam, block the porn, block the f-word stuff. It is also an incredibly enabling agency that is a powerful part of making this fantastically open era of citizen journalism, of activism, of discourse, a working reality. It’s like… the difference between an unedited first draft and a final published glorious thing. Beauty and clarity and truth get added… as stuff gets taken away.
Read the Guardian article, and especially where they talk about their rules of engagement, their T&Cs and community standards, and you’ll see that as well as protecting their brand and their users from the 1.2 in 70 comments that just don’t belong, the goal of the Guardian moderator is to add value. They keep conversations on topic, they prevent the loud voices from crushing all other opinion, they use sensitive, human appreciation of the conversation to help it breathe.
Truly, a huge pat on the back to my former employers for this insight. I’ll be back in touch about my consultancy fee.
To anyone else out there, especially, perhaps, those who don’t think their project needs such care, or who view it as censorship… think about what a huge commitment this organisation, and others like it, has made, paying for all those 1.2 million presses of the delete key. People like me don’t come cheap. OK, we do. I do.
Yes, they prevented us all from seeing some ugly opinions, and that’s the very obvious point of comment/user moderation, even if some user of Reddit is typing as we speak about how we don’t need to be protected from the vile.
That’s a conversation we can have, but those Guardian moderators also made it so that the other 68.8 million comments were able to say “come in, it’s cool; did you see something you were interested in? Do you have something to say?”
Worth every penny.