User-Generated Content: The Most Dangerous Threat To Your Company

Jessica Johnson
Companies for Social Change
4 min read · Feb 28, 2014


There’s this little science and tech website, maybe you’ve heard of it: Popular Science. It’s the kind of site where you can read popular stories about gadgets, cars, DIY, and of course… science and technology.

It’s great for catching up on current events, reading product reviews, and stimulating the mind.

They were committed to a website that encouraged thoughtful discussions from commenters.

But, they still decided to shut off comments on their site last year.

They stopped accepting comments because users were attacking other users, using profanity, spamming, and detracting from their really awesome content. They took the position that comments are not only bad for science, but bad for their brand.

That’s what you risk every time you outsource content for your business.

It might be guest posts for your blog. It might be photos for your gallery. It might be videos that you allow readers to embed in your comments.

It’s called user-generated content, and it means your business model is based on crowdsourced labor from untrained, unaccountable, and anonymous workers.

And, it’s a recipe for disaster.

What’s user-generated content, anyway?

User-generated content (UGC) is a term best defined by Margaret Rouse as “published information that an unpaid contributor has provided to a web site. The information might be a photo, video, blog or discussion forum post, poll response or comment made through a social media web site”.

Is UGC really that dangerous?

It is. And, for more than a few reasons.

Images and videos go viral quickly.

UGC is the backbone of social networking sites. In fact, any website that accepts user-uploaded content in the form of images and/or videos functions, in effect, as a social network.

Popular sites like Facebook and YouTube are making it easier and easier to upload, view, and share content, including explicit material.

Unfortunately, their efforts to protect victims of child pornography and online bullying aren’t keeping up.

And when they fail to moderate user-uploaded videos & images, that content can go viral quickly.

People get hurt.

For a lot of celebrities, leaked photos online are embarrassing. When kids are involved, the consequences are often much more severe.

Take, for instance, these recent headlines:

As you can see, there are many examples of the devastating consequences that can result from lackadaisical moderation.

There’s no denying that sexually explicit images & videos are traumatizing when they’re used to hurt, manipulate, or shame the innocent.

We are talking about a serious matter of legality, too. To repeat an earlier point, the distribution of explicit images & video containing minors is illegal and is considered child pornography.

It happens all the time!

One explanation, offered by criminal attorney Brenda Joy Berstein, is that

“this is the generation that likes to post, express, put everything out there for everyone to see, and is enamored with video and photographs.”

More and more people are being victimized by online content each day.

Sure, those identified as responsible for criminal offenses are subject to prosecution under the full extent of the law. Just take a look at these two highly publicized cases:

Prosecution happens less than you’d expect, though. For one thing, Teen Sex, Videos And The Law is a sticky matter. To say that our legal system is overwhelmed is a serious understatement.

There’s no reversing the damage.

In each of the news examples shown earlier, we’re reminded that social networks are “very proactive about removing these things”, and we hear that “as soon as they get a complaint and they see that, it’s immediately removed”.

However, this approach is a far cry from effective moderation. The major problem with this kind of offensive and exploitive material is that once it goes viral, there’s simply no way to keep it from resurfacing, to take the content back, or to undo the damage once it’s done.

So, is UGC bad for your business?

Of course not. It’s unmoderated (or poorly moderated) UGC that’s the problem.

To understand how to effectively moderate user images & videos, it’s helpful to know what types of moderation offer the most control.

Beyond that, it’s also helpful to understand the difference between pre-moderation and post-moderation.

Social networking sites like Facebook, YouTube, and Twitter rely on post-moderation — placing the responsibility on users to moderate offensive content after the fact. While this certainly makes their job easier, it allows for too much collateral damage.

Pre-moderation, on the other hand, stops offensive content from going live in the first place. The obvious disadvantage to this method is that it takes away the instant gratification that so many of us are accustomed to. And, this method is labor intensive.

Images and text tend to be much easier to moderate through automation & web service APIs. Videos, on the other hand, pose much more of a challenge to developers, who often rely on live video moderators for optimal results. That approach is more resource intensive and requires more accountability.
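To make the pre-moderation idea concrete, here is a minimal sketch of an upload queue that checks content before it goes live. The `classify` function, the label names, and the blocklist are all illustrative assumptions; in a real system that function would call an actual image- or text-moderation API rather than reading labels off the upload itself.

```python
from dataclasses import dataclass, field

# Hypothetical labels a moderation classifier might return.
BLOCKLIST = {"spam", "explicit"}

def classify(upload: dict) -> set:
    """Placeholder classifier. A real system would call an image/text
    moderation API here; for this sketch we just read pre-tagged labels."""
    return set(upload.get("labels", []))

@dataclass
class PreModerationQueue:
    published: list = field(default_factory=list)
    held: list = field(default_factory=list)

    def submit(self, upload: dict) -> bool:
        """Pre-moderation: publish only if no blocked label is found;
        otherwise hold the upload for a human moderator."""
        if classify(upload) & BLOCKLIST:
            self.held.append(upload)
            return False
        self.published.append(upload)
        return True

queue = PreModerationQueue()
queue.submit({"id": 1, "labels": ["vacation"]})  # goes live
queue.submit({"id": 2, "labels": ["spam"]})      # held for human review
```

Post-moderation would simply invert this flow: everything publishes immediately, and flagged items are pulled from `published` after the fact, which is exactly why offensive content gets a window in which to spread.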

These limitations mean that business owners are faced with a difficult decision. If you care about the type of content users can upload, you must rely on either good pre- or post-moderation practices… or get rid of comments altogether.

Just know that if YOU stop accepting comments, your competitors aren’t.


Jessica Johnson
Companies for Social Change

Project maven at Secret Weapon, where we help companies become more awesome on the internet.