Daniel Imbellino
strategic-social-news-wire
9 min read · Sep 22, 2016


YouTube Enlists Users as Moderators With Heroes Program

Without a doubt, Google’s social video sharing platform YouTube has long been fighting a battle to control rampant harassment among its users. With a user base that has now eclipsed the 1.3 billion mark, Google realized it had to find a better way to handle abusers on a video platform that is more complicated than ever to manage. The answer? Create a system of moderation among the platform’s most trusted users through the launch of its new Heroes program! While putting a stronger emphasis on moderation may be the key to YouTube’s harassment woes, the newly minted program leaves as many questions as answers regarding how effective it will be, and whether it may open the doors to more abuse.

For starters, after reviewing the documentation surrounding YouTube’s new Heroes initiative, it’s easy to see that Google is building off the successes of its widely and heavily self-moderated social platform Google Plus (which we’ll touch on more in a minute).

While the program may be the answer Google is looking for to combat harassment, the bulk of YouTube’s user base isn’t so sure, and many are concerned the program could invite more abuse, or worse, destroy the platform so many have come to love.

The announcement video below had a whopping 210,000+ dislikes, compared to fewer than 3,500 likes. The actions of users themselves make their wariness more than apparent, and the community reacted so negatively that Google was forced to turn off the video’s comment section.

The new program works like this: YouTube will enlist well-established users with preexisting channels to police its platform, handing them the power to flag abusive content through the Trusted Flagger program, also known as the YouTube Deputy Program.

YouTube Heroes will also be able to unlock additional abilities and rewards for flagging abusive content and users, adding captions and subtitles to videos, and helping fellow users in the YouTube help community. Heroes will also help moderate content from an exclusive YouTube Heroes community not available to the public. My assumption is that Google will use this community much the same way it uses the spam folders in G+ communities: as a central point of moderation among its supposed Heroes.

The more contributions Heroes make, the more features they can unlock.

The program runs on five tiers:

Level 1. Join the community and get started.

Level 2. Users can learn with exclusive workshops and take part in specially formed Hero Hangouts.

Level 3. Users will gain the ability to mass flag content, and gain access to the Heroes Community where they’ll be able to moderate content collectively.

Level 4. Users gain the ability to contact YouTube staff directly, and unlock access to sneak previews of new product launches.

Level 5. Users can test product releases and apply for the Heroes Summit.
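As a rough illustration, the tier progression above could be modeled as a simple data structure. This is a hypothetical sketch based on the published tier descriptions, not any actual YouTube API; all names here are my own.

```python
# Hypothetical model of the five-tier YouTube Heroes progression described
# above; the structure and perk names are illustrative, not Google's.
HERO_TIERS = {
    1: ["join the community"],
    2: ["exclusive workshops", "Hero Hangouts"],
    3: ["mass flagging", "Heroes Community moderation"],
    4: ["direct contact with YouTube staff", "product sneak previews"],
    5: ["test product releases", "apply for the Heroes Summit"],
}

def unlocked_perks(level: int) -> list[str]:
    """Perks accumulate: everything at or below the user's current level."""
    return [perk for tier in range(1, level + 1) for perk in HERO_TIERS[tier]]
```

The key property the program description implies is accumulation: a Level 3 Hero keeps the Level 2 workshops while gaining the ability to mass flag.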

The requirements state that applicants must have an established channel on the platform, and the program is only open to individual users, not brands, organizations, or businesses.

While opening the doors to wider moderation can definitely help keep the YouTube community clean of harassment, spam, and objectionable content, what scares me most is that it also opens the doors to possible abuse by the very people Google decides to trust to make it work.

Unlike on Google+, those on the YouTube platform don’t have the years of moderating experience that many G+ members do. The other concern is rampant censorship, an issue the bulk of the G+ platform has luckily managed to avoid.

As stated at the beginning of this article, the Heroes program bears striking similarities to the way Google handles moderation on its Google Plus platform. On G+, the network is heavily moderated by the platform’s large community base, using a system that automatically flags questionable content for moderator review, in effect removing that content from the community flow and placing it in a spam folder until a moderator decides what course of action to take.

However, contrary to Google’s statements that it plans to have real Google staff review all content flagged by its Heroes (a claim it has long made about Google Plus as well), I know from experience dealing with our own communities that this is not always the case.

In the case of G+, it appears Google often “whitelists” certain communities and their moderators when it comes to reporting abuse on the platform, meaning Google simply takes their word for it when a user or piece of content is reported.

The question is, does Google plan to do the same with Heroes who are accepted into the YouTube Deputy Program, and will there be a system of accountability for those who attempt to abuse the program through heavy-handed censorship? The last thing we need is a YouTube platform where simple profanity and joking among friends becomes cause for suspension.

With the Google+ platform, if Google has already flagged a post for review in our own communities and we decide to report it, I can assure you, it’s game over for them! I’ve watched my moderators report users and seen their accounts suspended instantly, in a second flat. There is no human review by Google staff as claimed, and it’s obvious why.

The problem is that Google doesn’t have the staff to review millions of flagged posts each day from its massive Communities base. It’s just not humanly possible. On the other hand, reporting a user or post doesn’t necessarily trigger instant account suspension unless the user or their posts were already flagged first by Google itself (that is, they ended up in the spam folder for moderator review).

So there is a system of safeguards in place, although it remains an automated one 99% of the time. If a moderator flags a person or post, that alone is often not enough for Google to take action on the user in question. Usually, the user must have already done something wrong that caused their account to be flagged for abuse beforehand. To some degree this weeds out some potential for abuse, but it doesn’t mean the system can’t still be used to treat users unfairly.
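The safeguard described in the last few paragraphs can be sketched as a small decision function. This is purely a conceptual model of my observations, not Google’s actual implementation; the outcome strings and parameter names are my own.

```python
# Conceptual model of the G+ reporting safeguard described above: a trusted
# moderator's report only triggers instant suspension when Google's automated
# system had already flagged the account or post.
def report_outcome(already_flagged_by_google: bool,
                   reported_by_trusted_mod: bool) -> str:
    if already_flagged_by_google and reported_by_trusted_mod:
        # The post was already sitting in the spam folder; a trusted report
        # confirms it, and suspension is effectively automatic.
        return "instant suspension"
    if already_flagged_by_google:
        # Auto-flagged content waits in the spam folder until a moderator
        # decides what to do with it.
        return "held in spam folder pending moderator review"
    if reported_by_trusted_mod:
        # A report alone usually isn't enough: the account is merely watched,
        # with its future posts flagged for moderator review.
        return "future posts flagged for review"
    return "no action"
```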

For instance, on Google+, if a community user uses profanity and a trusted moderator reports it: instant suspension for two weeks! Again, I know this from experience, especially given that we’ve banned, reported, and removed over 100,000 users over the course of three and a half years.

As for Google’s claims of human review, you can forget that altogether. As if they really reviewed the 450 posts that were flagged as spam in one of our communities in a single day last week, of which I removed and rejected 90%, banning and reporting over 100 users in one sitting for violating either Google’s guidelines or ours. I’m just not buying Google’s statements there.

I also know from experience that even if a user did nothing wrong, if a trusted moderator reports and bans them, Google will respond by flagging that user’s future community posts for moderator review indefinitely, especially if the account is brand new.

You see, Google doesn’t always know how to spot troublemakers, and for this reason it often relies on other users to report them. When it comes to G+, the whole system of moderation is built on trust between users, moderators, and Google itself.

The point is this: Google puts a lot of trust in our hands! The really scary part is that when it comes to what the platform deems harassment, it’s up to the moderators to decide. Again, simple profanity can get an account suspended if a moderator chooses to report a user Google is already suspicious of.

Again, while the system has long worked for G+, and I haven’t seen heavy censorship on the platform, how do we know YouTube’s new moderators won’t take their newfound powers too far?

For one, the moderators of our communities won’t report, ban, or remove posts for simple profanity, as we strongly believe in freedom of speech and expression. We do report and remove users who threaten or harass others, and we also ban users and delete posts that use excessive profanity or portray sexually suggestive themes (although warnings are given first).

The problem is, I know from spending time on platforms like Quora that some people clearly can’t differentiate between someone expressing a strong conviction, someone joking, and actual harassment. To them, it’s all one and the same, even though anyone who isn’t a complete idiot knows there’s a huge difference.

For instance, Quora moderation has a habit of threatening or banning users for simply getting into a disagreement with each other, or in many cases for expressing ideas the moderators themselves don’t share. This type of heavy-handed moderation is a cancer we’ve long worked to keep out of our Google Plus communities, and it is something neither I nor my media organization will ever tolerate.

What we need to realize is that, when it comes to social media, people aren’t always going to see eye to eye, and if people want to argue their points, they should have the right to do so.

The real problem lies in the fact that humans are tasked with being the judge, and we can only hope Google will perform its due diligence and vet these moderators properly with human reviewers before letting them run free in the wild. While I and others on the G+ platform are intelligent enough to recognize the difference between strong convictions and flat-out harassment, that doesn’t necessarily mean those Google affords the power to swing the ban hammer on YouTube will do the same.

Will The Benefits Outweigh the Drawbacks?

I honestly think they will. Google just needs to get the right people involved, people who intend to do what’s best for the YouTube platform and its users, including acknowledging users’ rights to freedom of speech and expression, which is my biggest concern. The last thing we need is for YouTube to turn into another Quora. I would absolutely flip out.

Conclusion:

I’m all for Google’s attempts to better police the YouTube platform in a manner that respects the rights of everyone involved (meaning without unfairly censoring the public’s ideas or opinions). While many on YouTube have expressed concerns about this new program, I say we should all embrace it as a way forward in improving the social experience YouTube offers. After all, harassment on the platform has clearly gotten out of hand, and people have a right to be treated with dignity and respect, without the unjustified threat of harassment.

I also believe there needs to be a system of accountability, just as we’ve seen with Google Plus, to help ensure users aren’t being unfairly reprimanded. On G+, no one is above the law: moderators, community owners, and everyone involved is expected to uphold Google’s guidelines. In the case of our communities, we expect our moderators to set the example for others, and YouTube’s moderation should be held to the same high bar.

And yes, we’ve had cases where moderators abused their power, and yes, we took action to remove them. This is why there needs to be a system of checks and balances to ensure no one is operating with impunity.

As a final note to the YouTube community: I am just as concerned as you are. But rest assured, if Google runs afoul of its originally intended mission, it will be held accountable. If there’s one thing we all can’t stand, it’s censorship, and in the long run we must work together to ensure everyone involved is treated fairly and respectfully, and that anyone who abuses a position of power is held to account for it.

Written and published by Daniel Imbellino — Co-Founder of Strategic Social Networking and pctechauthority.com. Many thanks for reading. Be sure to check out Strategic Social Networking Community on Google+ to connect with tens of thousands of IT professionals and learn effective strategies to grow your social presence online. You’re also welcome to follow Strategic’s brand page on G+ for the latest social media and IT industry news.

Additional Resources:

More on the YouTube Heroes Program




Information Technology Specialist — Co-Founder of Strategic Social Networking and www.pctechauthority.com