#WhoseTube — 1 Year Later: Exposing YouTube’s role in the spread of disinformation and anti-Muslim bigotry
by Ramah Kudaimi, Crescendo Campaign Director
“As YouTube has become bigger and has had more of an impact, we’ve seen a need to increase what we’re doing from a responsibility standpoint,” CEO Susan Wojcicki said last April as she received an award from the Freedom Forum Institute.
She said this just a month after we had sent her a letter, signed by more than 20 organizations, outlining how YouTube continues to platform anti-Muslim, white supremacist, and right-wing content, content of the kind that three years ago inspired a shooter to enter two mosques in Christchurch, New Zealand and kill over 50 worshippers.
A year later, we have yet to get a response from YouTube and we know the problem continues.
Videos spreading disinformation and propaganda against Muslim communities across the globe are still posted and shared. And videos from activists documenting human rights violations and organizing against state violence have been removed or restricted. Those targeted include Uyghur, Palestinian, and Kashmiri people, all of whom continue to face state violence through settler colonialism, mass surveillance, and genocide.
We know social media platforms don’t only have an impact online: research, for example, has found Facebook responsible for inciting violence against the Rohingya people. Our WhoseTube campaign is exposing the role YouTube is playing in allowing the spread of anti-Muslim bigotry and violence. As long as Islamophobes, white supremacists, and right-wingers remain free to post, we need to question who this platform serves.
Last month, Wojcicki acknowledged that decisions to remove videos and suspend accounts have been controversial when the content in question is deemed harmful but is not illegal. But this isn’t a freedom of speech issue. This is about YouTube profiting from allowing users to spread Islamophobic rhetoric and disinformation against Muslim communities, oftentimes in service of policies and violence targeting Muslims.
We are demanding that YouTube put people over profits. Google, which owns YouTube, made more than $28 billion in revenue last year from YouTube ads alone. Google makes money every time someone stays on YouTube longer, watching more and more videos. So just as YouTube recommends more videos based on our music interests or more shows based on our viewing history, it keeps white supremacists engaged on the site longer by recommending ever more violent and hate-filled videos. This business model depends on promoting right-wing videos.
In her 2022 letter, Wojcicki wrote about protecting the YouTube community: “It’s our top priority to live up to our responsibility.” And yet the solutions she outlines fall far short. The new metric YouTube released, the Violative View Rate (VVR), doesn’t capture the extent of the problem. VVR tracks what percentage of views on YouTube comes from content that violates its policies. In Q3 2021, YouTube’s VVR was 0.09 to 0.11 percent, which means that out of every 10,000 views on YouTube, 9 to 11 came from violative content.
That sounds like a minuscule number, but when you consider there are billions of views on YouTube every year, this in fact means millions of people are still watching content that violates YouTube’s own policies.
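The scale hiding behind that small percentage can be sketched with simple arithmetic. The total-views figure below is an assumed, illustrative round number, not an official YouTube statistic:

```python
# Illustrative sketch: converting a Violative View Rate (VVR)
# into absolute view counts. The total_views value is a
# hypothetical round number chosen only for illustration.
total_views = 1_000_000_000_000       # assume 1 trillion views in a period
vvr_low, vvr_high = 0.0009, 0.0011    # YouTube's reported Q3 2021 range (0.09%-0.11%)

violative_low = total_views * vvr_low
violative_high = total_views * vvr_high

# Even a sub-0.2% rate yields hundreds of millions of violative views.
print(f"{violative_low:,.0f} to {violative_high:,.0f} views of violative content")
```

Whatever the true total, the point stands: a tiny rate applied to an enormous base is still an enormous number.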
Additionally, who determines what is and is not appropriate content to post? For example, YouTube claims to keep views of borderline content under 0.5 percent of all views on the platform. Again, this still means millions and millions of views. But who gets to decide what counts as borderline content? Shouldn’t YouTube be transparent about this decision-making process? And shouldn’t it be done in consultation with the communities impacted? We know the guidelines YouTube depends on are not enough: individuals and organizations whose role in promoting rhetoric and policies that oppress Muslim communities is well documented still hold accounts and are able to post videos. Our communities shouldn’t be beholden to greedy tech executives and algorithms deciding what does and doesn’t put us in danger.
The actions to remove Russian disinformation these past few weeks show that YouTube understands the power of propaganda spreading on its platform, and they prove it can act immediately when needed. The question remains why it won’t do the same when those targeted are Muslim and other Black and brown communities.
We will continue asking #WhoseTube is it until YouTube finally responds to our demands.