Just how much responsibility should a social media platform bear for ensuring that news providers from both sides of the political spectrum can share (or shout) their views at equal volume? For YouTube, finding an answer to that question has been nothing short of a PR and policy nightmare.
In August, the video-sharing giant announced that it would begin hiring managers to work with commentators from both ends of the political spectrum. As representatives told The Verge, “We have experts for many of our content categories and are growing the partnerships team that works specifically with news creators — for both conservative and progressive news outlets.”
The turning point, it seemed, came when right-wing YouTuber Steven Crowder lambasted the platform for what he viewed as undue censorship. Crowder’s complaints began when moderators demonetized his channel after a journalist for Vox wrote an article spotlighting the far-right content creator’s use of homophobic language and hate speech. In response, Crowder accused YouTube and the progressive media of executing a coordinated campaign to suppress independent conservative voices. His ire quickly gained national traction when prominent right-wing figures such as Senator Ted Cruz and Ben Shapiro echoed his complaints.
And yet, conservatives aren’t the only ones with simmering resentment against the video-sharing platform. Progressive LGBTQ content creators have also spoken out against censorship on YouTube and cited instances of undue demonetization — sometimes for using words such as “transgender.” Their complaints intensified when YouTube explicitly stated that Crowder — who had been demonetized for his homophobic language — hadn’t technically violated the platform’s policies.
In this context, YouTube’s choice to provide support to both sides of the political spectrum serves as a metaphorical band-aid over the PR wounds caused by accusations of partisan censorship. These new content managers will both advise political creators on how best to develop their channels and organize programs and events to ensure that politically focused users can utilize YouTube to its fullest potential.
This seems to be precisely the kind of equal-support measure needed to quell the current firestorm. But YouTube’s decision to bring in content managers to support political creators, as it already does for gamers and beauty vloggers, raises an important moral question that the other two specialties don’t face. Now that YouTube is stepping in to help develop news-adjacent content, will it have a hand in spreading misinformation, bias, or fake news as it supports — by the content creators’ own admission — partisan voices? Does YouTube have a responsibility to mind some defining line between political commentary masquerading as newsworthy fact and the news itself?
This question has become especially important in recent years as YouTube has gained popularity as a political platform. According to a 2016 Tubular Insights study, the video-streaming platform reaches more 18- to 49-year-olds in the United States during prime time than the top 10 TV shows combined. This rocketing viewership makes the platform fertile ground for sparking grassroots political movements.
As one journalist describes the shift onto YouTube in an article for The Verge: “Instead of spending a few minutes with Chuck Todd or Tucker Carlson on television, rattling off talking points, these free, easily accessible YouTube shows allow candidates to discuss policy in a relaxed setting with someone who feels like a friend to their audiences. That has made YouTube one of the more rewarding places on the internet to campaign.”
Democratic primary candidate Andrew Yang stands as a clear case study for YouTube’s ability to draw national attention and support. After appearing on YouTube-based personality Joe Rogan’s podcast, Yang saw an uptick in viewership that allowed him to qualify for major debates and springboarded him to interviews with The Daily Show and Esquire.
However, a critical distinction needs to be drawn here. People like Joe Rogan interview political figures and provide commentary, but their output is driven by opinion and conversation more than by strict, unbiased, research-based reporting. It’s interesting — so much so that tens of millions of people tune in — but it isn’t reporting in the traditional sense. Is Steven Crowder, one of the “conservative voices” that have been silenced, news? Are perspectives from hard-left Antifa commentators news? What happens when viewers start taking personality-driven perspectives as fact?
The question isn’t a trite hypothetical. Print journalism is dying; its migration to digital is faltering. Some 500 daily newspapers went out of business between 1970 and 2016. According to Reuters’ 2019 Digital News Report, over 50% of readers in the United States now come across one or more paywall barriers each week when they attempt to read online news. Today, just 20% of readers get their news via news websites, while 43% have their first contact with headlines via social media or messaging apps.
So, if YouTube has the potential to become a significant source of news for future viewers — and given the above, that’s a real possibility — what responsibility does the platform have to stem the tide of biased or even “fake” headlines?
At this point, it seems fair to turn to Facebook.
In recent years, the social media giant has fought a pitched battle against the spread of fake or biased news. This summer, Facebook announced that it would be hiring a small team of journalists to curate news stories for its unreleased News Tab. Representatives specified that while the majority of the headlines listed in the News Tab would be generated via an algorithm, the top-ranked stories for the day would be selected by experienced journalists. The decision to bring on journalists comes after widespread public outcry that Facebook’s now-retired Trending Topics section gave old, outdated, fake, or misleading stories millions of daily views.
By bringing on journalists, Facebook is attempting to create a screening board for the news stories of the day. It’s a way to rebuild consumer trust as we shift to a social media-dominant news landscape — a necessary pursuit, given that less than half of news readers globally agree that they trust the news media they themselves use.
YouTube’s decision to hire managers and give those on either side of the political spectrum a metaphorical megaphone could be interpreted as doing the opposite. Should YouTube, like Facebook, be curating top-ranking news content for its viewers, or warning viewers about content that, while entertaining, could be biased, incomplete, or based on “fake” news?
Is providing “equal” support for extreme commentators the answer here, or is it a distraction from the real problem — how the platform should be organizing or moderating commentary-as-news online?
Here’s the issue. YouTube is, fundamentally, an entertainment-based site — and if its firestorm-sparking struggles with Crowder indicate anything, it is that any attempt to step in and moderate content will be met with a PR disaster. Providing content managers is an easier way to calm the tides, even if it doesn’t address the underlying problem of commentary-as-news-truth on the platform.
Our reliance on — and, ironically, distrust of — social media-provided news will undoubtedly grow in the future. However, unless public outcry against the spread of fake news manages to outshout the furor against content moderation, it seems unlikely that YouTube will step in to quell the potential spread of bias and misinformation.
We may be all the worse for it.
Bennat Berger is an entrepreneur, investor, and tech writer based in New York City. He is a co-founder and Principal at Novel Property Ventures, a real estate firm that specializes in amassing and managing multifamily residential units in New York City. He is also a founding partner at the investment firm Novel Private Equity, where he oversees investments across a diverse range of interests, from experiential retail to entertainment to supermarket technologies.