YouTube’s recommendation engine for news needs a lot of work
In a story published yesterday, “As Germans Seek News, YouTube Delivers Far-Right Tirades”, reporters Max Fisher and Katrin Bennhold shed light on a scary piece of infrastructure underpinning Alphabet’s YouTube business: a news recommendation algorithm designed to “give people video suggestions that leave them satisfied.” According to “[r]esearchers who study YouTube” quoted in the story, “satisfied” in practice appears to mean served “politically extreme content” that will “keep them there.” The aftermath of the Chemnitz unrest lends credibility to the researchers’ account of how the algorithm works.
Not good.
Too bad the Times story doesn’t include links to the published research. One of the experts quoted in the story, Zeynep Tufekci, has published widely on related topics, including “YouTube, the Great Radicalizer” for the Times.
Remembrance of Things Past (thanks Marcel!)
When I was a graduate student at SUNY Stony Brook back in the early 1970s, I moonlighted as an information gatherer for a media-research consulting firm run by two Stony Brook professors from the Psychology Department. My job was to watch TV shows like “Mission Impossible” (yes, there was a TV show before there was a movie) and tag the frequency of violent activities depicted in the shows. Why? Because the firm’s clients included state actors who wanted to assemble menus of video content for soldiers, to keep them in a “ready to go” state.
So I can personally attest to a correlation between how frequently average people watch extremist content and the likelihood of their behaving in a similar manner.
Once again, not good.
So what does management at Alphabet/Google have to say about the possibility of bias, or at best misconfiguration, in the big algorithm beneath YouTube’s recommendation engine? We don’t know. Despite an invitation to appear before the US Congress last week, no one from Alphabet/Google showed up.
What about line management at YouTube? What do the people ostensibly in decision-making roles have to say about whether YouTube will fix the problems with the algorithm? Yesterday’s Times story quotes an unnamed YouTube representative as saying the company plans to work with news publishers to help “build a better news experience on YouTube.”
The situation needs correcting right away: “plans” may or may not materialize, and a fix is urgently required.
