Crafting projects, Islam, and Russian propaganda
--
By Renée DiResta and Jonathon Morgan
We spent last Friday night combing through Jonathan Albright’s dataset of posts from five Russian-linked Facebook pages, trying to track down related Instagram accounts. Most of the accounts associated with those Facebook pages have been removed, but through that research we uncovered several Pinterest pages that we now believe were tied directly to Russia’s 2016 influence operations. The first, revealed in a recent article by the Washington Post, has since been taken down, but there’s at least one more — and it’s still live as we publish this.
Muslim Voice was a fake account removed from both Facebook and Instagram. However, much of its content survives on the group's Pinterest page, which appears to have been used to harvest authentic content posted by other users. Muslim Voice's pins often focus on messages of Muslim empowerment or religious devotion, but scratch the surface and you'll find the kind of divisive, polarizing content that's a staple of Russian disinformation efforts.
But we were most surprised by what happened to our researchers' personal Pinterest feeds after we viewed Muslim Voice's pins: Pinterest's recommendation engine started serving up right-wing talking points and posts in Russian.
This indicates that many of the people viewing Muslim Voice's content on Pinterest were Russian speakers (or at least took a strong interest in Russian-language pins about crafting). To understand why, we need to unpack how content recommendation engines work. These algorithms, used by Facebook, Pinterest, Twitter, and other media companies, learn how to show you content that will keep you engaged on their sites. They do this by keeping track of which posts you view and comparing your viewing habits with those of other users who like the same content. They assume that users who look at similar content probably have similar interests, so if people like you view something, you'll probably want to view it too.
For example, the algorithm might learn that users who look at posts about birthday cakes also look at posts about party decorations. So when you look at a recipe for a birthday cake, it knows that it should also show you posts about party decorations. In short, it figures out what type of user you are based on your past behavior, and shows you more of the stuff that tends to engage people like you.
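To make that mechanism concrete, here's a minimal sketch of item-based collaborative filtering, the family of techniques these engines belong to. Everything in it is our own illustration: the function names, the toy viewing histories, and the simple co-occurrence scoring are invented for this example, and Pinterest's real system is far more sophisticated.

```python
from collections import defaultdict

# A toy item-based recommender: suggest items that tend to appear in the
# same viewing histories as the items you've already looked at.
# All names and topics here are hypothetical, not Pinterest's actual code.

def build_cooccurrence(histories):
    """Count how often each pair of items shows up in the same user's history."""
    counts = defaultdict(lambda: defaultdict(int))
    for items in histories.values():
        for a in items:
            for b in items:
                if a != b:
                    counts[a][b] += 1
    return counts

def recommend(counts, viewed, top_n=3):
    """Rank unseen items by how often they co-occur with items already viewed."""
    scores = defaultdict(int)
    for item in viewed:
        for other, n in counts[item].items():
            if other not in viewed:
                scores[other] += n
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

histories = {
    "user1": ["birthday cake recipe", "party decorations", "balloon garland"],
    "user2": ["birthday cake recipe", "party decorations"],
    "user3": ["sourdough starter", "bread scoring"],
}

counts = build_cooccurrence(histories)
print(recommend(counts, ["birthday cake recipe"]))
# -> ['party decorations', 'balloon garland']
```

The recommender has never been told what a birthday cake is; it only knows that people who viewed the cake recipe also viewed the decorations, and that's enough.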
Now let’s say you’re a Russian propagandist. You spend most of your time at work viewing and collecting pins about Muslim empowerment, religious devotion, and right-wing memes about US politics, but sometimes you sneak in a few minutes for yourself to check out crafting projects in your native language. The recommendation engine has learned that these three things go together — users who view posts about Muslim empowerment also view right-wing memes about US politics and sometimes view posts in Russian about crafty sewing projects. So when we go look at Muslim Voice’s pins, guess what Pinterest assumes we want to see next?
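Reusing build_cooccurrence and recommend from the sketch above, here's how a cluster of such accounts could teach a co-occurrence-based recommender to chain those three topics together. The account names and pin topics are invented for illustration.

```python
# Continuing the sketch above (reuses build_cooccurrence and recommend).
# Twenty hypothetical propagandist accounts view all three topics together,
# so the co-occurrence counts bind them into a single cluster.
histories = {
    f"troll{i}": ["Muslim empowerment pin",
                  "right-wing US politics meme",
                  "Russian-language sewing project"]
    for i in range(20)
}

counts = build_cooccurrence(histories)
print(recommend(counts, ["Muslim empowerment pin"]))
# -> ['right-wing US politics meme', 'Russian-language sewing project']
```

Anyone who then views only the Muslim empowerment pins looks, to the algorithm, like "a user like them," and gets the other two topics recommended, which is consistent with what happened to our own feeds.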
This happened fast. After a couple of searches one Friday night, the feed was transformed. And it isn't just Pinterest. These are, for example, the same types of algorithms that Facebook uses to suggest flat-earth and homeopathy groups to anyone who joins a "vaccine-hesitant" group on its platform, and that Amazon uses when it inadvertently suggests bomb-making ingredients that are frequently purchased together. Our prior research has outlined the potential consequences of these types of algorithms in depth.
As we learn more about the massive scale and scope of Russia's operations in the US in 2016, it's important to note that these operations exploit systemic flaws in the platforms we use to socialize and share ideas. Russian influence is a serious issue that we need to address, but the social media platforms' obsession with user engagement is an even bigger problem. Design flaws like these, baked into features like content recommendation engines, leave us vulnerable to future attacks.