Troll Armies

and the zombification of the internet

Emma Lindsay
Mar 21, 2018 · 6 min read

What’s it like to work on a Russian troll farm? A bit boring, seems like:

Aleksei [a hired Russian troll] began to realize that much of the commentary [he was hired to produce] was garbage, with the same themes repeated endlessly. “It was like turning people into zombies by repeating: ‘Everything is good, everything is good. Putin is good, Putin is good,’” he said.

New York Times, Inside the Russian Troll Factory

Of course, that’s kind of the nature of propaganda, isn’t it? Fucking boring. And yet, for the past two decades or so, there have been people working around the clock to churn out propaganda by posing as imaginary people on the internet. These hired Russian trolls would create many online profiles and use them to spread state-sponsored propaganda.

Initially, it was primarily pro-Russian propaganda aimed at Russian nationals — the aforementioned “Everything is good, Putin is good” that Aleksei was repeating. However, when Hillary began running for the presidency, Russia began to get a little creeped out. If the NYT is to be believed, Russia thought Hillary Clinton posed an existential threat:

Mark Galeotti, a Russia expert at the Institute of International Relations, based in Prague, said the Kremlin was consumed by something more urgent than petty revenge: self-preservation.

“It’s not just they didn’t like Clinton, but they actually thought that she represented a threat,” he said, describing Russia’s actions as a matter of “policy, not pique.”

the Kremlin’s views of Mrs. Clinton go beyond defining her as hawkish. They are also layered with a pre-existing Russian belief that American democracy promotion is a ploy to unseat unfriendly governments, that the United States remains bent on Russia’s destabilization or even destruction, and that there is an American hand behind nearly every Russian misfortune.

New York Times, Russia’s Hacks Followed Years of Paranoia Toward Hillary Clinton

The New York Times describes Russia as paranoid, and yet, I’m not entirely convinced Russia is crazy. Julian Assange described Clinton as a “bright, well connected, sadistic sociopath,” and he chose to support the GOP (by having WikiLeaks release Clinton’s emails) because he thought Clinton was more likely to get the US into a war than Trump. Now, Assange is a self-serving egotist with a fairly intolerable public persona, but he’s pretty knowledgeable about international affairs. His view on Hillary Clinton is probably pretty in line with the views of non-US allies around the world.

Whether Clinton was actually hawkish and war-prone doesn’t really matter at this point; enough people with enough power believed she was, and they seem to have successfully run an internet campaign to keep her out of office. Now, I don’t know if Trump was involved (that’s what the Mueller investigation is about) and it’s not really important for the sake of this blog post.

What is important is the long-term ramifications for American culture. When Hillary started running, it seems that Russia took its pro-Russia propaganda troll farms and began turning them on America.

The operation apparently started in 2014; by 2016 it had really ramped up and grown pretty sophisticated in its tactics:

The intent was to create an atmosphere of division and anger online. The Kremlin, the indictment says, wanted instability in America. It wanted to sway some people’s vote, and for others, particularly minorities, persuade them not to vote at all.

It seems like their main goals were to tip the election in favor of Trump, but also just to sow discord and further divide the United States along existing tension lines. Their goal for voters of color was to discourage them from voting at all, as demonstrated by the following message from a troll account called “Woke Blacks”:

“A particular hype and hatred for Trump is misleading the people and forcing Blacks to vote Killary,” one message, posted by the account in October 2017, declared.

“We cannot resort to the lesser of two devils. Then we’d surely be better off without voting AT ALL.”

BBC, Tactics of a Russian Troll Farm

Some of their fake personas were so good, that they actually got Americans to organize protests in toss up states around the time of the election:

In the process, the IRA is said to have built up a list of more than 100 real Americans who it had contacted for help in organising these real-world efforts — none of them aware they were puppets in a most audacious Russian campaign. A campaign that, as far as we know, is very much ongoing.

BBC, Tactics of a Russian Troll Farm

But, at the end of the day — as the Atlantic points out — there was nothing really fancy going on here, and perhaps nothing that should even really be considered illegal. Reddit, for example, created a ton of fake users to populate their website in the early days. This is kind of standard internet startup fare.

What is more significant, however, is that it seems to have worked. Just like Reddit was able to lure people in with its fake users, these trolls were able to influence people’s behavior in the real world by posing as real people.

This is the face of the new propaganda, and it is not limited to Russians. The internet is full of imaginary people with fake messages that are being used to serve the interests of powerful entities. And, almost always, the interests of these powerful entities are at odds with the interests of the individuals whose behavior they are trying to control. Consider the example of “Woke Blacks” — when you convince someone not to vote, you are taking their power away from them. It was absolutely not in the best interest of black people to sit out this election (at the very least, showing up and voting for a third party candidate would be better than not voting). However, that message was actually somewhat convincing — at least to me. Maybe it wouldn’t work on most people, but if it worked on even 1% of people, that could have been enough to make a difference.

The future of propaganda isn’t going to look like it’s coming from some “big brother” type entity; it’s going to look like it’s coming from people just like you. It’s going to sound like it’s coming from people who get your issues, and understand your pain and frustrations. It’s going to speak to all your hot button topics, because they’re harvesting all your data, and they’re going to know how to trigger you.

The only recourse is to think for yourself. Take your time; don’t be goaded into anything. Always try to make sure you are acting in your own self interest, and when someone is trying to convince you to do something, maybe ask yourself the question “how does this benefit me?” You can even start right now — do you believe that if you act on the suggestions I am giving you, your life will be improved? If not, please don’t do them! But if yes, read on.

The nature of propaganda is to get you to act against your personal interests in support of a regime that wants to oppress you. And there’s no shame in getting fooled sometimes; it’s happening to a lot of people right now. If you do get tripped up, try to get back on track, and figure out the course of action that will benefit you while harming the fewest other people. There are just two simple questions to keep in mind — how does this help me? Will doing this hurt anyone else?

I believe this is an important topic right now, so I may spend a bit of time diving into the nature of propaganda on my Medium, and possibly answering some questions about it on my (new, and very much in progress) site.
