Twitter Keeps Emailing Me White Supremacist Propaganda

Daniel Malmer · Published in The Startup · 5 min read · Jun 17, 2020


If Twitter wants to find the Nazis on their platform, they need look no further than the emails they’ve been sending me.

Screenshot of tweet that says “The last desperate gasps of a bankrupt, anti-English, anti-white ideology.”

In November of 2019, I saw the Fred Rogers biopic, “A Beautiful Day in the Neighborhood.” Full of nostalgia, I created a Twitter account, @mrrogerssays, that I intended to use to tweet out inspirational quotations of Fred Rogers.

I wound up sending only four tweets from the account, including one that said “Hello, neighbor,” before losing interest.

I mostly forgot about the account, but a couple of emails would show up in my Social tab every day to let me know what Mr. Rogers was missing out on.

These are known as “re-engagement emails,” and they went unread for weeks. Then, one of the emails caught my eye.

On January 28th, about two months after I’d opened my account, Twitter sent me an email about a tweet from “Dave K.” Previous emails mentioned one of the four accounts that I followed, so I didn’t understand why I was getting an email about someone I didn’t know. I opened the email.

A tweet from “Dave K” with a link to a climate change denial article.

I immediately recognized the headline as climate change denial. Clicking through to the story confirmed that it was. The website that hosts the article traffics primarily in climate change denial and other conspiracy theories.

I took a look at Dave’s Twitter feed. He has hardly any followers: 32 at the time of this writing. While he mostly spreads climate change disinformation, he also likes to tweet about “globalism” and “white genocide,” two topics common among white supremacists.

Twitter decided that he was someone I’d like to know.

That wasn’t all. In the same email was an antisemitic tweet from an account that’s primarily used to spread white supremacist propaganda.

A tweet that says “The eternal, disingenuous voice of Jewish privilege.”

These emails are designed to get users to return to Twitter. I couldn’t imagine how Twitter had decided that my Mr. Rogers account would be interested in conspiracy theories and white supremacist propaganda. I’d barely used the account, and only followed a few accounts, each of them mainstream and left-leaning.

I started digging through two months of emails to see where things went off track.

It wasn’t too difficult to find. Less than a month after I created my account, Twitter sent me an email that included a tweet from an account called @unhealthytruth. As someone who researches disinformation, I’ve learned to be suspicious whenever I see the word “truth” in someone’s screen name. It turns out that the account’s owner, Erin, is one of the biggest anti-vaccine disinformation sources on social media. According to her Twitter bio, her “better half” is Joseph Mercola, the largest donor to the United States’ oldest anti-vaccine organization.

Why did Twitter start emailing me her tweets? It appears to be because I viewed a single tweet of hers on December 8th, while logged in as @mrrogerssays. Two days later, I received the first email with one of her tweets. Since then, I’ve gotten emails containing one of her tweets almost three times every week.

This was already bad enough. After I clicked on Dave K’s tweet, things got far worse.

At the beginning of January, the Highlights emails that Twitter sent me contained tweets either from accounts that I followed or from accounts very similar to them. My account followed Congressman Adam Schiff, so it made sense to get tweets from him and from Speaker Pelosi.

I viewed Dave K’s tweets on January 28th. Almost immediately, the content of the Highlights emails changed from Members of Congress to a steady stream of extremists.

Emails from conspiracy theorists and white supremacists.

Some days, the emails would mix in some of the normal accounts that Twitter used to send me in the early days. Other days, they’d send me a smorgasbord of white supremacists to choose from.

It was clear that Twitter’s recommendation algorithm had discovered what would bring me back to the site, and decided to do more of that.

On February 8th, only 11 days after I viewed Dave K’s tweets, Twitter emailed me a tweet from Richard Spencer. If you’re not familiar with him, he’s a notorious right-wing extremist who’s variously described as a white nationalist, white supremacist, or neo-Nazi, depending on whom you ask.

Tweet from Richard Spencer.

Spencer is barred from entering 26 European countries. He’s still allowed on Twitter, though, and Twitter’s nice enough to send me emails in case I’m interested in what he has to say.

These days, my Mr. Rogers account gets a steady stream of antisemitism, Islamophobia, homophobia, white supremacist propaganda, and conspiracy theories. The account gets a Highlights email every day, usually containing tweets from four different extremists. It makes me wonder how many other people are getting content like this sent to them.

How could Twitter prevent this from happening?

If they can identify extremist accounts at scale without human involvement, that’s great. Don’t recommend those accounts, either in emails or elsewhere.

If automating the identification of extremist accounts isn’t feasible, they should blacklist large extremist accounts and stop recommending accounts with follower counts below a certain threshold.

Never include an account in Highlights emails unless the user either follows that account, or has liked their tweets. Viewing an account should never be enough to trigger a recommendation.
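The rules above amount to a simple eligibility check before an account ever appears in a Highlights email. Here’s a minimal sketch of what that policy could look like; all names, data structures, and the follower threshold are illustrative assumptions, not anything from Twitter’s actual systems.

```python
# Hypothetical sketch of the recommendation policy proposed above.
# Nothing here reflects Twitter's real API or internal data model.

from dataclasses import dataclass, field

MIN_FOLLOWERS = 10_000  # assumed threshold; the right value is a policy choice


@dataclass
class Account:
    handle: str
    followers: int
    flagged_extremist: bool = False  # set by moderation, automated or manual


@dataclass
class User:
    follows: set = field(default_factory=set)      # handles the user follows
    liked_from: set = field(default_factory=set)   # handles whose tweets the user liked


def eligible_for_highlights(user: User, account: Account) -> bool:
    """An account may appear in a Highlights email only if the user has an
    explicit relationship with it: a follow or a like. Merely viewing a
    tweet is never enough, and flagged or small accounts are excluded."""
    if account.flagged_extremist:
        return False
    if account.followers < MIN_FOLLOWERS:
        return False
    return account.handle in user.follows or account.handle in user.liked_from
```

Under this policy, an account like “Dave K” with 32 followers would never be recommended, and viewing a single tweet, as happened with @unhealthytruth, would not be enough to start a stream of emails.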

In a 2019 interview with Rolling Stone, Jack Dorsey said about Nazis on Twitter, “if you can show them, I would love to see them.” He might want to check his Sent folder.




Daniel Malmer is a PhD student researching online hate speech, extremism, and radicalization.