Doublethink Lab

Doublethink Lab focuses on mapping online information operation mechanisms, the export of surveillance technology, and digital authoritarianism.

Doublethink Lab and Voice of America: US Election Joint Observation Mission Final Report


Jasper Hewitt / Analyst, Digital Intelligence Team, Doublethink Lab
Elena Yi-Ching Ho / Consultant Analyst, Doublethink Lab

Thank you to our part-time staff and interns for their dedication and hard work: Chip Chu, 1T, Jim.

1. Key Findings

  1. Doublethink Lab and Voice of America (VOA) observed 315 Spamouflage accounts on X between May 1st, 2024 and November 6th, 2024. During the observation period, these accounts made 4,671 posts and 18,198 reposts. Following VOA’s reports, 90 of the 315 accounts were suspended and 5 were temporarily restricted.
  2. During the observation period, apart from election-related content, the identified Spamouflage accounts also sought to degrade the United States by criticizing US foreign policy in general, predominantly US support for Israel, as well as specific domestic issues such as homelessness, drug abuse, and gun violence.
  3. The Spamouflage accounts were observed to degrade both presidential candidates, making it hard to conclude that they were operating in favor of either candidate.
  4. Most of the observed accounts did not introduce any novel content but amplified existing narratives using images and arguments that they found online.
  5. We also discovered two Spamouflage networks that impersonated Donald Trump supporters. Elise Thomas of the Institute for Strategic Dialogue (ISD) first discovered these ‘Magaflage’ accounts and interpreted them as Spamouflage’s latest attempt to reach authentic users. One of the networks that we discovered also managed to break out of its bubble and reach several authentic users.
  6. In the days leading up to the election, we discovered one campaign that attacked three specific lawmakers who have been critical of the People’s Republic of China (PRC) in the past.
  7. On TikTok, we observed one pro-PRC account that impersonated a large US media outlet and made 50 posts directly related to the elections. The account’s content was similar to the content posted by Spamouflage on X. Several of the account’s videos went viral and reached authentic users.

2. Methodology

In this project, we tracked 315 inauthentic accounts on X that we assess to be part of “Spamouflage,” a massive, inauthentic, spammy, cross-platform propaganda infrastructure that was attributed by Meta to PRC law enforcement. In addition, we also tracked 197 suspicious accounts on TikTok that post a substantial amount of pro-PRC content. Between September and November, we used quantitative methods to track the highest-performing content on these two platforms. In addition, we conducted several in-depth case studies in coordination with VOA.

3. Spamouflage Monitoring on X

3.1 Methodology

This section is divided into two observation periods. First, we conducted a narrative analysis of content posted by 10 key seeder accounts between May 1st and November 6th, 2024. These key seeder accounts were chosen based on their view counts and the number of amplifications by other Spamouflage accounts on our tracking list. A full overview of the accounts can be found in the table below. The 11th account (@sisterMuxi) is a backup of @fo29904619 that became active after @fo29904619 was suspended.

Table 1: The top 10 key seeder accounts.

We manually categorized the 1,141 posts and reposts (1,093 posts and 48 reposts) from the 10 key seeder accounts into eight categories, based on our own observations and suggestions from VOA:

  1. US support for Israel: Posts related to US support for Israel. This also includes posts that highlight pro-Palestine protests in the US.
  2. Israel-Palestine: Any posts that discuss the Gaza war that do not directly mention the US.
  3. US hegemony: Posts that criticize US foreign policy for being hegemonic or interventionist.
  4. Promote or defend the PRC: Posts that promote the PRC or directly refute international criticism. This also includes posts that compare the PRC to the US.
  5. US domestic issues: Any posts that discuss domestic issues in the US. This category is further subdivided into: drugs/homelessness, economy, gun violence, racial inequality, immigration, and other issues.
  6. US election/democracy: Any post that discusses the presidential election or highlights the campaigns of the presidential candidates. This category includes many posts that ridicule the candidates. In addition, it also includes any posts that criticize the state of US democracy.
  7. US support for Ukraine: Any posts related to US support for the war in Ukraine.
  8. Other: Any other posts that do not fall within the categories displayed above.

The second observation period ran from October 17th, 2024 to November 6th, 2024 and focused specifically on content related to election fraud, political unrest, or claims that the US elections are only about money. The latter was a major narrative that PRC state media started pushing right before the elections. For this period, we considered all 3,230 posts between October 17th and November 6th. We first ran a keyword search to find any posts specifically related to the elections, then manually reviewed those posts to verify whether they in fact belonged to one of the three categories (footnote 1).
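The two-stage filter described above (keyword search first, manual review second) can be sketched as follows. This is a minimal illustration, not our production tooling: the keyword list is taken from footnote 1, while the function name and the shape of the input (a list of post strings) are assumptions.

```python
import re

# Keywords from footnote 1 of this report.
KEYWORDS = [
    "jan 6", "january", "riot", "election", "vote", "poll", "box",
    "ballot", "voting", "voter", "fraud", "machine", "steal", "rig",
    "illegal", "corruption", "tamper", "manipulate", "fake", "scam",
    "cheat", "unfair", "bias", "absentee", "spoof",
]

# One case-insensitive pattern with word boundaries, so "vote"
# matches "Vote!" but not "devoted".
PATTERN = re.compile(
    r"\b(" + "|".join(re.escape(k) for k in KEYWORDS) + r")\b",
    re.IGNORECASE,
)

def flag_for_review(posts):
    """Return the subset of posts containing at least one keyword.

    A keyword hit is only a first pass: flagged posts still require
    manual review to confirm they belong to one of the three
    categories of interest.
    """
    return [p for p in posts if PATTERN.search(p)]
```

The manual-review stage cannot be automated away here: words such as ‘box’ or ‘machine’ produce many false positives, which is why the report verifies every hit by hand.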

3.2 Description of indicators of inauthenticity and attribution judgments

We are certain that the 315 accounts on our tracking list can all be linked to Spamouflage for the following reasons.

  • Virtually all of the accounts on our tracking list occasionally share or repost pro-PRC content on topics such as Taiwan, the South China Sea, Xinjiang, Tibet, or Covid. This content sometimes contains Chinese language and is occasionally directly taken from PRC state media outlets such as China Daily or the Global Times.
Figure 1: Post sharing a China Daily cartoon that indicates Tsai Ing-Wen is selling out Taiwan to the US.
  • At least 17 of the seeder accounts on our tracking list were amplified by a large network of 1,153 crypto/web 3.0 repost-boosting accounts. These same crypto accounts also amplify (semi-)official accounts that are tied to the PRC government, such as the account for the Jinan International Communication Center (@Jinan_ICC), an account about Xi Jinping operated by the China Daily (@XisMoments), and the official account of the Xiamen City government (@china_xiamen). VOA published an article about this finding on October 29th.
Figure 2: (Semi-) Official CCP accounts that were amplified by the crypto network.
  • A large group of amplifier accounts almost exclusively reposts specific seeder accounts. 259 out of the 315 accounts on our tracking list have a repost percentage over 70%.
  • Over 30 accounts in our dataset match Spamouflage naming patterns first discovered by researchers at ISD. Many of the accounts in our database have usernames containing (fo) or (互fo) to indicate their participation in a follow train. In addition, our dataset contains another 7 accounts that match the specific user-handle pattern of ‘fo’ followed by 8 digits.
  • One of the prominent seeder accounts, @mmshouhu, is followed by two prominent Chinese diplomats with a large following on X: Zhang Heqing (@zhang_heqing) and Zhang Meifang (@CGMeifangZhang).
Figure 3: Evidence that Zhang Heqing and Zhang Meifang follow @mmshouhu.
  • Many of the accounts on our tracking list posted iconic photoshopped Spamouflage images that have often been associated with Spamouflage by other researchers. In some cases, the accounts we discovered even posted the exact same images that were previously identified by other researchers. The image below was also featured in a Graphika report that was published in September.
Figure 4: Two separate Spamouflage posts sharing a photoshopped image of homeless people in the US.
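The repost-percentage indicator above (259 of the 315 accounts exceed 70%) is simple to compute. A minimal sketch, assuming each account’s timeline is available as a list of posts with a boolean repost flag; the field name, function names, and data shape are illustrative assumptions, while the 70% threshold mirrors the text:

```python
def repost_percentage(posts):
    """Share of an account's activity that consists of reposts.

    `posts` is a sequence of dicts with a boolean "is_repost" field
    (a simplified stand-in for whatever the platform data provides).
    Returns a percentage in [0, 100], or None for an empty timeline.
    """
    if not posts:
        return None
    reposts = sum(1 for p in posts if p["is_repost"])
    return 100.0 * reposts / len(posts)

def flag_amplifiers(accounts, threshold=70.0):
    """Return handles whose repost share exceeds `threshold`.

    `accounts` maps handle -> list of posts as above. A high repost
    share is one indicator among several; it is not sufficient for
    attribution on its own.
    """
    return [
        handle
        for handle, posts in accounts.items()
        if (pct := repost_percentage(posts)) is not None and pct > threshold
    ]
```

As the docstring notes, a high repost share alone does not establish inauthenticity; in this report it is combined with the naming patterns, shared imagery, and amplification networks listed above.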

3.3 Narrative analysis between May and November

3.3.1 Overview

Posts from the key seeder accounts were mostly concerned with degrading the image of the United States rather than directly interfering in the US elections. Only 154 out of the 1,141 reviewed posts (13%) were about one or more presidential candidates. When the posts did mention a candidate, Biden was mentioned most (99), followed by Trump (73) and then Harris (30). Biden was mentioned most often because he was the sitting president.

Figure 5: Posts per candidate.

The tracked accounts were mostly amplifying existing narratives rather than creating their own. For example, their criticism of US support for Israel, Biden, Trump, and Harris mostly echoed existing narratives that are circulating widely on the internet. In addition, most of the images that the accounts posted were already circulating widely online and were not created by the Spamouflage accounts in question. Many of these images were copy-pasted from content taken from pro-Palestine accounts or the Global Times. The only exception to this is the category ‘US domestic issues.’ This has been a target area for Spamouflage for a longer period, and some of the images appear to have been created by Spamouflage itself.

3.3.2 Classification

Our manual classification revealed that the 10 key seeder accounts were very critical of Israel and the United States’ support for Israel. Many of these posts showed graphic content of the death and destruction in Gaza and accused Joe Biden and the United States of being partly responsible for it. In addition, they often pointed out the violent crackdown on the pro-Palestine protesters or asserted that Israel is secretly controlling the US government.

Figure 6: Dominant narratives for top 10 seeder accounts.

The presidential candidates were also occasionally criticized for their stance on Israel. Harris, Trump, and Biden appeared in 12, 16, and 29 of these posts, respectively. The posts about US support for Israel far outnumbered the number of posts that criticized US support for Ukraine, despite it being an issue that Trump and Harris disagreed on. The image on the left below shows an example of an AI-generated image that implies that both Harris and Trump serve Netanyahu. The account in question did not create this image, as it has been circulating widely on social media. Most of the AI-generated content copy-pasted by the Spamouflage accounts aimed to visualize an idea rather than mislead the audience about specific events.

Figure 7: Two posts criticizing the presidential candidates for their support for Israel.

Spamouflage’s criticism did not stop at the United States’ policies on Israel and Ukraine. A significant number of posts also criticized the US for its hegemonic behavior around the world. The accounts specifically criticized US interventionism in the Middle East, South America, and the Asia-Pacific, and often shared Global Times cartoons criticizing the US military presence in the South China Sea.

The Spamouflage accounts were also critical of domestic issues in the United States. This category can be further subdivided into drugs/homelessness (44), economy (23), gun violence (21), racial inequality (19), immigration (9), and other issues (10). The accounts in question had been posting about these topics since early 2023, and it was only after the war in Gaza started that they shifted their main focus away from US domestic issues. It is also in this category that we saw the signature photoshopped images that researchers have often associated with Spamouflage. The accounts would occasionally still point out some more contemporary domestic issues. Some posts pointed to specific protests by dockworkers and claimed they were a sign of a failing economy.

Figure 8: Example of photoshopped content that has often been attributed to Spamouflage.

There were 109 posts that were more directly related to the US election or democracy. These posts mostly leveraged existing narratives that ridiculed or criticized the presidential candidates. Biden was often criticized for being too old or taking long holidays, whereas Trump was ridiculed for the many lawsuits he is facing. Harris was called a liar and a puppet of the Democratic Party. The accounts were quick to react to the assassination attempt on Trump: between July 14th and 16th, the key seeder accounts created 14 posts that referred to the event. Some of these posts merely pointed out what happened, whereas others ridiculed gun policies in the US, mocked Trump, or insinuated that Pelosi or other secret actors were behind the attack.

Researchers at ISD pointed out in February that Spamouflage accounts were creating original Photoshopped and AI-generated images displaying Trump and Biden. The accounts on our tracking list were showing similar behavior throughout 2023 and early 2024 (see images below).

Figure 9: AI-generated images posted by Spamouflage accounts.

In the months leading up to the elections, the accounts on our tracking list switched to copy-pasting existing content. The two images below are examples of copy-pasted content that ridicules the presidential candidates. It is unclear why the Spamouflage accounts changed their strategy.

Figure 10: Two posts ridiculing the presidential candidates.

152 posts promoted or defended the PRC against international criticism, such as doping allegations during the Olympics or human rights abuses in Xinjiang and Hong Kong. Several posts aimed to show how advanced the PRC is compared to the United States on issues such as public transport. The accounts were also fervent supporters of Black Myth: Wukong and PRC tech in general, and criticized the US for attacking PRC subsidies under the pretext of ‘overcapacity.’ The accounts also presented the PRC as a more reliable alternative to the US. For example, after the pager bombings in Lebanon, several posts claimed that iPhones are no longer safe and that people should consider Huawei instead.

Figure 11: Content that both criticizes US support for Israel and also promotes Chinese technology.

3.4 The final weeks of the election

The Spamouflage accounts on our tracking list made a total of 3,230 posts between October 17th and November 6th, 2024. The majority of these posts aligned with the narrative analysis outlined above. Our keyword search and additional manual verification revealed 155 instances (29 posts and 126 reposts) that touched upon our categories of interest.

Figure 12: Posts per category.

As can be seen in the time series analysis below, 143 of these posts and reposts were shared before November 1st. The final days leading up to the elections were relatively quiet.

Figure 13: Time series of specific categories between October 18th and November 6th.

In total, there were 76 instances (15 posts and 61 reposts) of content that made election fraud allegations. Virtually all of these posts amplified existing content that was already circulating in the days leading up to the elections, including content about fraudulent voting machines, burning ballot boxes, and ‘buses’ full of illegal voters wearing Harris-Walz stickers. Virtually all of these posts were made between October 22nd and November 1st.

There were 72 instances (9 posts and 63 reposts) of content that sought to degrade US democracy through claims that the elections revolve only around money. These posts sought to convey the idea that the US is democratic only on the surface and that the wealthy families who fund the presidential candidates control every aspect of the elections. Virtually all of these posts were made on October 31st.

There were 7 instances (2 posts and 5 reposts) of content that predicted or highlighted potential political unrest after the elections. Several of these posts highlighted legitimate polls that showed US citizens are afraid of violence after the elections. One other post predicted riots and a final one highlighted the possibility of a civil war. Screenshots of all three categories can be found below.

Figure 14: Three Spamouflage posts from the final days leading up to the election.

In line with most of the content discussed above, virtually all of the election fraud posts amplified existing narratives and shared existing content that circulated widely online and did not originate from the PRC. However, the ‘money elections’ posts are an exception: one of the shared images originated from the Global Times. In addition, two AI-generated images and the script of a video, all asserting that the elections are a cover for the wealthy to push their own agenda, can be traced back to a Baidu post by 熊哥滴滴滴 published on October 21st, 2024 (in Chinese). The video is an edited version of several publicly available stock videos, and its script is a direct translation of the first two paragraphs of 熊哥滴滴滴’s Baidu post, the title of which reads: ‘The US elections: A game of manipulation for the wealthy.’ It is highly likely that the Spamouflage account used AI-based translation and text-to-speech software to turn the original Chinese script into spoken English. The two AI-generated images and the video were uploaded to X by two different seeder accounts that do not typically repost each other.

Figure 15: Three Spamouflage posts that can all be connected to a Baidu post by 熊哥滴滴滴.

3.5 Impact

We believe that the accounts mentioned above score a 2 out of 6 on Ben Nimmo’s breakout scale. The campaign was limited to one platform (X) but did manage to break out of its original bubble. On average, the 10 key seeder accounts gained 4,714 views per original post (excluding reposts). One post went viral and gained more than 40K views. In addition, we discovered several viral posts with disagreeing replies from accounts that appear to be authentic. This indicates that the Spamouflage accounts in question occasionally managed to break out of their own bubble and reach authentic users.

4. X Case Studies

4.1 Magaflage

During our election observation period, we discovered three Spamouflage networks that pretended to be American Trump supporters. This phenomenon was first discovered by ISD, which coined the term ‘Magaflage.’ One of the networks has already been covered by VOA. We disclose the second and largest one in this report. The third Magaflage network is still under investigation and will be covered in a separate report.

The largest Magaflage network we discovered consists of 15 inauthentic accounts that predominantly amplified content seeded by @originalcfn (Sissy YY, formerly Original YY). The accounts actively posted about the elections for a short period between June and July 2024. In total, the 15 accounts (excluding @originalcfn) made 465 posts and 1,265 reposts, 1,093 of which were reposts of @originalcfn. Even though the Magaflage network stopped amplifying @originalcfn’s content after July, the seeder account continued posting pro-Trump content up to the time of writing (November 15th). A full overview of the network can be found in the table below.

Table 2: The Magaflage network.

The accounts in this network show a high degree of inauthentic coordinated behavior:

  • All of the accounts use profile photos that circulate widely on the internet.
  • All accounts use emoticons in their usernames. 12 out of the 15 accounts use the American flag as one of the emoticons.
  • All accounts actively reposted @originalcfn. For most accounts, reposts of @originalcfn made up more than 70% of their posts.
Figure 16: Four Magaflage accounts that tried to impersonate US citizens
  • As can be seen in Table 2 above, 9 out of the 15 accounts were created within an 11-minute time span on July 8th, 2022.
  • These same 9 accounts also made their first post within an 11-minute time span on the same day.
  • The first 9 posts from each of these 9 accounts were random quotes, often posted on the same day.
Figure 17: Example of the first few posts from the Magaflage accounts.
  • The 9 accounts that were created on the same day have only a handful of followers. The other 6 accounts that were created earlier have more followers. They are likely hijacked inactive accounts that still had a small follower base, just like @originalcfn itself.
  • After these initial posts, the accounts went ‘to sleep’ and awoke to start amplifying @originalcfn’s election content in June 2024.
  • This amplification also happened robotically in batches. The accounts reposted multiple @originalcfn posts on the same day. The graph below shows that the 15 accounts reposted @originalcfn 424 times on June 19th, 2024.
Figure 18: Robotic batch reposts of @originalcfn by Magaflage network.
  • Compared to regular Spamouflage accounts, the Magaflage accounts were relatively good at hiding their affiliation with the CCP. However, several clues still gave the network away.
  • On August 21st, 6 out of the 15 accounts suddenly shared a Chinese language New York Times article that criticizes Falun Gong. It is important to note that these are not reposts. They were posted by the accounts themselves. 5 of the 6 posts were in Chinese, even though all the other content posted and reposted by the network to date has been in English. This is also the first time that the network posted content that can be perceived to align with PRC state interests.
  • One of the accounts had Hong Kong set as its location.
  • One of the other accounts still had Chinese language content in its first post.
Figure 19: Indicators that link the Magaflage accounts to the PRC.
  • All of the 1,093 @originalcfn reposts made by the Magaflage accounts were done on weekdays during Beijing working hours. The graph below shows the time distribution of the reposts.
Figure 20: Hourly number of @originalcfn reposts by Magaflage network.
  • @originalcfn was also relatively good at hiding its affiliation with the CCP. However, we discovered one post that alleged the US is selling weapons at high prices to Taiwan and Ukraine. This post was reposted by several other Spamouflage accounts like @lispvaxshila. In addition, we found that @originalcfn was also amplified by at least 911 different crypto/web 3.0 accounts. At least 283 of these crypto/web 3.0 accounts also amplified other Spamouflage accounts and official CCP accounts like @Jinan_ICC and @XisMoments.
Figure 21: Early @originalcfn post criticizing arms sales to Taiwan and its corresponding list of reposters.
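Two of the timing indicators above (account creation clustered within an 11-minute window, and reposting confined to weekday Beijing working hours) reduce to simple timestamp checks. A sketch under the assumption that creation and repost timestamps are available as timezone-aware datetimes; the 9:00–18:00 working-hours range is our own illustrative choice, not a figure from the report:

```python
from datetime import datetime, timedelta, timezone

# Beijing is UTC+8 year-round (no daylight saving time).
BEIJING = timezone(timedelta(hours=8))

def creation_clusters(created_at, window=timedelta(minutes=11)):
    """Group account-creation timestamps that fall within `window`
    of the first timestamp in the group -- e.g. the 9 Magaflage
    accounts created within 11 minutes on July 8th, 2022.
    """
    clusters, current = [], []
    for ts in sorted(created_at):
        if current and ts - current[0] > window:
            clusters.append(current)
            current = []
        current.append(ts)
    if current:
        clusters.append(current)
    return clusters

def within_beijing_working_hours(ts_utc, start=9, end=18):
    """True if a UTC timestamp falls on a weekday between `start`
    and `end` o'clock Beijing time. The exact hour range used as a
    working-hours heuristic is an assumption for illustration.
    """
    local = ts_utc.astimezone(BEIJING)
    return local.weekday() < 5 and start <= local.hour < end
```

Applied to the 1,093 @originalcfn reposts, a check like this would confirm the pattern shown in Figure 20: activity concentrated in weekday working hours in the UTC+8 zone.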

In terms of narratives, the network promoted Trump, criticized Biden, and amplified existing conspiracy theories. The screenshots below are a case in point. The first post on the left shared a Breitbart article about mail-in ballots and argued that the Democrats got away with stealing the election in 2020 and are planning to do it again in 2024. The second post alleged that Pelosi admitted she was responsible for January 6th. The third post amplified Trump’s position that the US should reduce funding for Ukraine. The fourth post idolized Trump by showing several clips meant to demonstrate how charismatic he is. The fifth post criticized Biden’s immigration policies. All of the posts shown below were amplified by all 15 accounts in the Magaflage network.

Figure 22: Examples of @originalcfn posts that were amplified by the full Magaflage network.

In terms of impact, the campaign falls into category 2 out of 6 on Ben Nimmo’s breakout scale. The campaign was limited to one platform (X) but did manage to break out of its original bubble. @originalcfn’s posts gained only 1,247 views on average. As can be seen in the screenshot on the left, many of the replies in the comment section came from inauthentic accounts that merely echoed the original message. However, we also discovered replies from what appear to be authentic users who disagreed with the original statement, as can be seen in the image on the right. This indicates that @originalcfn did manage to break out of its bubble several times.

Figure 23: Two @originalcfn posts and their corresponding replies.

4.2 The attack on the lawmakers

In the final days leading up to the elections, we discovered a cluster of at least five Spamouflage-linked accounts that criticized two Republican lawmakers and one Democratic lawmaker: Congresswoman Young O. Kim, Speaker of the House Mike Johnson, and Congressman Ami Bera. These three lawmakers have all expressed criticism of the PRC in the past. The primary account made 11 posts between October 18th and October 31st and was banned shortly after.

Figure 24: One of the inauthentic accounts that attacked the lawmakers.

We are almost certain that the account is inauthentic and connected to the CCP for the following reasons.

  1. The account was created in October, 2024 and has 0 followers. Nevertheless, the account managed to get over 3,000 views and 110 reposts on some of its posts.
  2. The profile photo can be found on a Chinese stock photo-like website.
  3. Virtually all reposts come from crypto boosting accounts. In fact, one of the amplifier accounts is the same crypto account that also amplified other Spamouflage accounts on our tracking list and the official account of the Jinan International Communication Center.
  4. In the last week, we have discovered at least four other accounts that posted exactly the same content. These accounts were still online as of November 21st, 2024.
Figure 25: One of the crypto amplifier accounts that reposted @koonceMary98399. The same account also amplified other Spamouflage and official CCP accounts.

In terms of narratives, the account made several posts that criticized Mike Johnson and Young O. Kim for their anti-LGBTQ+ stances. The account alleged that specific legislation supported by Mike Johnson may impact healthcare for transgender youth. This post also mentioned Congresswoman Marjorie Taylor Greene but mostly targeted Mike Johnson. In another post, the account claimed that Congresswoman Young O. Kim ‘voted against prohibiting state agencies from awarding contracts over $100,000 to companies that discriminate based on gender identity.’ This pro-LGBTQ+ stance is unusual for Spamouflage accounts: our manual review of the 10 key seeder accounts above found at least 22 posts that ridiculed or criticized the LGBTQ+ movement. Beyond LGBTQ+ issues, the account also criticized Young O. Kim for her stance on abortion. The image below appears to be a photoshopped version of a photo taken at a pro-life demonstration. Finally, the account criticized Ami Bera for his stance towards the 1984 anti-Sikh riots in India.

Figure 26: Three posts attacking lawmakers who have been critical of China in the past.

In terms of impact, the campaign falls in category 1 out of 6 on Ben Nimmo’s breakout scale. The campaign was limited to one platform (X) and did not manage to break out of its original bubble. Even though the account managed to gain over 3,000 views on some posts, we were unable to find evidence that the content reached any authentic users.

5. TikTok Case Study

Compared to X, monitoring and research on TikTok are much more laborious and time-consuming, as no reliable API is available for research purposes. For TikTok monitoring, we largely relied on snowball research, starting from a small number of accounts and then mapping their connections with the broader information ecosystem. We discovered several suspicious accounts that we believe to be connected to the PRC, but not all of them directly comment on the US election. In this section, we share the account most relevant to the US elections.

@usatoday01

On TikTok, we discovered one account (@usatoday01) with a relatively large following that impersonated a US-based mainstream media outlet: USA Today. @usatoday01 made 275 posts between June 13th, 2023 and July 16th, 2024. The account was either suspended or deleted shortly after VOA reported on it on October 18th, 2024. During this period, the account accumulated 10.6K followers and 86.4K likes. We believe the account was inauthentic and connected to the CCP for several reasons.

Figure 27: The fake USA Today TikTok account.
  1. The account attempted to impersonate the real @usatoday account. Apart from picking a similar name, it also used a similar logo and account description. A USA Today spokesperson confirmed to VOA that @usatoday01 is a fake account that was not connected to the news agency.
  2. According to our OSINT tool, the account was operated from inside the PRC. This is relevant because TikTok is blocked in the PRC, indicating that the account might have been given permission to access blocked websites.
  3. The content posted by the account showed a large overlap with the Spamouflage accounts we discussed in the X section above. @usatoday01 made several posts that criticized US support for Israel, pointed out US domestic issues such as gun violence, hyped Chinese technology, and refuted foreign criticism of TikTok, the Chinese economy, and human rights abuses in Xinjiang.
Figure 28: Three TikTok posts by the fake USA Today account that align with content posted by Spamouflage on X.

In terms of narrative, we found that the account posted at least 50 videos that mentioned at least one presidential candidate. Biden, Trump, and Harris were mentioned 38, 19, and 5 times, respectively. The account’s last upload was before Biden formally announced his withdrawal from the race, which explains the lower number of posts about Harris. The posts about Harris only discuss the possibility of her taking over the candidacy.

Similar to the Spamouflage posts on X, @usatoday01 amplified existing narratives that ridicule and criticize all presidential candidates. Our review found at least 13 posts that assert Biden is too old, has dementia, or has Parkinson’s disease. The account frequently uploaded videos that appeared to show Biden zoning out or misspeaking to emphasize that he was no longer fit for office. One of these videos obtained over 72K views. Furthermore, the account criticized Biden for his support for Israel and his tariffs on Chinese products. In one post, the account asked: ‘Why are Biden and Harris still doing TikToks if they think it’s a national security threat…?’

@usatoday01’s posts about Trump mostly amplified narratives surrounding the lawsuits he is facing. Our review found 8 posts that highlighted Trump’s legal issues. For example, on May 31st, 2024, the account took the opportunity to highlight that Trump had become the first US president convicted of a crime. On June 11th, 2024, the account reported on Trump’s pre-sentencing interview in New York. Other posts about Trump highlighted his skepticism about support for Ukraine and his statements about declassifying the Epstein files, two recurring themes in PRC information operations.

Figure 29: Three TikTok posts by the fake USA Today account that discuss presidential candidates.

Aside from posts about specific candidates, the account made over 20 posts criticizing US support for Israel and another 7 posts criticizing the police crackdowns on pro-Palestine demonstrations at US universities. Furthermore, @usatoday01 occasionally reported on other polarizing issues such as LGBTQ+ rights and immigration. On June 28th, 2024, the account posted about a controversial decision by New York’s Nassau County to ban trans girls and women from public sports facilities. In addition, after the Super Bowl parade shooting in Kansas City on February 14th, 2024, the account amplified disinformation that an illegal immigrant named Sahil Omar was behind the shooting, a claim that has been refuted by several news outlets including the AP and the BBC.

In terms of impact, @usatoday01 scores a 2 out of 6 on Ben Nimmo’s breakout scale. The campaign was limited to one platform (TikTok) but did manage to break out of its bubble. Even though the campaign obtained only 9,462 views on average, several posts went viral and broke out of the account’s initial bubble. One of the posts that ridiculed Biden obtained 76K views. Another post that compared Boston Dynamics’ (US) robot dog to a similar robot made by Unitree Robotics (PRC) obtained over 423K views. The comment section of this post showed several comments from authentic accounts that disagreed with the intended message of the post, indicating that the post broke out of its inauthentic bubble.

Figure 30: A sarcastic comment under one of the posts by the fake USA Today account.

7. Conclusion

Apart from a few incidents (the attack on the lawmakers, Magaflage), it appears that the accounts that we observed on both X and TikTok were more focused on painting an overall negative picture of the United States than on the elections specifically. Posts about the presidential candidates or the election as a whole made up only a small portion of the total content observed. Instead, many of the Spamouflage timelines were dominated by content related to the war in Gaza. The accounts preferred to amplify existing narratives and reuse existing content rather than create their own. This was especially true for the content related to Israel. Our most interesting finding is the contrast between the mainstream Spamouflage accounts and the smaller ‘Magaflage’ and ‘Attack on lawmakers’ campaigns. The mainstream Spamouflage accounts were pro-PRC, anti-Israel, anti-US, and occasionally anti-LGBTQ+. However, the accounts involved in the Magaflage and ‘Attack on lawmakers’ campaigns sometimes expressed completely opposite views just to achieve their objectives. They did not post any anti-Israel or pro-PRC content, and in some cases expressed pro-US and pro-LGBTQ+ views. This shows that the people behind these campaigns are well aware of the political context in the United States and are not just aimlessly posting content that aligns with CCP objectives. We can expect Spamouflage to take on many different forms in the future.

8. Footnotes

1: Jan 6, January, riot, election, vote, poll, box, ballot, voting, voter, fraud, machine, steal, rig, illegal, corruption, tamper, manipulate, fake, scam, cheat, unfair, bias, absentee, spoof.
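The keyword list above presumably served to flag election-related posts during the observation period. A minimal sketch of such keyword filtering is shown below; the function name, word-boundary matching, and case-insensitivity are our assumptions for illustration, not the project’s actual pipeline.

```python
import re

# Keywords from footnote 1, presumably used to flag election-related posts.
ELECTION_KEYWORDS = [
    "jan 6", "january", "riot", "election", "vote", "poll", "box", "ballot",
    "voting", "voter", "fraud", "machine", "steal", "rig", "illegal",
    "corruption", "tamper", "manipulate", "fake", "scam", "cheat", "unfair",
    "bias", "absentee", "spoof",
]

# One case-insensitive pattern; \b keeps "rig" from matching inside "right".
PATTERN = re.compile(
    r"\b(" + "|".join(re.escape(k) for k in ELECTION_KEYWORDS) + r")\b",
    re.IGNORECASE,
)

def is_election_related(post_text: str) -> bool:
    """Return True if the post contains any tracked keyword as a whole word."""
    return PATTERN.search(post_text) is not None

print(is_election_related("They will STEAL the ballot!"))  # True
print(is_election_related("Robot dogs are impressive"))    # False
```

Note that strict word-boundary matching misses inflected forms such as “rigged” or “stolen”; a real pipeline would likely combine keyword filters with manual review, as broad terms like “fake” and “bias” generate many false positives.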

9. Acknowledgements

This research was part of a joint election observation project with Voice of America (VOA). We are deeply grateful to the VOA journalists for their invaluable contributions in shaping the overall direction of this work and for their dedicated assistance in investigating specific cases. Between September and November, VOA published 12 articles and 5 videos based on this research. The links to all of these publications can be found below.

Articles

  • Yang, Lin. China’s influence campaign intensifies as US election nears. Voice of America. September 17, 2024 (EN, ZH).
  • Ma, Wenhao. China-connected spamouflage impersonated Dutch cartoonist. Voice of America. September 19, 2024 (EN, ZH).
  • Yang, Lin. Probe finds Beijing seeking to mislead, sow distrust ahead of US election. Voice of America. September 28, 2024 (EN, ZH).
  • Ma, Wenhao. China-connected spamouflage networks spread antisemitic disinformation. Voice of America. October 4, 2024 (EN, ZH).
  • Yang, Lin. On TikTok, AI-generated ‘Russian’ women deliver pro-China messages with sales pitch. Voice of America. October 9, 2024 (EN, ZH).
  • Ma, Wenhao. Chinese spamouflage campaign highlights US support for Israel. Voice of America. October 15, 2024 (EN, ZH).
  • Yang, Lin. 直击假信息:当美国之音遇上“美国之声”,揭示TikTok和X上的假新闻账户 [When Voice of America meets “Voice of America,” exposing impersonated news accounts on TikTok and X]. Voice of America. October 18, 2024 (ZH).
  • Ma, Wenhao. 直击大选假信息:解密中国网络“垃圾伪装”行动 [Election Disinformation Watch: Deciphering Spamouflage]. Voice of America. October 22, 2024 (ZH).
  • Yang, Lin. 直击大选假信息:中俄放大美飓风救灾错误信息,意图干扰大选 [China and Russia amplify false hurricane relief narratives to influence US elections]. Voice of America. October 27, 2024 (ZH).
  • Ma, Wenhao. Cryptocurrency promoters on X amplify China-aligned disinformation. Voice of America. October 29, 2024 (EN, ZH).
  • Ma, Wenhao. 直击大选假信息:中国虚假信息行动开始质疑美国大选的公正性 [Chinese Spamouflage accounts start questioning election legitimacy]. Voice of America. November 3, 2024 (ZH).
  • Yang, Lin. 直击大选假信息:中国官媒宣扬美国分裂,虚假信息行动相对沉默 [Chinese Spamouflage accounts relatively quiet as state media calls out US division]. Voice of America. November 7, 2024 (ZH).

Videos

  • Ma, Wenhao. 荷兰漫画家变身中国大外宣?中国水军的家长好操作 [A Dutch cartoonist spreading Chinese propaganda? A look into Spamouflage’s playbook]. Voice of America. September 27, 2024 (ZH).
  • Ma, Wenhao. 中国网络影响力行动“瞄准”美国对以色列的支持 [Chinese Spamouflage Campaign takes aim at US support for Israel]. Voice of America. October 17, 2024 (ZH).
  • Yang, Lin. 直击假信息:当美国之音遇上“美国之声”,揭示TikTok和X上的假新闻账户 [When Voice of America meets “Voice of America,” exposing impersonated news accounts on TikTok and X]. Voice of America. October 23, 2024 (ZH).
  • Ma, Wenhao. 究竟什么是中国的“垃圾伪装”行动?为什么专家说它很难被铲除?[What is Spamouflage? Why do experts say it is difficult to rout out?]. Voice of America. October 24, 2024. (ZH).
  • Yu, Zhou, Ma Wenhao, Yang Lin. 时事大家谈:美国之音如何直击大选假信息?中俄如何二重唱,挺中批美?[How does Voice of America tackle election misinformation? How do China and Russia sing a duet, supporting China and criticizing the US?]. Voice of America. November 13, 2024 (ZH).
