Post-Truth Practices Online and the “useful idiots”

Elena Gk
Published in Find Out Why
Jan 13, 2022 · 4 min read

An interview with Maryia Ditchkowska

Maryia Ditchkowska, lecturer at the European Humanities University in Lithuania, talks about post-truth practices in political communication and their digital manifestation during events of great uncertainty.


Q: What does “post-truth” mean in political communication?

The term describes the presence of multiple truths rather than a single factual truth. This concept justifies the existence of different narratives and different understandings of the same situation.

Post-truth practices can be used as a psychological tool to confuse people who respond more emotionally or who identify with closed communities.

Q: How is misinformation in social media explained in this post-truth context?

Misinformation happens when people unknowingly share disinformation on social media. I think of it in terms of the expression "the useful idiot." By spreading lies unknowingly, some individuals become the "useful idiots" for the disseminators of disinformation.

Q: What happens once the person realizes that they have just contributed to spreading a lie?

They usually do not realize it. Unless they are explicitly trying to gather new information or to get other points of view, many individuals think they are doing a good thing by sharing an article with their network. They often exist in their own ideological or political bubble.

Q: Do you think big tech companies do enough to prevent misinformation from spreading on their platforms?

Although there are many misinformation disclaimers throughout social media, they are usually triggered by English keywords, like "COVID" or "vaccination." When it comes to non-English-speaking communities, there is a lack of well-integrated tools.

Q: What changes would you like to see in the research of this field?

I participate in a think tank that focuses on disinformation patterns in the Baltic States and Poland, and this is something we have discussed quite a few times.

There are really good initiatives involving media literacy campaigns, interactive games or tests, and other awareness campaigns. However, for individuals to participate in those initiatives, they have to be interested in the subject matter in the first place.

How can those initiatives reach the people who do not question themselves, who do not question their sources, or who do not listen to alternative points of view?

Q: A common view regarding disinformation (the intentional spread of lies) is the concept of "propaganda." In your work, you have identified this term as problematic with regard to the digital information ecosystem. Why is that?

The problem with the term "propaganda" is the historical context attached to it. When we talk about propaganda, we often think of the Cold War. It is a 20th-century concept, in which propaganda is mostly understood as mal-information distributed by official state-run media.

Today, access to different information has evolved. A traditional propaganda campaign coming from the state media would not have the same effect because there is so much alternative information available.

At the same time, different tools and different approaches to disinformation have evolved. They now include a more fine-tuned process that relies on online dissemination.

The term "propaganda" has, in a way, outlived its usefulness. It cannot capture the evolution of our society and our political communication.

Q: How was online content regarding the pandemic influenced by post-truth practices?

The pandemic was, and still is, a massive challenge for citizens, governments, and the medical response. Those challenges created fertile soil for conspiracies and alternative explanations.

An interesting example is the very elaborate and complex campaign by Russian state media to devalue Western vaccines. The general population acted as misinformation agents, but with an outcome that ran counter to the state media's interests: instead of creating a strong preference for the domestic vaccine, the campaign activated a general vaccine hesitancy that also created an anti-vaccine problem for the government.

Q: In your research are there other instances of failed disinformation attempts by state actors?

After the Belarus elections, there was a complete internet shutdown in the country for three or four days. However, people used VPNs and Tor to access the information they were looking for and to organize themselves locally. Nobody can fully control the media space, not even governments. For many people, having their media access restricted only encourages curiosity.

Q: What platform did the protestors use in Belarus to communicate?

The platform used was mainly Telegram, which at the time took additional steps to ensure user anonymity so that the government would not be able to break into accounts, even in the case of arrests.

Q: What is your research interest currently as a lecturer at The European Humanities University?

I am very interested in the psychological implications of the representation of events in different types of media.

To give you an example, when I focused on the representation of the Belarusian protests in mainstream Russian media, I analyzed three distinct groups of media: official/state media, independent media, and Western media in the Russian language (such as Deutsche Welle, Euronews, or BBC News).

I observed that state media narratives focused on legislation and kept a detached representation of the protests. The independent media reports, however, had a more emotional tone, focusing on the violence the protestors experienced and the solidarity among them.
