The Aesthetics of Misinformation.

Zena Corda
Designing Fluid Assemblages
4 min read · Dec 3, 2020

An exploration in developing new forms of the Aesthetics of Misinformation that provoke negotiation and reflection.

Introduction

This project looks into the relationship between misinformation and freedom of speech in the space of social media. Social media has become the most relevant and accessible way for people to access the latest information and news. The problems being uncovered in this space are huge, as it also involves factors like censorship of information, ethics, content moderation, and the security and wellbeing of the people accessing it online. But in this time of excessive availability of information, we are also confronted with mass amounts of misinformation. This information may spread intentionally through trolls, fake accounts, and bots, or unintentionally through naive people using various digital platforms.

Through this project, I have tried to understand the situation of misinformation on social media platforms and what companies are doing to curtail it, while critically looking at their approaches in order to find alternative ways of understanding and designing for misinformation: why it is created and the impact it has on people. Ultimately, the project moves into the space of designing for misinformation and how design methods might help us ideate different forms of expression for it. During the course of this exploration, I have tried to look at misinformation as a thing with a goal and explored various situations that can help talk about this subject, ultimately to create new opportunities and understandings of how we can design for it and how it can be expressed to create a safer and more informed society.

Aim

The goal of these explorations is to understand what the aesthetic of misinformation should be and how that aesthetic can help create new forms of interaction that enable us to experience information in new ways: alternative ways of protecting people who encounter such information, while also provoking people to reflect before they put information online.

This project is interested in finding new forms of expression for misinformation. Rather than deleting, blocking, or censoring the information, the aim is to empower people to know what it is they are looking at and whether it is trustworthy, while also shifting some of the responsibility from the social media companies to the people accessing the information.

Can social media platforms help people to be more reflective about what they post?

Can misinformation have aesthetic qualities that provoke us to think about the quality of information that we are accessing?

Instead of misinformation spreading fear, can it have the agency to express its authentic self?

All of these questions are challenging to answer. As I have navigated through this project, I have come to the conclusion that there is no one perfect solution for how speech should or shouldn't be used on social media with regard to misinformation. My project is more about amplifying what this speech says or represents and how people can make informed decisions when viewing that speech.

The overall approach that this project takes is that of an explorative process. Since this topic cannot have clear answers, there need not be one solution that addresses the subject. This process and project aim to provoke conversations about how a design for such a complex experience should exist.

Disclaimer

This project starts to open up questions about content moderation. As I had to narrow down the scope, I explicitly wanted to work with form and expression and the experiential qualities they can bring about.

Experiment 1- The Glitch

Social media is built on the foundation of freedom of speech and expression, but sometimes, because of misinformation, it is just noise cluttering our interfaces, causing confusion and distress.

This exploration challenges that freedom and sets boundaries for users. What if you could post what you wanted, but it came with a consequence? Platforms can enable you to post the content of your choice, but that doesn't mean harmful content needs to get the publicity you desire.

What if you could experience the noise in your information and had to choose between tuning out the noise in your post and the amount of visibility your post could have?

How does it work?

Once the system detects misinformation, it applies a distortion effect to the post. The user then has the choice between the clarity of the post and the amount of visibility the post will have among their followers. The lower the distortion, the fewer people have access to the post.

This exploration is done in the form of a new product launch by Twitter.
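To make the trade-off concrete, here is a minimal sketch in Python of how such a mechanism might behave. The detection step, the linear mapping between distortion and reach, and the glitch character are all assumptions for illustration; the project does not prescribe an implementation.

```python
import random

# Hypothetical detector: stands in for whatever flagging a platform might use.
def flag_misinformation(text: str) -> bool:
    suspicious = ["miracle cure", "they don't want you to know"]
    return any(phrase in text.lower() for phrase in suspicious)

# The user picks a clarity level between 0.0 (fully glitched) and 1.0 (untouched).
# Visibility moves in the opposite direction: the clearer the post, the smaller
# the share of followers who will see it.
def apply_tradeoff(text: str, clarity: float):
    distortion = 1.0 - clarity
    visibility = distortion
    glitched = "".join(
        ch if random.random() > distortion else "▒" for ch in text
    )
    return glitched, visibility

post = "This miracle cure works overnight!"
if flag_misinformation(post):
    shown, reach = apply_tradeoff(post, clarity=0.8)
    print(shown)
    print(f"Shown to roughly {reach:.0%} of followers")
```

The point of the sketch is only the coupling: choosing full clarity drives reach toward zero, while accepting full reach turns the post into pure glitch.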

Experiment 2- Meaningless words

Can we give words less power than they currently have, especially if the goal is to create harm, cause havoc, and deceive people?

How does it work?

As a mis/disinformed post goes viral, it starts to distort: sentences slowly break down into words, and the words slowly separate into letters, to the point that it no longer looks like the original post even though it is.

This will enable people to understand that the content they are looking at may be problematic, or even enable them to ignore it. Ultimately, this ignorance would also give less power and importance to the people posting such information.

Information is only powerful when there is meaning and emotion attached to it, but if we strip it of its meaning, then we take away its power.
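As a rough sketch of this decomposition, the snippet below ties the degree of dissolution to a hypothetical virality score. The thresholds and the shuffling are illustrative assumptions, not part of the project's specification.

```python
import random

# virality is a hypothetical score in [0, 1]: 0 = barely shared, 1 = fully viral.
def dissolve(text: str, virality: float) -> str:
    if virality < 0.33:
        return text                        # still readable sentences
    if virality < 0.66:
        words = text.split()
        random.shuffle(words)              # sentences fall apart into words
        return " ".join(words)
    letters = [ch for ch in text if not ch.isspace()]
    random.shuffle(letters)                # words fall apart into letters
    return " ".join(letters)

post = "Breaking: this shocking claim is definitely true"
for v in (0.1, 0.5, 0.9):
    print(f"virality {v}: {dissolve(post, v)}")
```

Nothing is deleted; the post is only progressively stripped of the structure that carries its meaning, which is the point the experiment makes.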

Conclusion

The idea of these explorations is not to provide a concrete solution for a complex topic, but rather to open up and provoke conversations about how speech and information are shared online and what consequences need to exist when people in society can get hurt.

Does freedom of speech mean you are entitled to popularity and publicity by enabling these ideas to go viral? Any publicity, good or bad, is still publicity, and maybe that is the power that misinformation shouldn't have.

What questions do these explorations raise for you?
