Veeresh’s Preface: A High Level Overview of Misinformation and Social Media Addiction

Veeresh Neralagi
The Misinformation Project
5 min read · Jan 14, 2021

I despise the sad fact that nowadays, information, whether opinionated or scientific, is treated as an attack on one’s personal beliefs. With social media platforms being the obvious culprit, I think it is important to discuss the methods and means by which this issue grows larger every day.

Deny it or don’t, but every notification, every comment on your picture, every like on your post, and every feed refresh has you drooling for more. As they say, ignorance is bliss: while we think we are “aware” of our social media addictions, we completely disregard their severity. Beyond addiction, another problem is how easily we are misled and sent down a single narrative path. We are oblivious to how information is presented to us on social media, from post and user recommendations to the different types of ads we are shown.

A 5-minute video of Chamath Palihapitiya, a former growth executive at Facebook, discussing the ways in which social media is ‘ripping society apart’ and how users are continuously being ‘programmed’.

There are a multitude of aspects I want to discuss at a high level, but to start, I think it is important to understand what takes place behind the scenes and how these platforms are engineered, from both a micro and a macro perspective.

As an engineer at any given social media platform, your work is driven by user engagement. Strategies to increase engagement include timed notifications, timely non-disruptive ads, and a seamless onboarding process. Now, with the assistance of state-of-the-art AI and machine learning, a platform can recognize patterns in user activity and engagement, which in turn tell it how and when to send notifications, alerts, and timed ads.
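To make the idea of activity-pattern-driven notifications concrete, here is a minimal sketch. It only illustrates the basic principle described above; real platforms use learned models, and the function name, data shape, and fallback hour here are all my own illustrative assumptions, not any platform’s actual code.

```python
from collections import Counter

def best_notification_hour(open_timestamps):
    """Pick the hour of day when a user most often opens the app.

    open_timestamps: list of (day, hour) tuples from past sessions.
    This hand-rolled heuristic stands in for the learned models a
    real platform would use to time alerts to observed activity.
    """
    hours = Counter(hour for _day, hour in open_timestamps)
    if not hours:
        return 12  # no history yet: fall back to a default midday slot
    return hours.most_common(1)[0][0]

# Example: a user who mostly checks their phone in the evening.
history = [(1, 21), (1, 8), (2, 21), (3, 21), (3, 13), (4, 21)]
print(best_notification_hour(history))  # most frequent hour: 21
```

Even a crude frequency count like this hints at how little data is needed before a platform can start nudging you at your most susceptible moments.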

In this end-to-end cycle between user and engineer, the user believes they are willingly spending countless hours on a platform simply because they choose to; in reality, their behavior is being controlled, whether they want to admit it or not.

We use our phones the moment we wake up, in the bathroom, while eating, during work, at the gym, while driving; the list goes on and on. The average adult in the US spent 3 hours and 43 minutes a day on their phone in 2019, and last year alone, average daily phone time grew by another 6 minutes, bordering on half a workday consumed by your cell phone [1].

Looking at it from the engineer’s side, perhaps the most valuable question to ask and answer is: how much of a user’s time and attention during a single session can be captured and optimized? To answer this, it is important to first look at the average human attention span and to study what types of content and visuals increase engagement.

A 2015 Microsoft study [2] found that, as of 2013, humans have an average attention span of only 8 seconds (down 4 seconds from 2000), compared to 9 seconds for a goldfish. This gives engineers a tiny window in which to deliver the most engaging content possible before the next 8 seconds of attention begin, which pushes them to automate this endlessly recurring cycle. The scary part is that with the power of AI and machine learning, this process can not only be automated but continuously improved. It is as if someone were behind you taking notes during your phone sessions and engineering a more potent way to increase your session time.

Within this process, the factor that either prolongs or ends a user’s session is the content that is shown. Usually, this content is chosen based on a user’s previous activity and how long they view certain content. A user’s sex, age, and location, and the demographics of whom they follow, also dictate the type of content shown. Plenty of factors shape a content-suggesting algorithm, but I think it is far more important to discuss the consequences of how that content is chosen.
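The signals just listed can be sketched as a toy ranking function. To be clear, this is my own illustrative guess at the shape of such a system: the weights are made up, the field names are hypothetical, and real platforms learn their ranking models from data rather than hand-tuning them.

```python
def score_post(post, user):
    """Toy relevance score combining the signals mentioned above:
    past dwell time on similar topics, whether the user follows the
    author, and regional overlap. Weights are invented for illustration.
    """
    dwell = user["avg_dwell_seconds"].get(post["topic"], 0.0)
    follows_author = 1.0 if post["author"] in user["follows"] else 0.0
    same_region = 1.0 if post["region"] == user["region"] else 0.0
    return 0.6 * dwell + 0.3 * follows_author + 0.1 * same_region

def rank_feed(posts, user):
    # Highest-scoring content first: the loop that keeps a session going.
    return sorted(posts, key=lambda p: score_post(p, user), reverse=True)

# Example: a user who lingers on politics sees politics pushed to the top.
user = {"avg_dwell_seconds": {"politics": 30.0, "sports": 5.0},
        "follows": {"@a"}, "region": "US"}
posts = [{"topic": "sports", "author": "@b", "region": "EU"},
         {"topic": "politics", "author": "@a", "region": "US"}]
ranked = rank_feed(posts, user)
```

Notice the feedback loop even in this toy version: dwelling on a topic raises its score, which surfaces more of it, which raises dwell time further. That self-reinforcement is exactly the single-narrative funnel discussed next.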

The article “How Facebook Stole Your Psychological Profile” by Susan Krauss Whitbourne, Ph.D. [3], contains a quote that really concerned me: ‘people of higher status (or at least those living in high GDP countries), are more likely to have outgroup biases, greater anxiety about people from groups other than their own, and higher levels of prejudice.’ Can you believe this irony? The people who are more educated and informed are also the most biased and insecure? Clearly, knowledge isn’t everything. There is more to it: how information is portrayed, and to whom. Now that I think about it, it is no surprise to me that the Capitol was recently breached, or that fake news spread through mediums such as WhatsApp contributed to multiple lynchings in India in 2018. I am not letting myself off the hook either: I too have succumbed to these algorithms and to the one-sided narrative I was presented with throughout most of my time on social media.

A world driven with a single narrative is a world driven with ignorance.

Before the technological revolution, humans in all parts of the planet held different beliefs and perceptions of the world based on their cultures and ideals. People traveled from continent to continent, discussed their differences, innovated, and learned more about the world. The issue with social media is that at a time in human history when everyone on this planet, regardless of which hemisphere they live in, is interconnected through technology, and more about the earth is known than ever before, these platforms still tend to immediately divide groups of people and send them down a single narrative path. The result is catastrophic: people today tend only to choose sides and constantly humiliate those with different opinions.

Solving this issue is not easy whatsoever. The psychological effects that social media has had on us over the last decade are almost irreversible. Danah Boyd of Microsoft Research puts it perfectly: “brains are being rewired — any shift in stimuli results in a rewiring… the techniques and mechanisms to engage in rapid-fire attention shifting will be extremely useful” [2]. But how those techniques and mechanisms are implemented without being too invasive is the real question.

Humanity has always risen to solve external problems, whether through a renaissance of technology and art, sensational leadership, or through just pure hope and belief. For the deadly problems of misinformation and social media addiction, it will take more than just a computer algorithm to solve them. Rather, it will take a collective world effort to incentivize human progress again.

Links/Sources

  1. https://elitecontentmarketer.com/screen-time-statistics/
  2. https://dl.motamem.org/microsoft-attention-spans-research-report.pdf
  3. https://www.psychologytoday.com/us/blog/fulfillment-any-age/201804/how-facebook-stole-your-psychological-profile
