By 4edges — Own work, CC BY-SA 4.0

The Sea of Shadows

Bahadir (transistorphilosopher) · 17 min read · Dec 28, 2021


A subjective study on the effects of algorithms on the self

To what extent do algorithms affect the construction of the self, and how do we get addicted to algorithms?

Perhaps because of the abundance of dystopian masterpieces among works of science fiction, it has become tedious for me to see authors lose themselves, or sometimes even drown, in dystopian representations in many texts, especially those about robots or autonomous systems. I think this is a consequence of the fact that the human desire to create conspiracy theories cannot be suppressed. Many blog posts, popular science articles, videos, podcasts and so on can be found to support this notion. Most of them are well executed, convincing and rational. In general, the main objective of this type of content is to provide evidence supporting the plausibility of the dystopia ultimately presented, and to show that it could happen in the near future. There is nothing wrong with this, of course, but this type of content leaves one with the lingering feeling that the dystopian representation was not created in light of collected evidence; rather, the representation came first and supporting evidence was collected later, with some of it perhaps intentionally left out. While this lingering feeling can be quite pleasant in literary works, when it comes to content that strives to deliver opinions, takes and views on scientific, technical and technological issues, it comes off as artificial at best, no matter how convincing, scientific and rational it may be.

Therefore, in the name of standing out a little more, I will intentionally not delve into "controlled social engineering", even though I will dance around it, because this topic lends itself all too well to losing yourself in analyses and predictions built on "evil corps".

Assimilation Marathon

Right here, it feels appropriate to touch upon this desire to stand out, because I feel that the desire to stand out or be distinctive is a concept approaching its expiration date with every passing day. Even though I'm not at an age or maturity level to draw comparisons between decades, my peers ('80s kids) will confirm that while individuals in our social circles once boasted about their distinctiveness, nowadays people are in a race to blend in with all their might. You don't have to be a senior citizen to reach this conclusion. It's enough to delve a little into the concept of FoMO (Fear of Missing Out; Herman, 2000).

An article published in 2014 (Matrix, 2014, 128–129) tells the story of a young woman who binge-watched five seasons of Breaking Bad in one day, just so as not to be left out of the fandom. This brings to mind the new Netflix feature of watching at 1.5x speed, which must have been introduced for people who want to have watched a show rather than to enjoy it.

It is obviously not a new behavior to strive to be someone who embodies the culture of a certain minority in order to be accepted by that minority. However, what I want to emphasize here is not this behavior, but my view that these minorities are disappearing within techno-social systems, losing their characteristics inside a single majority. Lately, I have been seeing far too many people worn down by their inability to catch up with the mainstream, which has achieved a terrific flow and motion thanks to technology. In a way, it is technology itself that reveals this scenery. The pandemic, the inescapable reality of the last two years and undoubtedly the topic most deserving of the public eye, has acted as a litmus test for this phenomenon. Throughout this time of information diarrhea from every platform that has declared itself a news source, we have all been exhausted by the daily struggle to pick out valid information on this vital issue and thus gain some sense of our circumstances. Everyone I know, including myself, has grown tired of this struggle.

A lot of people around me and I have invented new pastimes and hobbies recently, thinking that the way to alleviate mental fatigue is to stay away from things that tire the mind, just as the way to alleviate physical fatigue is to lie down and rest. I have started reading more, but instead of looking around and thinking "What is everyone reading?", I set myself the mission of witnessing the beginning and end of fantasy literature, one of my favorite genres, by reading its best samples. On this interesting quest, I had the privilege of getting to know George MacDonald, William Morris, Lord Dunsany and Patrick Rothfuss, a contemporary of ours. Video games are another area of great value to me. Instead of forcing myself to like whatever everybody likes and plays, I have always strived to find the games that appeal the most to me, even when they are in the most obscure corners, and that's what I've done this time around. I have even started developing my own games because I couldn't find exactly what I had in mind. When I had any time left, or when I had no energy for anything else, I also consumed content on on-demand streaming platforms like everybody else. There, too, instead of following the "most watched" list, I always tried to pick out something that was either similar to content I had previously enjoyed or something different that I had no idea about, though I wasn't very successful. This was not only my experience; those around me and those I deem worthy of following on social media did the same, I know. We have messed around in accordance with our own whims, in the cool shadows of our own personalities.

But it seems the situation was very different outside the social circles I am exposed to. I want to talk about this by mentioning a few situations that stood out to me in the realms of literature, cinema and TV, and video games, which I immersed myself in during the course of the pandemic, which, as I noted above, has acted as a litmus test. During the pandemic, I started seeing books that were bought only because their themes were outbreaks, pandemics and viruses, in addition to best-sellers like Dean Koontz's The Eyes of Darkness and Sylvia Browne's End of Days, whose authors were forcefully declared oracles. We all saw movies and TV shows become popular and appear on Netflix's most-watched list only because they were based on viruses, even when they weren't documentaries. On Steam, the 2012 game Plague Inc. and the 2014 game Pandemic: The Board Game, both pandemic-themed, generated record revenue during this period and became two of the most-played games.

Plague Inc. — steamcharts
Pandemic — steamcharts

As these charts confirm, evidently only a minority, myself and a few of my acquaintances included, have been lucky enough to have the freedom to release ourselves from mental fatigue. In fact, there existed a majority who, rather than discharging their mental fatigue and saturation, wanted to feel every moment of the mainstream in their veins, to feed off the information-diarrhea-ridden sources without stopping to breathe, and even when they had time for hobbies and leisure, in other words for themselves, chose to fill it with something related to current events. This is a majority that aspires to be as exposed as possible to popular topics. A majority who are in a race to drink in popular topics, not one glass at a time but by the gallon, and when they cannot drink any more, to bathe in them, to tie a rock to their feet and throw themselves into the sea of popular topics. The desire to be at the center of the mainstream, the struggle to be as much like everyone else as possible, to break the record in the assimilation marathon.

When the poet said, “I think now you are like everybody else,” he likened being like everybody else to “a non-magical breath” and “a voice with faded echoes,” and set it as the precondition of being forgotten. To be easily forgotten, in other words to have no trait left worth remembering, has nowadays become a goal to run after until you throw up.

The Absolute Character and the Sea of Shadows

And who is this everyone that everyone's trying to become? Is it that we are actually nobody, but we construct that "everybody" out of the roles designated to us? Is it the figure that FoMO's victims believe to live behind the storefronts of social media as they enviously gaze through them? Who is this absolute character that we always approach but never truly become, that has turned into a sort of ideal self? Who created this character? That last one is a trick question that would lead us down a rabbit hole of conspiracy, so I will not venture there. In fact, I will not do more than ask these questions about the absolute character, since I do not know the answers either, at least for now. In my opinion, if there is a question more important than who this ultimate character is and who designed it (and I don't believe it has been created with much intention), it is this: why do we feel compelled to chase after this character?

I guess it would be in bad taste to talk about concepts like the ideal self and chasing after it without mentioning Maslow. In his well-known hierarchy of needs, Maslow placed self-actualization as the ultimate goal of a human life (Maslow, 1943). The renowned psychoanalytic theorist Karen Horney, whose arguments make up a significant portion of the backbone of this article, distinguishes between the real self, which denotes our current circumstances, and the ideal self, which constitutes a model for the real self, defining who we would want to be. These two important theories make clear that the desire to actualize one's ideal self is in our nature. Even though it is possible to derive some satisfaction from the lower steps of the hierarchy, humankind cannot help but dedicate a lifetime to this last step. From the point of view of a conspiracy theorist, we could say: however a person's ideal self is defined, that is what they will live for.

And how do we define our ideal self? If we are going to spend our lives chasing after it, we should at least be able to define it. Who is the person that we want to be? If we were to ask ourselves now, would we know the answer? This is not as simple a question as “Where do you see yourself in 5 years?”, which we might answer during a job interview. What are the characteristics of our ideal self, how does it act, how does it react? Is it more emotional than our real self, or more carefree, more rational? How far are we from our ideal self? And how far are we from our real self from 5 years ago?

I will assume that we can all produce some kind of answer, satisfying or not, to these questions, and move on to the next one: to what extent have we used building blocks obtained from social media platforms, those impenetrable fortresses of algorithms, in constructing these answers? In other words, how effective are social media platforms in shaping our selves? Many studies have probed this question. In a thesis published in 2016 (Isaranon, 2016), the results of a study in Thailand provided evidence supporting the notion that Facebook might serve as a platform where social narcissists feel their best and come closer to their ideals. The study also showed that the need for self-respect (the step right before self-actualization in the needs hierarchy) partially affected this process, and that Instagram, in addition to Facebook, could be used to address the need to affirm one's ideal self.

Many studies like the one cited above can be found in the literature with search terms such as "social networks and the ideal self", but I think it's unnecessary to put much effort into proving how big a role algorithms, which we interact with even more than the people physically around us, play in shaping our ideal self. We may type more sentences into search engines daily than we utter out loud. Most of us probably have to explain ourselves more often not to another human being, but to search engines that are far less capable of understanding. And this is not limited to search engines: any time we pick up our phones, turn on our computers, probably even any time we plug in any device that has a transistor in it, we end up having to explain ourselves to an algorithm. There are so many of these devices that, for most of us, I'm sure the time we are not exposed to an algorithm makes up the smaller portion of the day. In fact, worse than being "exposed" to an algorithm, we are variables in it. Even if it is only to execute small functions, it takes us inside, makes us a piece of the algorithm and forces us to comply with it. We have to explain ourselves to these algorithms more often than to real people, and as if that were not enough, we constantly play by their rules. Each instance may be something insignificant, like warming something up in the microwave or turning up the thermostat to heat a room, each one only an insignificant moment of the day, but they are so prolific that the time we spend without complying with the commands of a premade algorithm is perhaps almost nonexistent.

Not every algorithm we are exposed to is restricted to the realm of "tools". For a person who makes a living as a YouTuber, YouTube's recommendation algorithm is now a purpose. They have dedicated their life to the commands of that algorithm, since they have to in order to climb even the first steps of the needs hierarchy. Search engine algorithms hold a similar hegemony over those whose standard of living depends on the revenue they bring, and recommendation algorithms over those whose income largely depends on sales through Instagram.

Alongside the algorithms we are exposed to in order to meet our basic needs or earn a living, algorithms also run to our aid to meet our need for entertainment and aesthetic pleasure. There are many approaches to developing these recommendation systems, but as someone who writes, uses and manages such algorithms, I can say that ultimately most of these approaches evaluate you based on those who are like you; in other words, they generalize you. For instance, a popular approach works on the premise that you will very probably like something that you have not yet consumed but that other users with tastes like yours have liked, and it recommends that content to you. It does the same for the other users, and over time your tastes and the tastes of those who have consumed similar content become one.
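To make that "those like you" premise concrete, here is a minimal sketch in Python of the kind of user-based collaborative filtering described above. The users, titles and the Jaccard similarity measure are my own illustrative choices, not the internals of any particular platform; production recommenders are far more elaborate, but the generalizing logic is the same.

    from collections import defaultdict

    def jaccard(a, b):
        # Overlap between two users' sets of liked items (0 = nothing shared, 1 = identical).
        return len(a & b) / len(a | b) if (a or b) else 0.0

    def recommend(target, likes, top_n=5):
        # Score every item the target has not consumed yet by how similar
        # the users who liked it are to the target, then return the best ones.
        target_likes = likes[target]
        scores = defaultdict(float)
        for other, other_likes in likes.items():
            if other == target:
                continue
            sim = jaccard(target_likes, other_likes)
            for item in other_likes - target_likes:  # only still-unseen items
                scores[item] += sim
        return sorted(scores, key=scores.get, reverse=True)[:top_n]

    # Toy data: users mapped to the titles they liked (all names are made up).
    likes = {
        "ayse":  {"Breaking Bad", "Dark", "The Witcher"},
        "murat": {"Breaking Bad", "Dark", "Black Mirror"},
        "elif":  {"The Witcher", "Black Mirror", "Arcane"},
    }

    print(recommend("ayse", likes))  # -> ['Black Mirror', 'Arcane']

Notice that, by construction, every recommendation is something a neighbor with overlapping tastes has already consumed; the system can only ever hand you back what circulates among people it judges to be like you.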

For instance, right now, as I'm writing this and listening to music, on a distant server a recommendation system is working to decide for me what song I will listen to next. I don't know what the next song is going to be, and like many of us, I will listen to whatever it plays for me, because I am buying the luxury of not bothering to create a playlist like I used to. Compared with a playlist I would create myself, the list the recommendation system creates for me is less repetitive. Still, it plays the same stuff more than once, if relatively rarely, and I find myself replaying those songs in my mind later, even if I don't particularly like them. While it is a very comfortable way to make new discoveries, sometimes I find myself enjoying something that was dictated to me and is not really my cup of tea. It reminds me of the metalheads who came down to the beaches on vacation and, having been exposed to pop music over and over, would inadvertently start enjoying a song that was now playing for the 400th time.

Having similar tastes to another person is not an indicator that you will make similar choices in the future. I am sure you know many people who are completely unlike you despite sharing some of your tastes. You might have walked down the same path together in the land of aesthetics, but recommendation algorithms might be taking away your freedom to follow different paths come morning.

The movies and TV shows we watch, the books we read, the games we play, and many more things that exist on similar platforms work to create a channel of consumption based on our choices. In the sea of ever-extending possibilities, the ships navigated by algorithms sail into the vastness they deem out of our reach and bring back what they caught to us, waiting on the shore, exhausted by our laziness. However well-meaning the captain of this ship might be in presenting us with what it thinks we might like, however much it thinks it knows us, it is not we who are the captains of these ships. We only know the sea by what the captain has brought us. What the algorithms present to us reflects the world out there about as much as the shadows on the wall in Plato's familiar allegory of the cave.

Art illustration by Tell Your Children display at Esplanade Tunnel — photo by Choo Yut Shing

We see that algorithms may play an effective role in designing our ideal selves. There is no doubt that social media platforms are where we are most exposed to these algorithms. Then how do we become so addicted to social media that we give it the power to interfere in the design of our ideal self? Do we allow this, or are we unable to stop it? Why aren't all social media users addicted to this extent? Why do the addicts have a hard time breaking free?

Karen Horney, who developed one of the most familiar definitions of neurosis, says that the self of the neurotic is divided between the real self and the idealized self, which leads to the feeling of not living in accordance with one's ideal self. These people oscillate between hatred for their real selves and a false perfectionism directed at their ideal selves (Horney, 1950). Horney, who asserted 70 years ago that a falsely constructed ideal self could have these kinds of neurotic consequences, stated that the coping mechanisms called upon to address the "anxiety" that is a symptom of neurosis can be overused and misconstrued as needs. In 1945, she listed 10 neurotic needs, which can be organized under three main headings (Verywell Mind, 2019):

  • Needs that move you towards others, which are needs that cause individuals to seek affirmation and acceptance from others and lead to them being described as needy or clingy.
  • Needs that move you away from others, which are needs that create hostility and antisocial behavior and lead to such individuals being described as cold, indifferent, and aloof.
  • Needs that move you against others, which are needs that result in hostility and a need to control other people and lead to them being described as difficult, domineering, and unkind.

It shouldn't be hard to guess how effective a place social media could hold in satisfying the first group of needs in particular. Social media platforms that people turn to in hope of affirmation and acceptance may sometimes, or maybe usually, produce rejection and exclusion as well. In those moments, it is inevitable that the relationship with the platform turns into a positive feedback loop that exacerbates anxiety and depression rather than helping to cope with them.

Art illustration by Tell Your Children display at Esplanade Tunnel — Choo Yut Shing

Many cross-sectional studies exist that verify this correlation between social media and anxiety and depression. For instance, a survey carried out in 2014 with 1,787 participants between the ages of 19 and 32 showed that social media use strongly correlated with depression (Lin et al., 2016), and a 2013 study of 447 Turkish students concluded that severe depression, anxiety and insomnia correlated with Facebook addiction (Koç & Gülyağcı, 2013).

Dehumanization: The Robotized Human

So far, we have seen that algorithms play an effective and molding role in meeting our vital needs, in our social life, in our sense of entertainment and aesthetics, and even in our neuroses. We are talking about algorithms marketed with the promise that they have been designed for us, yet which end up dictating the rules of life for most people. Then our oracles come forth, painting pictures of distant futures where artificial intelligence takes over the earth. This concept, boiled down to the two words "AI takeover", which has led to endless hours spent in the worlds of literature, film and games, can be roughly summarized as "artificial intelligence becoming the dominant intelligence in the world." I'm sure every possible prophecy on this has been made, every analysis and breakdown presented. What I want to emphasize, however, is an approach that has been written about relatively less. This approach is based on the argument that rather than AI developing to the level of human intelligence and taking over the world, humanity will deliver the world to AI by its own hands at the end of an insidious process, through the regression of human intelligence to the level of artificial intelligence.

Detail of the ceiling of the Sistine Chapel — jaci XIII

I will publish a second part of this article building on this subject, but first I want to preface it here. As I have done so far, I will do my best in the second part to separate the arguments I present from my own subjective views and to support them with studies from the scientific literature. Otherwise the article would not go beyond literary cyberpunk and dystopian sci-fi content, and even though I pride myself on being a reader of such works, as a writer that is something I would neither presume nor aspire to do. I will stay in this lane and look for the answers to the following questions in the second part:

  • While we watch robots become more humanized, is it possible that at the same time the technological systems we are exposed to are robotizing us?
  • Is it possible that the point that robots are expected to reach is actually moving toward them at a much higher speed than their rate of advancement?
  • Could it be that we humans are becoming robotized faster than robots are becoming humanized?
  • How likely is it, that before we see a robot pass the Turing test, we see a human fail it?

As I have mentioned before, this subject has been studied relatively little. But I want to indicate in advance that part two will be critically influenced by the works of Brett Frischmann, a professor of business and economics at Villanova University and an academic affiliated with the Stanford Law School Center for Internet and Society.

References

  • Matrix, Sidneyeve. “The Netflix effect: Teens, binge watching, and on-demand digital media trends.” Jeunesse: Young People, Texts, Cultures 6.1 (2014): 119–138.
  • Herman, Dan. “Introducing short-term brands: A new branding tool for a new consumer reality.” Journal of Brand Management 7.5 (2000): 330–340.
  • “Karen Horney’s Theory of Neurotic Needs.” Verywell Mind, 26 November 2019, http://verywellmind.com/horneys-list-of-neurotic-needs-2795949. Accessed 18 November 2021.
  • Lin, Liu Yi, et al. “Association between social media use and depression among US young adults.” Depression and anxiety 33.4 (2016): 323–331.
  • Oberst, Ursula, et al. “Negative consequences from heavy social networking in adolescents: The mediating role of fear of missing out.” Journal of adolescence 55 (2017): 51–60.
  • Koc, Mustafa, and Seval Gulyagci. “Facebook addiction among Turkish college students: The role of psychological health, demographic, and usage characteristics.” Cyberpsychology, Behavior, and Social Networking 16.4 (2013): 279–284.
  • Maslow, Abraham Harold. “A theory of human motivation.” Psychological review 50.4 (1943): 370.
  • Horney, Karen. Neurosis and Human Growth. 1950.
  • Isaranon, Yokfah. Narcissism and affirmation of the ideal self on social media in Thailand. Diss. Goldsmiths, University of London, 2016.
