Case Study: Facebook

In this series of case studies I look at widely known companies in the tech industry. In two previous posts I explored Slack and Airbnb to understand the values that underlie each company, as well as the dynamics they enforce. Take a look at the intro to the first case study if you want more insight into my approach.

Keks Ackerman
Future Sensor
10 min read · Oct 10, 2019


Social Networks

Two adaptations of George Eliot’s novel Middlemarch, which explores the complexity of social networks. On the left, Middlemarch: The Series, photograph by Rebecca Shoptaw. On the right, an illustration in the 1910 edition published by the Jenson Society.

Social networks are not a new idea. They have existed as long as humans have lived together. It is through our social networks that we grow and mature, testing out what kind of behaviour works and what doesn't. The small dramas that unravel within these webs teach us about judgement, boundaries and self-expression. However, with the birth of digitisation these networks have changed. Facebook, Instagram, Twitter, WeChat (among others) didn't introduce a totally new concept into human life, but they did drastically expand the ground on which our dramas unfold. Our thoughts, opinions, gossip, pictures, and videos are no longer seen only by those close to us; they can be accessed globally. The sphere in which we used to "test" our growth is so large that we are often removed from the consequences of our words and actions. We don't necessarily learn from our mistakes because we aren't aware of them as mistakes in the first place.

However, these companies have done something more than simply expand the network; they have also changed some of the fundamental rules of how it works, and that is what I want to understand in this post. Mainly, I want to talk about how a social network changes when it becomes social software.

But before I go into this, let’s talk about the company that started it all: Facebook.

The Impulse

Mark Zuckerberg, right, with Facebook co-founder Dustin Moskovitz, at Harvard Yard in 2004. Justine Hunt | The Boston Globe via Getty Images

Facebook's precursor was developed in 2003 by a college kid named Mark Zuckerberg, and the story of its creation is pretty wild. Unlike most companies, the original version, FaceMash, wasn't conceived as a business idea. It was a sort of game for comparing photos of girls and commenting on which one was hotter. This story, made famous by the movie The Social Network, has been discussed at great length in every corner of the internet, and though it is fascinating, I don't think there is much I need to add here.

What I will say is that the nuances of this beginning are reflected in interesting ways in the product itself, that is, in the idea of a social network. Though a social network connects us, and many would say that this is its primary function, it can accomplish this connection through many different means. Connection can be formed through love, sharing, and friendship, but also through jealousy, complaining, antagonism, and shame. So though many would say FaceMash was a misogynistic disgrace, it was also a very effective way to connect a specific group of people.

Manifestation

After it’s controversial introduction, Facebook launched as an internal social network for Harvard in 2004. Two years later, after multiple large investments and lawsuits, the platform opened to everyone.

By 2012, Facebook was the first social network to surpass 1 billion users, and today (2019) the platform has about 2.38 billion monthly users. On top of that, Facebook estimates that at least 2.1 billion people use FB or one of its affiliated apps, like Messenger, WhatsApp, or Instagram, every day. There are 7.7 billion people on the planet, which means that almost a third of the world's population is on Facebook.

This means that Facebook's user base is larger than any nation. I highlight this point to underline the power that Facebook has over what we see, how our opinions are formed, and what we buy. Facebook is not run by elected officials, and its board is not a representative body. So though its users constitute a group larger than any nation, there is no democratic network governing it. It is able to function with very little accountability.

Time Well Spent?

Photo by Djim Loic on Unsplash

Facebook (or one of its affiliates) is one of the primary places where we connect with family and friends. However, just like many earlier forms of social network, it is also where we get our news, our political ideologies, advice about how to live, conspiracy theories, humour, and anything else you can think of.

Rather than making us feel more aware and connected, however, our FB and IG feeds often leave us feeling unfulfilled and vaguely dissatisfied, as if our values are not really being met. Mark Zuckerberg said he wanted Facebook to be 'time well spent', a phrase coined by Tristan Harris and Joe Edelman in relation to the growing dissatisfaction people experience online. But in reality, Facebook has done little to align itself with the values and intentions of its users. The company has released statements about its mission and updated privacy requirements, but it hasn't fundamentally changed the software or the business model, which profits not from time well spent but from the quantity of time spent.

The Social Software

Photo by Markus Spiske on Unsplash

In his essay on the limitations of software, Edelman asks if a platform such as Facebook, which is inherently limited by its structure and design, can ever allow users to express their values. Does it enable people to learn in a social context, reflect on mistakes, test unusual modes of behaviour, or, in other words, do all of the things that make someone a thoughtful adult?

Compared to past social systems — governed by social conventions or laws — software gives less space for personal reinterpretation or disobedience. It tends to code up exactly how we are intended to interact.

Edelman suggests that social software, and Facebook in particular, struggles with the challenge of being unalterable. Though there are examples of users taking a digital service and bending it to their needs and desires (the Twitter hashtag being a good example), the very nature of software makes experimentation much more difficult. In an offline, real-world environment people can easily disobey and re-structure norms to express an inner change, or a need to try something weird and different. On Facebook, you are locked into the particular aesthetic and structure of the software. You can't graffiti your profile, or get drunk in the parking lot of Walmart's homepage.

Further, design decisions dictate the manner in which you interact with people, and this, Edelman argues, is where values come in. He talks about values as the manner in which you choose to live, as distinct from the goals you might have. For example, you might have the goal to take on more responsibility at work, and hope that this gives you more recognition, compensation, or meaningful projects. But you also want to make this change in a certain way. These ways of being can express a person's values much more clearly than the goals themselves.

On social media, the structure of the software often deprives us of determining our manner or way of being for ourselves. Thus, it can take away a very good opportunity for growth and personal reflection. Rather than consider and prioritise our own values, we are left to notice things which are put in front of us by an attention-obsessed algorithm.

The coded structure of push notifications makes it harder to prioritise a value of personal focus; the coded structure of likes makes it harder to prioritise not relying on others’ opinions; and similar structures interfere with other values, like being honest or kind to people, being thoughtful, etc.
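To make the point concrete, here is a deliberately crude sketch of an engagement-ranked feed. Nothing here reflects Facebook's actual system; the post fields, weights, and scoring function are all invented for illustration. The point is only that once likes and comments are coded into the ranking itself, a user who doesn't value those signals has no way to opt out of them.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    minutes_old: int

def engagement_rank(posts):
    """Rank posts purely by engagement signals (likes, comments, recency).

    A toy model, not Facebook's ranking system: it only illustrates how
    'likes' become a structural input that every user is subjected to.
    """
    def score(p):
        # The weights are arbitrary illustrative choices.
        return 2.0 * p.likes + 3.0 * p.comments - 0.1 * p.minutes_old
    return sorted(posts, key=score, reverse=True)

feed = [
    Post("thoughtful essay", likes=3, comments=1, minutes_old=30),
    Post("outrage bait", likes=120, comments=45, minutes_old=300),
]
ranked = engagement_rank(feed)
print(ranked[0].text)  # the high-engagement post surfaces first
```

Whatever the real weights are, a ranking of this shape cannot express a value like "I don't want others' approval to decide what I see"; that value simply has no slot in the scoring function.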

Edelman suggests that if social media platforms enabled users to express their values more fully we would have less trouble regulating fake news, bots, hate speech, and all of the other negative qualities we see perpetuated on FB. When structures are pre-determined for us, it is much easier to fall into ideological lenses because we are not aware of other options. This is not to say that the offline world avoids the problem of siloed information or heavily structured norms, but it is never absolutely fixed.

Digital Dynamics

The Metamodernism of Design, Jordan Lee

Though I have been speaking about the structure of Facebook's software, and how this limits self-expression and experimentation, Facebook is also a place where information is shared and re-configured constantly. And though the FB feed does not always positively impact our wellbeing (anthropologist Daniel Miller, for instance, provides an example where it has hugely positive effects), it has profoundly shaped our experience of social interactions.

The meme-drift, for example, shows how irony and humour work their way through the internet, often using FB or IG as a medium. This phenomenon helps express the many different ways that Facebook has contributed to the digital dynamics which now characterise the internet. As one of the earliest, and certainly the most successful, social media platforms, it has significantly shaped the way in which we interact and move through digital environments. It has formed our expectations of what social media is and can do. It has opened up a huge global community, while at the same time, due to its very ubiquity, it has confined our imagination of what a better design might look like. For example, it is hard for many people to imagine what software designed to promote values and wisdom would look like because we are so accustomed to the 'like-based' structure of Facebook, Instagram, and Twitter.

The Future

Osaka Skyline, A still from the short film Spatial Bodies, by AUJIK

At the very core of Facebook's proclaimed mission is the goal to create more connection and understanding. However, due to strategies of deep surveillance, which at this point are inseparable from Facebook's ad-based model, the company tends to succeed at something quite different.

"Instead of appealing to our higher brain functions," says Zeynep Tufekci, "they [FB's algorithms] speak to our lower brains. Many consequences follow: the proliferation of filter bubbles and a tendency to present us with more and more extremist content… which algorithms have learned leads to more viewing time."

The idea of a modern social network is to create connections across the world, and when done well this is one of the best effects of digitisation. But social networks can also create nests of extreme and increasingly crude behaviour, unchecked by the obvious humanity of another person sitting beside you. This critique of Facebook is not to suggest that digitised social spaces inherently tend toward extremism (racism, hate speech, sexism, etc.), but that a social network whose shadow goal is advertising will never succeed at a different effect.

If we spend time imagining alternate futures, even futures that are fictional or absurd, we might find that our understanding of a social space is much deeper and broader than what Facebook currently offers. We might look into social gatherings hosted through video games, or decide that we don’t want a page listing all the events happening in our city, but rather a spinning wheel that picks events based on our mood. Maybe a virtual space designed to share values and wisdom wouldn’t have comment threads but rather bubbles of sound with real people’s voices.

I don’t know what you want, or how it can be realised, but it is precisely this reflection and speculation which can help us thrive in the future. As we move into a world dominated by augmented reality and AI, we must wonder about how these technologies sit with our values. Do they disturb our values, or perhaps force them to evolve? We must also remember that though the future might be bright and shiny, with lots of cool technology that enhances our senses and allows us to forgo boring tasks, we are not there yet. We have to create the path to this future, and hopefully do so in a way that is inclusive, fair, and thoughtful.

Questions for the Future:

  • What does privacy mean to different people?
  • How can you design a software to enable value-based sharing, and provoke value-based discussion?
  • What particular features on Facebook (or other social networks) inspire feelings of anxiety, overwhelm, obsession, self-deprecation, etc.? Can we re-think these features, such as excessive comment threads or likes, to promote a healthier experience?
  • How can information silos be broken?
  • Can there be more transparency between the algorithm and the user, so that the user understands why certain information is being shown? I would suggest more flexibility in these algorithms, so that people can decide for themselves how their information is filtered.
  • How can one create a social space without imposing personal or corporate values?
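Several of these questions point in the same direction, so, as a thought experiment, here is a minimal sketch of what a transparent, user-configurable feed might look like. The signal names, weights, and data shapes are all hypothetical; the idea is simply that the user supplies the weights, and the system can show, post by post, why the ranking came out as it did.

```python
def explainable_rank(posts, weights):
    """Rank posts with user-chosen weights and return, for each post,
    a per-signal breakdown of why it scored as it did.

    Hypothetical sketch: each post carries a dict of named signals,
    and the user decides how much each signal matters.
    """
    results = []
    for post in posts:
        contributions = {name: weights.get(name, 0.0) * value
                         for name, value in post["signals"].items()}
        results.append({
            "text": post["text"],
            "score": sum(contributions.values()),
            "why": contributions,  # the transparent part
        })
    return sorted(results, key=lambda r: r["score"], reverse=True)

# A user who values closeness over virality:
my_weights = {"from_close_friend": 5.0, "predicted_engagement": 0.5}
posts = [
    {"text": "viral clip", "signals": {"predicted_engagement": 9.0}},
    {"text": "friend's update", "signals": {"from_close_friend": 1.0,
                                            "predicted_engagement": 1.0}},
]
ranked = explainable_rank(posts, my_weights)
for r in ranked:
    print(r["text"], r["why"])
```

With this user's weights, the friend's update outranks the viral clip, and the "why" breakdown makes that trade-off inspectable rather than hidden. Whether a design like this could survive an ad-based business model is, of course, exactly the open question.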

Facebook on the AQAL

As far as necessary, all rights to the images used here have been clarified with the artists or producers. For some images I have paid a small amount of money. For others I made a donation to non-profit projects in agreement with the artists. However, most of the creators agreed that their works are used here free of charge. I’d like to express my gratitude to all of them.


Keks Ackerman is a metamodern writer and entrepreneur building a systemically healthy society and economy.