Facebook has created billions of comfort cocoons

Algorithms That Surprise Us

How can we rethink algorithms to produce more diversity?

Published in Startup Grind · June 8, 2016

By Hossein Derakhshan

A few months after the Paris attacks by Isis terrorists, Sheryl Sandberg, Facebook’s chief operating officer, sat down on a warmly lit gray and blue stage in a conference hall in snow-covered Davos. The Swiss ski resort is now more famous for its annual gathering of men (and some women) in business suits than in ski boots: it is where the World Economic Forum is held every year.

On a panel titled ‘The transformation of tomorrow’, she joined a few other business leaders to discuss the hot topic of the day: how can Facebook and other social networks deprive Daesh/Isis of their vast reach for spreading ideology and recruiting new members?

Twitter and Facebook have faced an uphill battle in shutting down the thousands of accounts belonging to Isis and its sympathizers. At Davos, Sandberg admitted the futility of trying to keep up; she did, however, come up with an interesting idea: like-attacks.

She drew on a recent campaign in Germany in which the Facebook pages of a racist, far-right political party, the National Democratic Party (NPD), were targeted by activists and users, who were encouraged to challenge racist and hateful content with critical posts and comments, and to like them.

“What was a page filled with hatred and intolerance was then tolerance and messages of hope. Counter-speech to the speech that is perpetuating hate we think by far is the best answer,” said Sandberg.

A smart answer, but is she right? Can a different behaviour by users of social networks weaken extremists?

A few weeks later, New York Times columnist Thomas L. Friedman raised a similar issue with a different touch. Exploring the reasons for the ultimate failure of the once-hyped ‘Arab Spring’, he asked if “social media is better at breaking things than at making things?”

Friedman then quoted Wael Ghonim, an Egyptian online activist who played a major role in the uprising that led to the fall of Mubarak:

“First, we don’t know how to deal with rumours. Rumours that confirm people’s biases are now believed and spread among millions of people.” Second, “we tend to only communicate with people that we agree with, and thanks to social media, we can mute, un-follow and block everybody else.” Third, “online discussions quickly descend into angry mobs. … It’s as if we forget that the people behind screens are actually real people and not just avatars.”

“And fourth, it became really hard to change our opinions. Because of the speed and brevity of social media, we are forced to jump to conclusions and write sharp opinions in 140 characters about complex world affairs. And once we do that, it lives forever on the internet.”

Fifth, and most crucial, he said, “today, our social media experiences are designed in a way that favors broadcasting over engagements, posts over discussions, shallow comments over deep conversations. … It’s as if we agreed that we are here to talk at each other instead of talking with each other.”

Friedman and Ghonim are onto something different from Sandberg. Their question, while valid and helpful, is about the nature of social networks, as if there were only one way of organizing the massive amount of collectively produced information on them.

Influence behind the scenes

The debate around social media’s influence on its users was taken to the next level in early May, when former members of Facebook’s trending-news team revealed to Gizmodo that they had manipulated the trending topics module to suppress conservative news. They said they sometimes removed trending topics, and at other times injected stories that were not trending at all.

This set conservatives on fire. The Senate Commerce Committee wrote a protest letter to Facebook. Mark Zuckerberg quickly invited seventeen senior conservative figures, including radio host Glenn Beck, American Enterprise Institute president Arthur Brooks, Tea Party Patriots CEO Jenny Beth Martin, Media Research Center president Brent Bozell, Dana Perino of Fox News and senior Donald Trump campaign aide Barry Bennett, to reassure them of his commitment to political neutrality. The meeting was so important to him that when it ran over time, the young billionaire postponed attending to the affairs of his gigantic digital empire.

The outcome of the usual one-week news cycle was a series of articles and opinion pieces filled with the word ‘algorithm’, a concept named after the medieval Iranian mathematician al-Khwarizmi, who lived at the height of Islamic civilization.

An algorithm is essentially an attempt to simulate the way we think the human mind works: a step-by-step rational decision-making process which produces a certain, simple output from numerous irrational, orderless inputs. Algorithms are a way we try to make our chaotic world tangibly simple, and at the same time to cut the costs of an expensive human workforce.

The discussion then turned to algorithms and neutrality. There are those, like Zuckerberg, who think Facebook’s output is neutral because it is produced by algorithms. And there are others, like Zeynep Tufekci, a US-based Turkish sociologist who studies the mutual impact of technology and society, who think algorithms carry an ingrained bias by nature, because they are created by biased human beings and “they optimize output to parameters the company chooses, crucially, under conditions also shaped by the company.”

The truth is that social networks’ algorithms are filters which say ‘yes’ to some of the stuff we share and ‘no’ to the rest, and these filters follow certain rules. There are, therefore, theoretically as many types of algorithms as there are rules.
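
To make that concrete, here is a minimal sketch in Python of what ‘a filter that follows rules’ means. Every field, threshold and rule in it is invented purely for illustration; it has nothing to do with Facebook’s actual, secret code:

```python
# A feed algorithm reduced to its essence: a rule that says yes or no.
# All fields and thresholds here are invented for illustration only.

def passes_filter(post, min_likes=10):
    """Return True if the filter's rule lets this post through."""
    return post["likes"] >= min_likes

posts = [
    {"text": "A challenging essay", "likes": 3},
    {"text": "A viral meme", "likes": 250},
]

feed = [p for p in posts if passes_filter(p)]
print(feed)  # only the viral meme survives; change the rule, get a different feed
```

Change the rule and a different world surfaces in the feed; the filter itself is indifferent to which rule it enforces.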

What is curiously missing from the general debate is this question: is it possible to imagine other types of algorithms, ones that produce balanced news, challenge radicals and foster productive debate? My answer is a big yes.

Rules follow values. To imagine different rules (and thereby different algorithms), we should first uncover the values on which the dominant Facebook algorithms are based, and then think of alternatives.

The age of consumption

Born and raised in the Middle East, I have always been struck on my visits to Europe or North America by how much the culture is dominated by images of young people and celebrities. Everywhere you look, from print to television to billboards and advertising, images of fresh young faces with sparkling eyes remind anyone above forty-five (yes, I set the bar higher so I don’t feel old myself) that they are soon expiring.

I’ve always wondered how depressing it would be to walk in the streets of New York City or London or Brussels — and sadly many other vibrant cities around the world including Tehran — when I’m sixty-something, and feel like an alien. Would I not rather flee to residential suburbs where I’m not constantly treated as too old and too useless?

The two dominant values of the age of consumption, youth and fame, are translated into the new internet as newness and popularity.

Aside from prioritizing native content for business purposes, Facebook’s algorithms mostly favour what is new and what is popular. If what you post is not freshly produced, or does not receive a certain number of likes and reshares within a certain time (according to the secret rules Facebook sets), it will not be treated as worthy of other people’s gaze. And in this day and age, as Donald Trump knows very well, gaze is money.
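
Purely as an illustration (Facebook’s real rules are secret), a feed built around newness and popularity might score posts roughly like this; the decay window, weights and field names below are all invented:

```python
import time

# A hypothetical newness-and-popularity score. The 24-hour decay window
# and the weights are invented; Facebook's real parameters are secret.

def score(post, now=None, half_life_hours=24.0):
    now = now if now is not None else time.time()
    age_hours = (now - post["created_at"]) / 3600.0
    freshness = 0.5 ** (age_hours / half_life_hours)   # halves with each day of age
    popularity = post["likes"] + 2 * post["reshares"]  # reshares count double
    return freshness * popularity

def rank_feed(posts):
    """Newest, most-liked posts first; old or unpopular ones sink out of sight."""
    return sorted(posts, key=score, reverse=True)
```

Under any scheme of this shape, a post that is neither fresh nor popular decays toward invisibility, whatever its substance.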

This is quite different from the pre-social internet, when audiences were faceless, anonymous masses whose most active engagement was the occasional comment. That was an era in which browsing the internet was a non-linear experience, thanks to a concept called the link. Links provided a break from linear movement across time and space. They were always capable of surprising us. We all started from a favourite blog or website and, several clicks away, ended up landing in alien places.

Archives used to be commonplace, letting us quickly reach posts from a different age. Even search had a vastly different dynamic, because most of the content produced was in words rather than images or videos. How can you find a sentence you heard, or a scene you watched, in a video from last year?

Now, with the new values and rules that guide algorithms, we are increasingly forced to remain in one space and be fed what we already agree with. Facebook’s current algorithms want the entire world to remain inside it all the time and to consume what they most enjoy. Facebook doesn’t want you to be upset or challenged or surprised, because that is not good for business.

That is the core of the problem everyone is trying to articulate. It takes a UN-based diplomat like Ambassador Samantha Power, whose very job involves constant engagement with people around the world, to realise the dangers of isolation.

Comfort cocoons

With current algorithms, the views and beliefs of radical religious and nationalist groups, like those of all of us, are being reinforced in the personal comfort cocoons Zuckerberg has made for billions of people. Common causes quickly become sources of bitter division once they are expected to posit a vision rather than to negate one.

And all this at a time when the West needs to face the awkward consequences of centuries of colonialism and decades of self-serving interventions. Ironically, our growing fear of the ‘Others’ passing our borders has coincided with our rising need to be comforted in our silk cocoons. Old habits die hard, but thanks to social media, nobody wants to kill them anymore.

There are many things that can be done. On a personal level, we should disrupt the current algorithms, confusing their sense of what we want, so that they give us more diversity. We can start liking what we actually dislike (or disagree with) in our News Feeds. We can follow people or communities we dislike. We can challenge majorities.

On a corporate level, we need to push Facebook and others to give us a choice between different kinds of algorithms stemming from different values. How about a challenge-oriented News Feed instead of a comfort-centred one? A minority-driven one rather than a majority-oriented one? A text-privileged News Feed instead of a video-saturated one? A news-centred one instead of a social-centred one?
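
In code terms, that choice is nothing more exotic than letting the user pick the scoring function. A hypothetical sketch, in which every mode name, field and rule is invented (no such Facebook setting exists):

```python
# Hypothetical value-driven feed modes: the platform exposes several
# scoring functions and the user, not the company, chooses one.

def comfort_score(post, user):
    # Reward agreement: what the user already likes ranks first.
    return post["affinity"][user]

def challenge_score(post, user):
    # Reward disagreement: the least familiar views rank first.
    return -post["affinity"][user]

def minority_score(post, user):
    # Reward scarcity: the rarer a viewpoint, the higher it ranks.
    return 1.0 / (1.0 + post["viewpoint_frequency"])

FEED_MODES = {
    "comfort": comfort_score,
    "challenge": challenge_score,
    "minority": minority_score,
}

def build_feed(posts, user, mode="comfort"):
    """Rank posts by whichever values the user has opted into."""
    return sorted(posts, key=lambda p: FEED_MODES[mode](p, user), reverse=True)
```

The point of the sketch is that the ranking machinery stays identical; only the values plugged into it change.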

This is all possible; in fact, Facebook has quietly tried such variations before. In 2014 it tested News Feeds filled with negative, saddening items on a fraction of users to see how this affected their mood. It proved all too effective. Wasn’t the market economy supposed to be about maximizing consumer options?

On a legislative level, states should start regulating algorithms, since they are the first steps toward that old science-fiction fear: human-made creatures that slip out of control and come back to haunt us.

Algorithms should cease to be treated as industry secrets. They now control what we buy, who we date, what we read and even what we think. They exert more power over many aspects of our lives than states do.

On the other hand, Facebook’s dream of providing internet access is, in the words of media scholar Dan Gillmor, an attempt at providing a utility. And which state allows an unregulated utility?

The problem is not the nature of social media, but their present architecture. To create a different world, we need different values, rules, and ultimately, different algorithms.

Hossein Derakhshan (@h0d3r) is an Iranian-Canadian author, freelance journalist and media analyst. He is the author of “The Web We Have to Save” (Matter) and the creator of “Link-age”, a collaborative art project to promote hyperlinks and the open web. This article was originally published in International Business Times UK in June 2016.
