Echo chambers: the more things change, the more they stay the same
When we were children, our parents acted as the gatekeepers for our safety and wellbeing, keeping us in a safe, protected little bubble. Willing participants in the challenge that is parenting, they made choices on our behalf: what clothes we should wear, what books we should read, and what food we should eat. And in the majority of cases, they had our very best interests at heart. With a desire for us to become healthy, happy, well-rounded humans, all the thinking and decision-making was left to them. They gave us not just what we liked and wanted (ice cream for breakfast), but what was good for us — what we needed as well (broccoli).
As time goes on, and we graduate into functioning adults, we enter into new and different kinds of bubbles. But the gatekeepers are no longer our parents. One of those bubbles is information — and the gatekeeper is an algorithm, developed by the corporation that is Facebook.
Every bubble has a different agenda
What we see (or don’t) in our newsfeeds is decided for us by this algorithm. An invisible, clever beast, it interprets more than 100,000 signals to determine what shows up, and when, without our knowledge or consultation. With every click, every like, every hover of a mouse, we are co-curating what displays in our newsfeed. The Guardian reported that “…many users — 60%, according to the best research — are completely unaware…” of this algorithm’s existence, but even if they did know, would they really care? What we see is a result of our digital behaviours — a reflection of what we want. Mark Zuckerberg was once quoted as saying,
“A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa…”
but it’s a very fine line between what we consider relevant and personalised versus the editing out of news and information that we should but don’t see.
If there’s anything that Cambridge Analytica and the power of big data, psychographics and the electoral process has taught us, it’s this: if you believe that Donald Trump makes for an inexperienced, racist, misogynist leader of the world’s most powerful country, it’s very likely that the content displayed in your newsfeed will reflect that Donald Trump makes for an inexperienced, racist, misogynist leader of the world’s most powerful country. What you see is what you believe, and when your thoughts are reflected back to you, why would you bother exploring other sources to get a balanced point of view? Or in some cases, fact-checking, to get a truthful point of view?
At a time when the Australian government has decided to put the human rights of a minority group to a vote, we’re seeing countless arguments featured in the ‘No’ campaign that have already been proven untrue in every other country that has legalised same-sex marriage. Some personal favourites: that marriage equality could subsequently lead to people marrying the Harbour Bridge, that marriage equality will help to stop political correctness ‘…in its tracks…’, and the obvious winner, that marriage equality will turn our children gay. True or not, this information has been communicated, at scale, to those predisposed to see it as fact. Confirmation bias at its best (or should I say, worst?).
Facebook’s algorithms are constantly taking our measure; choosing what we see, and making thinking redundant. Our newsfeed further confirms our biases, unconscious or otherwise, keeping us lazy, inside our bubbles, with others, just like us. If by using Facebook we’re stripped of the ability to see a balanced view and run the risk of becoming closed-minded, only willing to consume information that matches our ideals — why have we not all given it up?
Our view of the world has always been tinkered with
Perhaps we have been willing collaborators in this intrusion because it is the natural extension of what we have always been used to. In the age of mass media, we have never really been free from paternalistic oversight.
A little less than a century ago, William Randolph Hearst, loosely immortalised as Citizen Kane, became the world’s first mass media mogul. He pioneered newspapers as we know them today, reliant on the value of their engaging ‘added value content’ just as much as the utility of the ‘news’ they conveyed. Famous for making or breaking presidencies, he was well-known for having said,
“…you supply the pictures and I’ll supply the war.”
He was arguably far more powerful than any media mogul you could choose to name today. His trick? Knowing what type of ‘content’ would appeal to his audience. He had a talent for co-opting hopes, fears, neuroses and prejudices, in a way that shaped and reframed our outlook.
Facebook today has provided an expert platform for this: the data is abundant and the tech is slick, but fundamentally — the model relies on giving the people what they want (and maybe, deserve).
Facebook has got to ‘know’ us perhaps better than we know ourselves. From our ‘likes’ alone, it can predict our race, sexual orientation, relationship status and drug use. With 90% of the world’s data having been captured in the last two years, advertisers and media owners have a huge moral and ethical responsibility not to exploit these algorithms.
The real rub seems to be that in an age of hyper-personalisation, where our media consumption is a precise reflection of our attitudes and beliefs, we are not compelled to look outside our bubble. But perhaps the antidote to the bubble conundrum is the same as it has always been: something that lies in our own hands. A responsibility to seek out new information, uncomfortable opinions and opposing views, and not simply to rely on that daily edition of our newsfeed that we ourselves helped curate.
To complete the analogy: there comes a point in every child’s life where they outgrow the decision-making of their parents and break free of the bubble.
The question is, now as an adult, who is your gatekeeper?
Carly Drew, Strategy Director