Facebook needs to Pop its Filter Bubbles and Optimise for Debate

Si Hammond
3 min read · Nov 29, 2020

Sometimes I stick my head in Facebook. When I did so a while ago, I was faced with this:

[Image: the share, the original source's responses, and the needed counter-post]

The abusive image was admirably rebutted, but the viral nature of the share means it effectively has to be rebutted or debunked anew for every share.

It makes me sad, but it also gives me a stinging reminder of what is wrong with Facebook, Twitter, etc. I'm going to summarise that, then naively outline an approach to fixing it.

As a piece of viral content, this is distressingly typical. Maybe you don't see this sort of thing because you have curated your friends list down to people who see the world your way. The social media platforms make it very easy to do so. In fact, they are specifically engineered to give you a very narrow view of the world that is 'engaging'. What does engagement mean? It means whatever pushes your buttons to get you to react and share. Base emotions are the way to go: fear, rage, lust and kittens do well. Truth doesn't come into it.

The social platforms generally avoid questions of truthfulness because, well, it's genuinely hard and exposes them to accusations of partiality and censorship. The false postings from outgoing president Trump have notably forced the platforms to intervene, but that is an ad hoc solution that doesn't scale and hangs on the perceived credibility of the platform and its cited sources.

How can you fix this in a scalable and credible way?

The defence against bad, dangerous ideas is — and has always been — vigorous, open debate. Different viewpoints are put together in some arena (court, parliament or presidential debate) and allowed to compete under fair rules. Rather than sending the indefensible into dark corners to find a sympathetic audience, Facebook could be keeping it in the light, under public scrutiny.

Focusing debate rather than forking it would require a structural transition on the scale of the introduction of the news feed. It would be a headache to design and engineer, and there would be inevitable pushback. Yet without it, Facebook is a system sitting on an explosive positive feedback loop when it really, really needs a stabilising negative feedback loop that converges on the truth. For its own sake and for ours.

That sounds grand but what would this look like in practice?

It turns out we don't have to look any further than Reddit or Stack Overflow for a clue. Heck, even the Daily Mail comments section has a crack at it: voting. Voting floats responses that are generally agreed to be valuable to the top, where they are more likely to be seen. Low-value or extreme responses are marked as such and sink to the bottom.
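To make the idea concrete, here is a minimal sketch of how such ranking can work. Reddit's "best" comment sort, for instance, is widely documented as using the lower bound of the Wilson score interval, which ranks a reply with 90 upvotes and 10 downvotes above one with a single upvote, where a naive up/down ratio would not. All function and field names here are hypothetical illustrations, not any platform's actual code.

```python
import math

def wilson_lower_bound(upvotes, downvotes, z=1.96):
    """Lower bound of the Wilson score confidence interval for the
    true fraction of positive votes (z=1.96 is ~95% confidence).
    Low vote counts are treated cautiously rather than trusted."""
    n = upvotes + downvotes
    if n == 0:
        return 0.0
    p = upvotes / n
    return (p + z * z / (2 * n)
            - z * math.sqrt((p * (1 - p) + z * z / (4 * n)) / n)) / (1 + z * z / n)

def rank_responses(responses):
    """Sort responses (dicts with 'text', 'up', 'down') so that broadly
    valued replies float to the top and low-value ones sink."""
    return sorted(responses,
                  key=lambda r: wilson_lower_bound(r["up"], r["down"]),
                  reverse=True)
```

The point of the confidence bound is that a handful of votes from a sympathetic corner can't outrank a response the wider crowd has already weighed in on.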

Shared posts should bring the debate with them, respectful of privacy scope. This prominent context is the crowd-sourced response from the widest possible audience, presented to give a balanced view. Like Wikipedia, where the page on a contentious topic is just the tip of the iceberg with the talk page below the waterline.
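The structural shift is that a share would *reference* the original post and its debate rather than forking a fresh, context-free copy. A toy sketch of that data model (purely illustrative names, not Facebook's architecture):

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str
    responses: list = field(default_factory=list)  # the single, shared debate thread

@dataclass
class Share:
    sharer: str
    original: Post  # a reference, not a copy

def share(post, sharer):
    """Sharing points back at the original post, so a rebuttal attached
    there appears alongside every share instead of being redone per share."""
    return Share(sharer=sharer, original=post)
```

Because every share holds a reference to the same `Post`, one well-voted rebuttal travels with all of them — the debate is focused in one place rather than forked into thousands.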

There are a whole bunch of details and extensions of this idea but I wanted to get the gist of it out in a simple message, reiterating the title of this post:

Facebook needs to burst its filter bubbles and optimise for open, rigorous debate

Who knows, this might be the most engaging experience of all.
