Facebook No Longer Hosting News in Australia Is a Good Thing

Algorithm-driven news is a mess.

Ryan Peter
Disrupting Africa
4 min read · Feb 19, 2021

Photo by Glen Carrie on Unsplash

Will Oremus, in his article “Facebook Just Banned News in Australia. Like, All of It,” tells the story of how, in response to forthcoming laws in Australia, the social media giant will no longer allow links to news articles “down under”: anybody who posts an Australian news link to the platform will find that it simply won’t go through.

In an article I wrote for CNBC Africa recently, I called 2021 the Year of Social Media’s Reckoning (forgive the hyperlinks in that article; I don’t know why they aren’t cleaning up their opinion pieces lately). This move is another step in that reckoning, and I believe it’s a healthy one.

As most of us know by now, social media algorithms like Facebook’s are designed to keep feeding you content built around your interests. Over time, though, they can actually change and manipulate those interests. In Facebook’s case, I don’t think this is always deliberate, at least not when it comes to news. What happens is quite simple, really. An article grabs your attention; you read it; it gives you a point of view, and you’re skeptical. But because Facebook now thinks you’re interested in that topic (you clicked the link and read the thing), it serves you something similar tomorrow. Your layer of skepticism breaks down a little as you think, probably subconsciously, ‘Wait a minute, maybe there’s something to this. After all, if two people believe the same thing, maybe it’s valid?’ And besides, it’s an interesting idea: there is always the possibility that someone is out to get you. Paranoia is easy to feed.
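
To make that feedback loop concrete, here is a toy simulation of an engagement-driven feed. It is a deliberately naive sketch, not Facebook’s actual system: the topic list, the `boost` factor, and the `simulate_feed` function are all made up for illustration. The idea is simply that every click multiplies a topic’s weight, so the feed drifts further toward whatever you engaged with before.

```python
import random

# Toy model of an engagement-driven feed (illustrative only; this is
# NOT Facebook's actual algorithm). Each click nudges a topic's weight
# upward, so tomorrow's feed skews toward yesterday's clicks.

TOPICS = ["politics", "sport", "conspiracy", "science", "local news"]

def pick_article(weights):
    """Sample the next story in proportion to current interest weights."""
    return random.choices(TOPICS, weights=[weights[t] for t in TOPICS])[0]

def simulate_feed(days=30, boost=1.5):
    """Simulate `days` of clicks, reinforcing each clicked topic."""
    weights = {t: 1.0 for t in TOPICS}  # start with no preference at all
    for _ in range(days):
        topic = pick_article(weights)
        weights[topic] *= boost  # one click, and the loop feeds itself
    return weights

if __name__ == "__main__":
    for topic, w in sorted(simulate_feed().items(), key=lambda kv: -kv[1]):
        print(f"{topic:12s} {w:10.1f}")
```

Run it a few times: whichever topic happens to attract the first few clicks tends to snowball and dominate the feed, even though every topic started with equal weight. That runaway dynamic is the echo chamber in miniature.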

Do that enough times and you begin to change your mind about things, because that’s just how human nature works. Cult leaders have long known this: tell people something for long enough and they will eventually believe it.

One of the best Star Trek episodes to watch is “Chain of Command” (Star Trek: The Next Generation). In it, Captain Picard is tortured by a Cardassian who tries to make him lie about the number of lights he sees, in an effort to brainwash him and bring him round to the Cardassian point of view. It’s really great writing.

Resisting this sort of brainwashing takes strong conviction, commitment to a broader way of thinking, and a resolve not to let yourself be driven purely by your own experiences. In Picard’s case, it was obvious what the Cardassians were trying to do. At the end of the episode, however, he admits that at one point he was sure he DID see more than four lights.

However, in the case of Facebook, it’s a subtle change — and subtle brainwashing is the worst.

I’m not building a case that Facebook is some evil company; what I am arguing is that algorithm-driven news is simply not a good idea. We’ve experimented with it now and found it wanting. It’s great to get a news feed catered to your interests, but that approach has serious limits and is, in the end, proving to do more harm than good.

In fact, I would wager that the algorithm approach has not only changed journalism in terms of the way audiences read and consume content (resulting in major profit losses, good journalists not being paid appropriately, the research behind stories suffering, and bias becoming the order of the day), but has also changed the way journalists themselves approach making the news.

For a long time, editors cared more about clicks than facts. Thankfully, we seem to be moving beyond that (even if too slowly). But more than that: journalists and editors have, in some ironic way, mimicked the algorithm-generated news approach in real-life newsrooms. They have created “echo chambers” at publications where writers have to conform to very abstract (and impossible to keep up with) views or risk being axed.

Take a look at the recent Bari Weiss incident at the New York Times. It’s one example among many. How is it that the New York Times, of all publications, can no longer seem to tolerate pluralism and alternative views? You might have an opinion on Weiss and might tell me I am making unfounded generalizations about the NYT, and that it perhaps still sticks to its liberal values of yesteryear. Perhaps it does, but the slow, slippery creep of closed-mindedness is certainly there. Just as with Facebook, all it takes is time and subtlety, and soon that entire newsroom operates in fear and suspicion, where any alternative view, any opinion contrary to whatever the “establishment” of the day holds, will not be tolerated. That does not make for healthy news or healthy opinion; it makes for a very real (not conspiratorial, but proven by history) possibility of dangerous propaganda.

Back to Facebook in Australia. I, for one, think this is a good move. It might force people to get out of their echo chambers and be exposed to alternative views as they go looking for news from other sources, or literally Google a story, land on a publication’s site, and actually see everything else going on in the world. Of course, Google is in its own way complicit in the problem, but this new Facebook development is, in my view, a step in the right direction.


Ryan Peter is Editor-in-Chief and Co-Founder of Disrupting Africa, a ghostwriter and solutions journalist who helps leaders tell their story, and a spiritual writer and fiction author.