On the demand side of truth

As the media fights misinformation and mistrust, we should also look to the demand side of the equation by helping consumers get control of their newsfeeds, says Dan Gillmor.

Walkley Foundation
The Walkley Magazine
7 min read · Aug 29, 2018


Cartoon by David Pope.

By the end of 2016 it became clear to anyone who was paying attention that misinformation, a perpetual problem in our media ecosystem, had become a crisis. The rise of social media and its split-second posting/sharing culture had given malicious actors their most powerful platform in history.

With a few notable exceptions, traditional journalism was eroding. Fierce competition, especially from online platforms boasting enormous scale and efficiency, was undermining the business model.

The journalism craft’s longstanding flaws (including, far too often, institutional arrogance) had undermined the public’s trust. Ambitious and already powerful people around the world were doing their best to poison the public even more against journalism and the whole idea of truth. Their power and ambitions were best served through spreading confusion or outright lies, no matter what damage that did to our ability to make decisions based on reality.

No-one disagrees that a key response to this barrage of badness must be to improve our information supply. But in recent times, in my view, we’ve paid too little attention to the demand side of the equation.

Supply and demand were never entirely separate before the internet arrived. Now, given the ubiquity of creative tools and access to information — combined with big data, algorithmic targeting, artificial intelligence, filter bubbles and simple sharing — they are more deeply intertwined.

In this environment, even as we upgrade our information sources, we have to upgrade ourselves — as users of media who consume, create, share, and collaborate in our endlessly complex ecosystem. And we have to find ways to do this at scale — reaching as many people as possible to help them, above all, to be critical thinkers who use media with integrity.

People call this many things, including news literacy, media literacy, and other names. But it’s rooted in civics and especially critical thinking, a skill that we need to instil in absolutely everyone, starting at a young age, and then reinforce throughout our lives. It needs to be as close to universal as possible — we need massive scale, as the tech crowd would call it.

We can get to scale with the help of three kinds of institutions.

  • Schools and community institutions like libraries, with direct support from governments
  • Media organisations, especially the news media
  • Technology platforms including Facebook, Google, and Twitter

Despite some very useful one-offs and individual programs, we don’t remotely have scale in any of these areas. This represents collective civic failure, not just institutional setbacks.

Let’s look at each of those sectors in a bit more detail.

As far as I can tell, relatively few education systems around the world have made media/news literacy a priority. In the United States, the federal government has all but ignored it, and only a few US states have made it a priority. Media/news literacy training does exist, but it is not systematic or pervasive. Moreover, there are rivalries among the people and organisations pushing the related digital literacies (media, news, information, etc.), which are deeply intertwined but not identical. Funding has been low, though it is gaining strength now that the emergency is widely recognised. Still, it is telling that the major US media-literacy umbrella group, the National Association for Media Literacy Education, is so starved for resources that it meets only every other year.

Meanwhile, journalists have all but ignored what I believe should have been a core mission for the past half-century or more: being community leaders in this arena. For reasons I cannot fathom, news organisations have typically said their work speaks for itself, with no further explanation — a stance that has greatly contributed to the distrust they’ve earned from the public.

News organisations could do an enormous amount of good by employing, among other things, more transparency and conversation and collaboration with the people and institutions in the communities they serve. My work these days includes a collaboration with newsrooms that we hope will help them engender more trust in what they do for the right reasons — namely sound, useful, and community-driven journalism. We’re collecting what we consider best practices in these areas, and working with small organisations that promote them, to seed newsroom projects that — we hope and expect — will raise the communities’ “news awareness”, as we’re calling it.

Cartoon by Jon Kudelka.

The media are larger than journalism, of course. Entertainment, public relations, and advertising have featured fiction and manipulation as part of their business models, so perhaps it is futile to ask them to be part of a movement to restore trust. But maybe not. Companies are realising that a world in which no one trusts anything is a world that jeopardises even established brands. There are discussions underway in the US to create a wide-ranging public-interest advertising campaign encouraging social media users to be more careful about what they share. This and other initiatives could make a real difference.

Who could make the biggest difference of all in the short run? The technology platforms, which almost define the word scale at this point. They need to empower their users — people like you and me at the edges of the network of networks — to take at least some control of our own information flows and data.

The major internet platform companies have, for the most part, been less than eager to help us. There are, from their perspectives, good reasons for this reluctance. But it’s time for them to do it anyway.

Facebook, in particular, is facing a nearly perfect storm of anger and frustration from users and governments. One cause is the collection and use of users’ data by the company and third parties Facebook has invited into its data/financial ecosystem. Another related cause is its centralised control of what its users can see in their “news” timelines and advertising displays — and the abuse of the platform by third parties that have taken advantage of what look like lax controls.

These are related for several reasons, not least the fact that Facebook has become the dominant conversation space online, and one of the dominant online advertising companies of our times. How Facebook manages its ecosystem has therefore become a more-than-legitimate issue for its users, and society as a whole.

But the same issues afflict platforms like Google and its YouTube subsidiary, and Twitter, though in different ways. Google search and YouTube recommendations, like Facebook's newsfeed, are black boxes from the outside: programmer-designed algorithms that create filter bubbles and are frequently gamed by malicious actors. Twitter, likewise, has demonstrated an ongoing inability to police its ecosystem to filter out the garbage (or worse) that so often makes the experience untenable for some users.

That’s also the rub. The platforms should not be — and have said they don’t want to be — the internet’s content police. They don’t want to decide what’s true or false. But they have enormous, even unprecedented, power. This has led too many misguided people to call on the platforms to be what amounts to the editors of the internet. Why are people assuming that the solution lies in the corporate policies, and programmers’ decisions, inside exceedingly centralised organisations? If you want censorship to be the rule, not the exception, that’s one way to get it.

Who should be making the decisions about what we see online? We should. But we need better tools to do it.

We should insist that the platforms provide dashboards giving us more control, both of our data and our online experience. Let us tweak the algorithms. Let us decide what material we want in our news feeds, and do so at a granular level.

We need options such as deciding what, if anything, should be filtered and how. We need ways to alter the curated feeds. We need the ability to do community-driven curation of our own. We need ways to break out of the “filter bubbles” the platforms’ algorithms create (on our behalf, to be sure).

I’m not a fan of government regulation — and absolutely oppose government control of speech — but if the platforms decline to give users more choices, perhaps governments will need to persuade them otherwise. (Disclosure: Facebook is one of the funders of the recently established Arizona State University News Co/Lab, which I co-founded and direct.)

Despite our huge problems, I remain optimistic given the efforts I’m seeing to address misinformation. But even if everyone now working in this arena succeeds, that won’t be enough without much more focus on the problem by many more players.

We’ve taken the first steps. But until we collectively realise that this is a demand issue, not just a supply issue, we’ll guarantee failure.

Before we think about taking whacks at free speech in the name of improving our information supply, let’s first improve the demand.

Dan Gillmor is director of the News Co/Lab and professor of practice at Arizona State University’s Walter Cronkite School of Journalism and Mass Communication. A journalist for almost 25 years with a variety of newspapers in the United States, Dan was an early participant in the news industry’s digital transformation. He is the author of Mediactive (2009). Twitter: @dangillmor

David Pope is a Walkley-winning editorial cartoonist for The Canberra Times. Twitter: @davpope

Jon Kudelka is a cartoonist for The Australian. Prints of his work are available at kudelka.com.au. Twitter: @jonkudelka


The Walkley Foundation champions the highest standards of journalism in Australia through our awards, events and magazine – join the conversation!