Image by Oliver Hammond. Used under CC 2.0.

How the hell did we get here?

Peter Martyn
Thoughts On Journalism
11 min read · Feb 13, 2017

--

As I write this, journalism is under siege as never before.

Last November, Oxford Dictionaries announced that it had selected “post-truth” as its word of the year for 2016. Then, with the election of Donald Trump to the U.S. presidency the same month, “alt-right” entered the lexicon. And shortly after his Jan. 20, 2017, inauguration, President Trump’s adviser Kellyanne Conway coined “alternative facts” as a way to describe claims that cannot be shown to be true. Journalists, after a few days of anguish (especially at the Wall Street Journal), began calling alternative facts what they are: lies.

Now you may recall that the mid-20th century American journalist I.F. Stone was fond of saying, “All governments lie.” But today an entire movement, inextricably linked to digital media, has taken some factions’ lying far beyond the expected “spin” of torqued facts, manufactured propaganda and trial balloons. It is essentially trying to construct an alternate universe where science and evidence are smashed down with opinions masquerading as fact, and where deliberate lies are pipelined to a population that’s been inured to anti-intellectualism and taught to mistrust journalists and professors and all kinds of “book-learning.”

In December, former chess grandmaster and Russian dissident Garry Kasparov trenchantly observed that “the point of modern propaganda … is to exhaust your critical thinking, to annihilate truth.” American conservative thinker Charles Sykes warns, “The real danger [in today’s post-truth climate] is that, inundated with ‘alternative facts,’ many voters will simply shrug, asking, ‘What is truth?’ — and not wait for an answer.”

How did we get to this state of affairs?

In part, it’s a well-known story: Traditional journalism, especially print — newspapers and news magazines — had been battered by economic storms since the late 1980s. The U.S. stock market crash of 1987 left in its wake a stagnant economy that persisted through the early part of the following decade, eroding the advertising revenue that had buoyed newsrooms through most of the 20th century. With the mainstreaming of internet services in the ’90s, the goose that laid the golden eggs was cooked: first Craigslist’s free classified advertising demolished newspapers’ broad base of consumer-supported little local ads, then search engines and aggregator portals such as AOL, AltaVista, Google and Yahoo! siphoned away readers.

Circulation had been fraught for years, with ever-increasing churn[i] rates that sometimes approached 100 per cent. Readers began to play the papers by letting deeply discounted circulation deals (a below-cost rate for a month or three months) expire, then waiting for the circulation department to offer another short-term deal. It became apparent that only a small percentage of reader households — mainly those with a decades-long family history of subscribing to a particular newspaper — could be counted on to pay full price and keep renewing year after year. Free daily commuter papers flourished briefly, but their advertising base contracted as readers increasingly turned to online news sources.

Broadcast television and radio news operated with smaller newsroom staffs than the quality print dailies, which in most cities and towns did the heavy lifting of covering city halls, provincial or state legislatures and national politics, and broke the important stories that the electronic media then covered. Broadcasters suffered similar contractions as cable TV seduced viewers and advertisers.

The convergence myth

In the late ’90s and early 2000s, a series of “consolidations” took place, as large corporations bought up smaller, once-profitable media operations. They often invoked an expected “convergence dividend” to justify to their shareholders borrowing money at double-digit interest rates. The convergence dividend stemmed from the assumption that news staff were underutilized, and that reporters and editors, if better managed, could simultaneously produce both print and electronic stories. That seldom turned out to be the case — print and broadcast demand journalists with different skillsets. The expected savings didn’t materialize. But the interest payments on borrowing based on the false assumption did.

The result was ongoing staff cuts, as owners and managers tried to staunch the financial hemorrhage by laying off experienced journalists (who at the time made solid, middle-class wages). Turned out they were the very people responsible for providing stories and maintaining the quality of their owners’ key product: news.[ii]

What followed was a decade of loss: of newsroom jobs, of institutional memory, of historic newspapers themselves, of news coverage, of diversity of voices interpreting the news.

Moreover, well-publicized mistakes (the New York Times’s credulous reporting on WMD in the run-up to the invasion of Iraq and the Washington Post’s Pulitzer-winning, later exposed as fabricated, “Jimmy’s World” are just two examples) compounded citizens’ increasing cynicism about official government statements and brought a disheartening slump in public faith in mainstream news itself.[iii] Trust plummeted from a high of 72 per cent in 1976 (after Watergate) to 32 per cent in 2016.

That was grim indeed, but worse was to come.

In the years after the introduction of the iPhone in 2007 and the Android operating system the following year, the digital ecosystem lurched from desktops and laptops to handheld devices. Social media — principally Reddit, Facebook and Twitter — were embraced by North Americans of all ages, not just for keeping track of far-flung family, but in many cases as a major source of news.

Algorithms and confirmation bias

Into that bleak landscape rode computer algorithms that covertly sorted users’ news feeds and internet search results, according to what they and their friend network had previously searched out. Serendipity — the chance of seeing an unexpected news item or a contrary opinion — began to vanish. The internet turned into a vast echo chamber of confirmation bias: Most news items and search results reinforce one’s prior beliefs.

I often joke that “whatever strange idea you wake up with, you go check the internet and find it’s true.”

There is danger to society in this. One example: search “vaccination causes autism” and you’ll easily find claims that yes, it does (in spite of the fact that the falsified research results that seemed to link autism and vaccination were convincingly debunked by medical researchers, and The Lancet retracted the original paper in 2010). The result has been outbreaks in developed countries of communicable childhood diseases such as measles.

Unfortunately, decades of research have shown that correcting errors is as likely to cement people’s belief in the erroneous report as it is to change their minds about it — a tendency propagandists exploit.

Science itself seems easily discredited in the public’s mind. Perhaps many of us were turned off in high school by weak teachers of biology, chemistry and physics, and the statistics needed to understand much of those subjects. But it’s not only because science can be difficult. A more fundamental problem is that real science rarely yields cut-and-dried results. It needs patience to understand. And that’s a rare commodity in an era of small screens and diminished attention spans.

The scientific method involves researchers identifying a problem or puzzle, formulating a hypothesis that explains it, then testing that hypothesis through experiments. Finally, their experimental method and results are published so other researchers can try to duplicate the results.

If the experiments are successful, evidence for the hypothesis starts to pile up. But there are many potential pitfalls: poorly designed experiments can distort results; researchers’ unconscious biases may slant outcomes for or against the hypothesis; funding by corporations with a vested interest in the results also appears to influence outcomes. And there has been no shortage of outright frauds published in peer-reviewed journals — which we teach university students in many fields, including journalism, to regard as the gold standard of fact-based academic evidence.

Experienced science reporters try to winnow these elements to produce valid yet readable stories, without “dumbing down” the science. But news consumers don’t always appreciate the subtleties, particularly when a “next great thing” announcement appears to contradict an earlier report: such stories often elicit nothing more than a confused, WTF shrug, or, as Sykes says, a frustrated “What is truth?”

Science reporting — especially medical science reporting — therefore requires specialists, independent deep-pocket funding, and a real ability to craft stories that will be understandable to consumers yet not debase or over-simplify the science.

A torrent of anti-intellectualism

The danger today is that large swaths of the population reject virtually all news purveyed by responsible media. They also reject empirical, evidence-based science. Powerful governing bodies use that attitude to advance obscure agendas, in some cases going so far as to destroy valuable scientific data (in Canada and perhaps the United States).

This rejection of 300-year-old Enlightenment values is not new.

In 1980, biochemistry professor and sci-fi writer Isaac Asimov wrote: “There is a cult of ignorance in the United States, and there always has been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that ‘my ignorance is just as good as your knowledge.’” He added, using a pejorative that resonates today: “It may be that only 1 per cent — or less — of Americans make a stab at exercising their right to know. And if they try to do anything on that basis they are quite likely to be accused of being elitists.”

Asimov, 37 years ago, hit the nail on the head. Today, the undercurrent of anti-intellectualism in North America and Europe has become a Niagara of populist, xenophobic rhetoric and fear. While orchestrated, high-profile terror attacks are horrific, they kill far fewer people than guns or motor vehicles. But fervent populists, aided and abetted by attacks on the media’s credibility and the internet echo chamber, can bury such logic, as they tend to do with other empirically demonstrable facts that don’t play into their agendas.

In spite of this trend, a significant proportion of the population remains adamantly defiant: the French still celebrate life despite the Charlie Hebdo, Bataclan and Bastille Day attacks. Hundreds of thousands of North American women (and men) rallied in pink “pussy” hats, Jan. 21, protesting the previous day’s inauguration of the newly minted “us-against-them” U.S. president. After the vicious shootings of men praying in a Quebec City mosque, thousands turned out in Montreal and Quebec City for their funerals, and hundreds surrounded mosques in cities and towns across Canada.

Navigating uncharted territory

Journalists, though assailed by the economic and credibility factors I’ve enumerated, are heartened by the “Trump bump” in paid subscriptions, which may translate into more time and resources. Tech, too, is trying to help: Facebook now allows users to flag posts as “fake,” and if third-party fact-checkers agree, those posts may get a “Disputed” label (which, in my opinion, is a craven substitute for Fake, False or Lie). Slate has devised an extension for Google Chrome called “This is fake,” which depends on Facebook users’ flagging as well as a database “of problematic ‘news’ sources” while trying to avoid conspiracy theories — which points to the difficulty of differentiating between sincerely held opinions and deliberate lies.

I should note that news consumers are no longer satisfied with the simple “what happened” formulation that, in the century before the internet, was the mainstay of news. Today, that straightforward need to know may be satisfied by live major-media video feeds, or by on-the-spot citizen reporters — with luck, their narrow-perspective live videos and breathless audio will be vetted and verified by trained journalist-editors. But not always. So news consumers need to think critically. As indeed many do. A large percentage of those who get a lot of their news from Facebook — where news is just one component of the content flow — are reported to be skeptical.

Today, people want to know not just what happened, but why and what’s next? They want analysis — help in understanding an increasingly complex world. But providing analysis, which some consumers confuse with opinion,[iv] opens journalists to unmerited accusations of bias, of letting their opinions — or their worldview — shape how they report the facts. [Full disclosure here: my perspective is based on empirical, verifiable facts from responsible sources — they have shaped it since I was able to understand my parents’ words and sentences.]

At one time in postwar Europe, newspapers were unabashedly “left” or “right.” Anyone considering themselves a well-rounded intellectual would read several papers a day, often spanning the political spectrum, then form their own opinion about what to believe. That is sensible, if archaic in our hectic world. Today, we want instant news: perhaps a 140-character tweet is enough to make up our minds on an issue of global importance. So journalists are not only obliged to verify everything they report or retweet, and to engage and interest their users/readers/viewers, but also to be concise in their presentation so news consumers can get on with their busy lives. It’s telling (and heartening) that in some surveys, social media users say that the reputation of the news organization that originated the reporting is paramount.

The digital journalism world has a saying, “Transparency is the new objectivity.” News consumers value transparency: they deserve an explanation of where reports come from.

In a news environment where erudite consumers understand that reporters, like everyone else, have a worldview distilled from their upbringing, their culture, their experiences, users/viewers/readers understand that objectivity is an impossible ask.

So journalists must reveal how they got the story, who their sources were, how it was put together. Thoughtful news consumers will decide if they trust the reporter, and the story. Their analysis — of the story, of its sources, how its narrative is assembled — can take place in seconds; we’re all capable of rapid analysis if we trust the source. For those who want to dig deeper, though, journalists must make consumers’ research easy: Show them the sources, and how you verified them.

For journalists, it is a time of introspection, of trying to chart a course into an uncharted future. The Trump administration has identified the news media as the opposition (though perhaps right now a more potent opposition is satirical late-night television, particularly Saturday Night Live, which has seen its ratings soar). Being in opposition is not a comfortable role for journalists, at least in the U.S. They are accustomed to dealing in verifiable evidence, not disputing false news. Now they must gird for an uncertain but possibility-filled future, prepared to counter lies with facts and propaganda with clarity.

Lest we fail, remember: history is written by the victors (though historians disagree strongly, saying history is written by historians — you choose). Either way, journalists need to work diligently lest, as George Orwell wrote, “lies will pass into history.”

Footnotes

[i] “Churn” was a term used to quantify the percentage of households that let their paid subscriptions lapse in a given month. They needed to be replaced with new subscriptions in order to maintain a newspaper’s paid-circulation number, on which advertising rates were based.

[ii] News was a dual-market system: the media sold news to viewers and readers at a loss, then sold those “eyeballs” to advertisers at a profit; subscription revenue rarely covered more than distribution costs.

[iii] In my 40 years in the news biz, I saw (and made) my share of mistakes. While I examine news organizations with a critical eye, let me be clear: I have the utmost respect for the hard work shoeleather journalists put in to produce factual, verified stories.

[iv] Analysis (in journalism): a discussion or interpretation of events or trends by identified sources with a track record of expertise in the subject matter. Opinion: the author’s own interpretation of events or trends, which may draw on other experts’ opinions.


Canadian career journalist Peter Martyn is the author of a post-secondary textbook due in 2020. He has taught multimedia and print journalism since 2007.