The cyborg future of truth

This is an expanded version of a speech given at the Imagine Solutions conference on Feb. 26, 2018.

Truth and trust are in a slump. Everywhere you look, there are fresh challenges to traditional creators of truth and advocates for truth: from scientists and educators to government and civic agencies, the news media and the business community.

For several years, we at Pew Research Center have paid special attention to these issues because they are so important to our society. And we have been finding evidence of a demoralized society when it comes to the prospects for truth:

· Just before the 2016 U.S. presidential election, 81% of Americans believed that supporters of Donald Trump and Hillary Clinton could not agree even on the basic facts.

· Just after the election, 64% of U.S. adults said fake news causes a great deal of confusion about basic facts tied to current events. And 23% reported having shared fake news themselves — sometimes inadvertently and sometimes knowingly.

· Relatedly, in a Knight Foundation/Gallup poll from last year, 56% said “fake news” is “a very serious threat” to U.S. democracy, while 32% said it is a “somewhat serious threat.”

· The problem has even become personal: 26% of Americans (including 46% of those ages 18 to 29) say they have had false information about themselves posted online.

· One study from Stanford University found that over 80% of today’s middle school students cannot distinguish a news story from a sponsored advertisement.

I want to be clear that when I speak of truth, I’m not trying to be grandly metaphysical. I want to focus on practical facts that are based on our shared understanding, our common reality.

Here is an example of how this kind of truth struggles to find a foothold in the midst of a horrific news event. I stress that every institution tied to truth can cite its own version of this dynamic.

Last October, Stephen Paddock broke the windows of his hotel room overlooking the Las Vegas Strip and began firing 1,100 rounds of ammunition into the crowd below. The attack ended 10 minutes later with 58 people dead and 422 wounded.

Within minutes of the shooting, discussions began on the “/pol/” channel of the 4chan online message board. (“Pol” stands for “politically incorrect.”) Users there game-planned how to make it look like the shooter was left wing. One thread was titled “CONTROL THE NARRATIVE. DON’T LET FAKE NEWS TAKE CONTROL.” (Note: Two roundups of this material were particularly helpful in reconstructing the chronology: BuzzFeed’s Ryan Broderick tweeted extensively about these efforts, and Kevin Roose wrote about them in The New York Times.)

[Image: screenshot via @broderick on Twitter]

After the local police announced that Paddock’s girlfriend, Marilou Danley, was a “person of interest,” the trolls zeroed in on another person — her ex-husband, Geary Danley, who had nothing to do with the shooting. He became a good target, though, because his Facebook profile showed that he liked Rachel Maddow from MSNBC and the left-leaning group MoveOn.org.

[Image: screenshot via @broderick on Twitter]

The claim catapulted around the “/pol/” channel and was then picked up by sympathetic websites like Gateway Pundit: Geary Danley (not Paddock) was the shooter, and his motives might be tied to his liberal point of view.

[Image: screenshot via @broderick on Twitter]

The Google news algorithm, which is primed to capture new material during breaking news events and to boost stories that draw lots of search queries, picked up the 4chan chatter and displayed it for a time in the most prominent place of its “top stories” list for searches about Geary Danley.

[Image: screenshot via Mashable]
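To see why that can happen, here is a rough, hypothetical sketch of breaking-news ranking — my own illustration, not Google’s actual formula. When freshness and search demand dominate the score, a minutes-old post from a low-credibility source can briefly outrank older, verified reporting.

```python
def breaking_news_score(story, query_volume):
    """Toy ranking score that rewards freshness and search demand.
    (Hypothetical illustration -- not Google's real algorithm.)"""
    freshness = 1.0 / (1.0 + story["age_minutes"])   # newer stories score higher
    credibility = 0.5 + story["source_weight"]       # credibility nudges, but doesn't dominate
    return query_volume * freshness * credibility

fringe_post    = {"age_minutes": 5,  "source_weight": 0.1}   # brand new, low-credibility source
verified_story = {"age_minutes": 90, "source_weight": 0.9}   # older, well-sourced report

spike = 10_000  # hypothetical surge of searches for the wrongly named man
print(breaking_news_score(fringe_post, spike))     # ~1000 -- ranks first
print(breaking_news_score(verified_story, spike))  # ~154  -- ranks below it
```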

On Facebook, for a time, the news-highlighting system featured on its “trending topics” page a false story from the Russian news site Sputnik saying ISIS had claimed credit for the shooting. And at one point, Twitter featured a similar news story from the Daily Mirror in its “top news” section for searches about Las Vegas.

[Image: screenshot via Digg]

Elsewhere on Facebook, a page was created for a fictitious group named “Melbourne Antifa,” and posts from the page falsely claiming one of its members was responsible for the shooting started to go viral. (“Antifa” is short for “anti-fascist,” a left-wing movement that opposes the alt-right.) In other words, the right-wing trolls were trying to blame their enemies for the shooting.

On YouTube, there were prominent videos falsely claiming that people being interviewed on TV were “crisis actors” hired by the government to stage fake news events designed to help left-wing causes.

[Image: screenshot via @broderick on Twitter]

The unfolding info-war over the meaning and facts of the worst mass shooting in American history was not a one-off incident, wrote New York Times reporter Kevin Roose. “Over the past few years, extremists, conspiracy theorists and government-backed propagandists have made a habit of swarming major news events, using search-optimized ‘keyword bombs’ and algorithm-friendly headlines. These organizations are skilled at reverse-engineering the ways that tech platforms parse information, and they benefit from a vast real-time amplification network that includes 4Chan and Reddit as well as Facebook, Twitter and Google.”

Clearly, we are in a period where discovering the truth about public life can fall prey to actors who want to exploit the vulnerabilities of the new information ecosystem.

We have a couple of hundred years of evidence that fealty to the facts about our shared understanding, our common reality, can yield great things. Consider the remarkable finding by the economist Amartya Sen that in the history of the world, there has never been a famine in a system with a democratic press and free elections. A central reason is that famines are a product not only of a scarcity of food, but also of restrictions on evidence and on debate about solutions.

Not only do facts matter in that way, but they also underlie justice. Facts, as Vaclav Havel argued, give power to the powerless. Facts are democratic. The rise of modern facts in the scientific revolution allowed societies to move away from reliance on the word of the church or the word of the king as the final unerring word on the truth.

Just as importantly, facts are tied to trust, and trust is what binds people and societies together. You can’t solve problems if you cannot agree on the facts of the matter.


How did we get into this mess?

The Las Vegas shooting highlights four major problems that confront truth in the new information order.

First, we live in a world of “total noise” — a notion described by the great American writer David Foster Wallace.

A key tactic for the new anti-truthers is not so much to get people to believe false information. It’s to create enough doubt that people will give up trying to find the truth and distrust the institutions trying to give them the truth. There’s a term for this concept, coined by Stanford history professor Robert Proctor: “agnotology” — a neologism combining the Greek roots agno- (“not knowing”) and -logy (“the study of”). In a world overflowing with information, the simple act of sowing doubt is a really effective way to harm social cohesion.

The second reason we are struggling with truth is tied to political polarization and the way it affects people’s judgments about civic information and about each other.

In the past generation, both parties have increasingly purified themselves ideologically. There used to be significant numbers of conservatives who considered themselves Democrats and liberals who considered themselves Republicans — and they were often instrumental in getting policies enacted.

Today, the median Republican is more conservative than 97% of Democrats; in 1994, that figure was 64%. Meanwhile, the median Democrat is more liberal than 95% of Republicans, up from 70% almost 25 years ago. There are sharply rising proportions of partisans in both parties who say that those on the other side not only have crummy policy ideas but are unworthy people. For the first time in Pew Research Center’s polling history, partisans on both sides have “very unfavorable” views of each other. And many in both parties think the ideas of their opponents are not just bad — they threaten the very well-being of the nation.

Additionally, polarization now has a personal dimension. In both camps, partisans are much more likely to feel that those in the opposition are closed-minded, immoral, dishonest and unintelligent than they were in the past. The most ideologically consistent citizens don’t want to live near those with opposing views, and some even say they don’t want to do business with them or date them.

This means the motive for partisans to engage is now not so much to establish the evidence and proceed from there. Rather, it is to advance arguments that make their team look good and the other team look despicable.

The third reason truth is getting a run for its money is that “attention economy” businesses — especially social media platforms — incentivize what political cartoonist Tim Kreider calls “outrage porn.”

The business model of social media platforms is to sell our attention to advertisers, and there is plenty of evidence that people are drawn to incendiary material. For instance, Pew Research Center has found that the Facebook posts from members of Congress that express the most indignant speech draw the most “likes,” “comments” and “shares.” Attention economy engagement metrics are more activated by emotionally galvanizing material than by factual matters or deliberations that seem to be heading somewhere.

That leads me to a new feature of the information ecosystem and the fourth factor giving truth a run for its money now. There are non-human actors that are part of the plot — bots and algorithms.

In the service of the platforms or outside actors, algorithms are helping shape our information environment. They are programmed to take signals from us about what we like and where we allocate our attention and keep dosing us with more of the same. They are attuned to our passions and interests, not to anyone’s interest in objective truth. One of the main reasons people can’t agree on the facts is that my algorithm-attuned info-streams are different from yours.
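Here is a minimal sketch of that dynamic — a hypothetical, simplified ranker of my own, not any platform’s real code. It scores items by their overlap with the topics a user has already clicked on, so each person’s past attention shapes what they see next, and two people end up with different information streams.

```python
from collections import Counter

def rank_feed(candidate_items, click_history):
    """Toy personalization: score items by overlap with topics the user
    already clicked on, best match first. (Illustrative sketch only.)"""
    taste = Counter(t for item in click_history for t in item["topics"])
    return sorted(candidate_items,
                  key=lambda item: sum(taste[t] for t in item["topics"]),
                  reverse=True)

items = [
    {"title": "Careful fact-check of the rumor", "topics": ["factcheck", "politics"]},
    {"title": "Outraged hot take on the rumor",  "topics": ["outrage", "politics"]},
]

user_a = [{"topics": ["outrage"]}, {"topics": ["outrage", "politics"]}]  # clicks on indignation
user_b = [{"topics": ["factcheck"]}]                                     # clicks on fact-checks

print([i["title"] for i in rank_feed(items, user_a)])  # hot take ranked first
print([i["title"] for i in rank_feed(items, user_b)])  # fact-check ranked first
```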

How do we get out of this mess?

A good starting point would be to consult the lessons of history.

In her book, “The Printing Press as an Agent of Change,” American scholar Elizabeth Eisenstein looked at the world after the year 1450, when the printing press made information much cheaper to produce and easier to spread.

It turns out that one of the greatest impacts of the printing press was to give new tools and new life to those who believed in folklore and the occult, those who practiced alchemy, and those who pursued witchcraft and demonology. Eisenstein says they effectively exploited the printing press for 150 years before they faded from public life.

Many generations later the scientific revolution and the Enlightenment emerged largely as a product of and a response to that messy world. Out of that, a couple of important notions about the truth and how to find it became clear:

One of the breakthrough ideas was that truth is best established by a process of induction — that is, the gathering and examining of data — rather than by a process of deduction — that is, merely thinking about stuff. (See Mary Poovey’s “A History of the Modern Fact: Problems of Knowledge in the Sciences of Wealth and Society.”)

Another breakthrough idea was that truth is best extracted from data that is gathered dispassionately, rather than by self-promoting advocates. And numbers — statistics — are an especially useful way to demonstrate dispassion. (See Theodore Porter’s “Trust in Numbers: The Pursuit of Objectivity in Science and Public Life”)

Yet another breakthrough was the notion that truth isn’t convincingly established unless there is open sharing of data and analysis, so they can be verified and tested by others. (See Karl Popper’s “The Logic of Scientific Discovery”)

Finally, truth doesn’t become terribly useful to anyone unless there are common standards around it — definitions, instrumentation, units of measurement that make sure everybody is on the same page when they talk about what the evidence is. At the center of creating truth is a big social bargain that people all talk about truth with the same language and with agreement on what they are measuring and describing. (This is also covered in Porter’s book.)

What’s the future of truth?

At Pew Research Center, we solicit predictions from thousands of experts about the future of technology, and in some recent work we got a lot of differing views about the future of truth. Generally, technologists tend to be more downcast, feeling that bad actors will always find ways to use new tools to torment others. The less-hopeful experts essentially argue that human nature is immutable, and that means many people will always be victimized because, at root, they are tribal and easily manipulated.

But those who take longer time horizons in their answers tend to be more hopeful, finding the lessons of history comforting.

For starters, they argue that we are entering the golden age of data. Unprecedented amounts of it are being generated these days, and that should mean those who pursue induction will have a field day as far into the future as anyone can imagine.

The more hopeful experts add that there are new software tools to test the new data dispassionately — that is, to minimize human biases and marginalize the most egregious fake actors in the information ecosystem.

Additionally, these experts note there is a substantial movement urging data transparency, so that others can fact-check the statistics and analysis.

These experts essentially hope for a cyborg future, where human-machine combinations take advantage of artificial intelligence mechanisms that make it harder for trolls to capture our attention with “fake news” and where the doubt mongers who practice agnotology are less a threat.

A story about the cyborg journey to truth

Technology pundit Clive Thompson provides a nice example of how cyborg truth-seeking can work. In his book “Smarter Than You Think” (pages 286–288), Thompson tells a tale about his encounter with IBM’s artificial intelligence system, Watson, before it was deployed to beat Jeopardy! champion Ken Jennings on TV. In preparation for that showdown, Watson was tested in warm-up matches.

In one case, the Jeopardy! clue was to complete the phrase, “Toto, I’ve a feeling we’re not in …” Watson’s top guess was the correct answer “we’re not in Kansas anymore.”

One of the extra things Watson does when providing answers is to produce several results, rank-ordered by the machine’s calculation of their probability of accuracy. So, after giving the correct Kansas answer, Watson then spit out a second answer to the Toto question: “Steve Porcaro.” He was the keyboardist for the pop band Toto — so it was not a totally weird answer to a Toto-based question.

But Watson’s third answer seemed ridiculous: “Jackie Chan,” the kung fu actor.
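Thompson’s description — several candidate answers, each with an estimated probability of being right — is easy to picture as a scored list. Here is a minimal, hypothetical sketch of that idea, my own illustration with made-up confidence numbers, not IBM’s Watson code:

```python
def top_answers(candidates, n=3):
    """Return the n candidates with the highest estimated probability
    of being correct, best first. (Illustrative sketch only.)"""
    return sorted(candidates, key=lambda c: c["confidence"], reverse=True)[:n]

# Made-up confidences for the "Toto, I've a feeling we're not in ..." clue.
candidates = [
    {"answer": "Steve Porcaro",  "confidence": 0.08},
    {"answer": "Kansas anymore", "confidence": 0.81},
    {"answer": "Jackie Chan",    "confidence": 0.05},
]

for c in top_answers(candidates):
    print(f'{c["answer"]}: {c["confidence"]:.0%}')
```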

Thompson wrote about the absurd answer in The New York Times, arguing that the “Jackie Chan” answer and some other glitches showed that Watson wasn’t quite a finished product. But after his story was posted, an online commenter figured out there was a Jackie Chan connection to Toto.

There is a scene in the movie Rush Hour where a street-savvy American detective is charged with babysitting a detective from Hong Kong — played by Jackie Chan — who is in the States investigating a kidnapping. In this scene, Chan is asking the American to help with his investigation.

Then there’s a scene in the sequel, Rush Hour 2 (starting at minute 1:50), where that earlier exchange is referenced as the two detectives sit in a Hong Kong karaoke bar — the joke turning on a mix-up between the names “Tito” and “Toto.”

That hilarious Tito-Toto line was stored somewhere in Watson’s memory bank — and that’s why the AI system figured out that Toto has a place in Jackie Chan’s life and why it’s not crazy to think of Chan as an answer to a Toto-based Jeopardy! question.

Maybe that’s a good model for the future of truth. Watson, the AI machine, will be an essential contributor. But it sometimes takes a human sleuth to add the final piece of evidence that nails down the truth.

Thank you.
