NewsGeist 2019 — what we talk about when we talk about news.

Ezra Eeman
Jun 12, 2019

Athens, Greece. Birthplace of ‘demokratia’, and the place where some 200 news leaders and tech people came together last weekend for an open and frank conversation about the state of news in the online world. It was my first NewsGeist, so I had only a vague idea of what to expect of this ‘unconference’, set up by Google but with no visible branding from the big American tech giant.

Newsgeist is described as a gathering of practitioners and thinkers from the worlds of journalism, technology, and public policy who are trying to re-imagine the future of news. It is a place and moment for the industry to share projects and ideas, and tackle challenging problems together.

There is no fixed agenda, no overarching theme. Everything starts with a collective brainstorm on the opening night. During dinner, the attendees are challenged to come up with topics and the program is created on the spot. Whoever proposes a topic takes the lead in that session the next day. Rather than curating a program, NewsGeist curates the guests. This open approach works. Its secret: the clash of minds in the room and the clear Chatham House Rule that is in place: what is said can be used freely, but may not be attributed to any speaker or their affiliation.

In a way, NewsGeist is a measure of the state of news. Looking at all the suggested discussion topics, you quickly discover what keeps people awake at night and what moves the industry.

Questions and Discussions

The number one topic, which came back in a wide variety of questions and discussions, was misinformation in the news ecosystem, along with adjacent topics like trust, transparency, and the tension between freedom of speech and the need to take down hateful content. Here’s a sample:

  • How do we cover political influencing in real time on platforms?
  • What do we do when the primary source of fake news is the ruling power?
  • How might we strengthen the ties between trust and revenue?
  • Transparency in daily journalism: do audiences appreciate it? Is it worth the effort?
  • Breaking out of the bubble: How do we reach segments of society we don’t reach yet?
  • Should the media help people cope with reality, and if so, how?
  • Can we build a better hate speech detection platform together?
  • Is journalism too attached to, and corrupted by, the story?

But the question that provoked the most engaging and thorough discussions among all attendees was this one: Can free expression be managed without killing it?

With the big tech platforms in the room, the debate quickly became quite heated. The first thing that became clear to me was that the idea that big tech platforms are not concerned about the role they play in the dissemination of hate and misinformation is a myth. They do care, because it hurts their overall vision and mission. Or, if you are a cynic or a realpolitiker, their business model.

I was also surprised to see the beginnings of an understanding among publishers that it’s not necessarily useful to simply shift all the blame to the platforms. In the end, we are all in this together, and given the scale of the problem we have to collaborate if we ever want to have any impact, even if, at this stage, there is no consensus on how best to achieve this.

Here’s some of the thinking that went on in the room:

  • It all starts with defining what it is that we want to fight. We first have to separate the illegal from the legal. What is illegal should be handled by courts (some participants proposed a new kind of internet court) that can rule on clearly illegal content. Content that is legal but still harmful should be treated differently.
  • When looking at harmful content we should perhaps not try to regulate content but rather behavior, meaning that it is important to understand the intent and context behind any given piece of content. An objectionable remark from a private individual that is only intended to be seen by a few friends sits in a different category than the same message deliberately seeded on social media by activists to create maximal disruption.
  • An interesting notion that was presented was freedom of speech versus freedom of reach. While platforms might offer the former, they don’t necessarily need to offer the latter. When a piece of harmful content rapidly gains traction across a social network, platforms should have monitoring mechanisms that spot the velocity and take the content down before it reaches maximum virality (a rough sketch of what such a velocity check could look like follows this list).
  • Overall there was a sense on both sides of the discussion that self-regulation might be preferable to hard legislation. Even then there is a risk that too much cracking down on content by platforms will turn social media into a mall-like experience: safe and boring. Or worse: even less free speech.
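
To make the “freedom of reach” idea a bit more concrete, here is a minimal sketch of what such a velocity check could look like. It is purely illustrative: the sliding window, the share threshold, and the class itself are my own assumptions, not anything proposed at NewsGeist or implemented by any platform.

```python
from collections import deque
from time import time

# Illustrative numbers only; real thresholds would be tuned per network.
SHARE_THRESHOLD = 500      # shares within the window that trigger a review
WINDOW_SECONDS = 15 * 60   # 15-minute sliding window


class VelocityMonitor:
    """Tracks how fast a single piece of content is being shared."""

    def __init__(self):
        self.share_times = deque()  # timestamps of recent shares

    def record_share(self, timestamp=None):
        now = timestamp if timestamp is not None else time()
        self.share_times.append(now)
        # Drop shares that have fallen outside the sliding window.
        while self.share_times and now - self.share_times[0] > WINDOW_SECONDS:
            self.share_times.popleft()

    def needs_review(self):
        """True if the content is spreading fast enough to flag for review."""
        return len(self.share_times) >= SHARE_THRESHOLD
```

A real system would obviously be far more sophisticated, with per-network baselines, content classification and human review, but the core idea of limiting reach rather than speech boils down to a check like this.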

Some solutions were proposed to better fight misinformation and hateful content.

  • Installing independent sector-wide self-regulatory councils to oversee content governance and enforcement on platforms.
  • More open data sharing from platforms to independent third parties to investigate and research patterns and sources of misinformation.
  • Individual liability for platforms.
  • The number of people who monitor and enforce community guidelines for a platform should scale with its number of users or the number of posts/content items.
  • Exploring nudge concepts to disincentivize hate content: for example more clearly visible warning signs or authority labels on platforms.
  • Making the reverse tracking of fake content easier by timestamping content (visibly or invisibly) when it is published; a small illustration of this idea follows this list.
  • A greater emphasis on trusted content. This will need alignment between the different trust/fact-check initiatives currently out there.
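
The timestamping idea above can be illustrated in a few lines. This is a sketch under my own assumptions, not a standard or anything worked out at the conference: it simply pairs a publication timestamp with a hash of the content, which a publisher could embed visibly or invisibly to make tracing altered copies back to the original easier.

```python
import hashlib
import json
from datetime import datetime, timezone


def fingerprint(content: str, publisher: str) -> dict:
    """Create a simple publication record for a piece of content.

    Illustrative only: a real scheme would likely add a cryptographic
    signature so the timestamp itself cannot be forged.
    """
    return {
        "publisher": publisher,
        "published_at": datetime.now(timezone.utc).isoformat(),
        "sha256": hashlib.sha256(content.encode("utf-8")).hexdigest(),
    }


def matches(record: dict, content: str) -> bool:
    """Check whether a circulating copy matches the original fingerprint."""
    return hashlib.sha256(content.encode("utf-8")).hexdigest() == record["sha256"]


# Example: stamp an article at publication time, verify a copy later.
record = fingerprint("Full text of the article", publisher="example.org")
print(json.dumps(record, indent=2))
print(matches(record, "Full text of the article"))  # True for an unaltered copy
```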

In the end, I returned with a stack of post-its full of scribbles. A lot of inspiration. A lot of new connections and also a lot to digest.

Feel free to reach out to me if you want to hear more about the discussions we had or if you have a brilliant solution for any of the above questions: eeman@ebu.ch

Shapes & Ideas

Venturing in the adjacent possible; ideas about media, technology and life.

Written by Ezra Eeman

Head of EBU Digital and founder of @Journalism2ls. https://be.linkedin.com/in/ezra-eeman-8a5ba64
