The Augmented Newsroom: How will AI impact the journalism we know?

Personalisation, trust, hope, and ethical standards: Can AI in newsrooms augment journalism? Or will the pitfalls outweigh the benefits? A discussion about the augmented newsroom with Reginald Chua, Lisa Gibbs, and Mar Gonzalez-Franco, moderated by Ben Rudolph.


‘How is AI being used in newsrooms today?’, Rudolph asked to kick off the discussion.

Chua drew on his experience at Reuters to say that, for the moment, AI is mostly used to increase the efficiency and breadth with which journalists can do their jobs. AI tools allow journalists to find interesting stories, sample data, and generate stories from that data. This is becoming increasingly important, going beyond mere efficiency and speed, Chua said.

Either way, ‘we are still in an early stage’, said Gibbs. ‘We still have a long way to go to get rid of routine tasks and make room for other, more creative tasks’.

‘Some people will use AI to reduce newsrooms’, Reginald Chua said, and while that’s a risk, ‘others will use AI to develop entirely new types of reporting, such as personalisation.’

Gonzalez-Franco, a researcher on perception and cognition, highlighted the theme of personalisation. ‘The best thing is to deliver personalisation’, she said. ‘And we need a lot of computing power for that. Correlations across very different and equally massive sets of data demand a supercomputer.’

Democratising AI in newsrooms

For Gibbs, despite all the ways technology can improve journalists’ workflows, getting those technologies into smaller newsrooms remains a challenge.

‘Associated Press has been producing automated journalism around financial earnings and sports scores for four years. And that is still what people question us most about! We have a long way to go’, Gibbs said.

AI technologies are being democratised, says Reginald Chua

According to Chua, technology is becoming democratised. ‘Think about graphics! These days there are all sorts of tools out there to produce graphics in the newsroom. That will continue. The tools will come to people.’

For Gonzalez-Franco, identifying who is consuming an outlet’s content is probably the biggest challenge AI can overcome.

‘We will be able to identify precisely who is consuming our content. This will happen to all organisations. One consequence is that ads will be better targeted, but content can also become more meaningful for each user. That’s going to be a big deal for news organisations.’

But are we sure we can trust automated systems?

The question of whether we can trust automated systems to produce reliable and unbiased work was the elephant in the room. For Chua, we are still in the early stages of these technologies. We need to raise the standards and explain the methodologies, disclosing any potential biases, so that people are comfortable with the outcome. But, of course, ‘we should also worry that these technologies can be weaponised to produce misinformation. After all, these are tools that anybody can use.’

Verification is a challenge and an arms race, according to Lisa Gibbs

Gibbs agreed: ‘Verification is a challenge, a sort of arms race between those who want to use these technologies to augment the newsroom and those who want to take advantage of them to advance their own agenda.’

Image and video creation tools, in particular, are becoming very scary. At AP, Gibbs and her team developed a tool for use on social media to stop misinformation at an early stage. But that is just one tool, and others like it are needed.

‘The banking industry runs on trust’, Gonzalez-Franco pointed out. ‘Using large amounts of human behaviour data has been key to achieving better results on this issue. This has already happened in other industries. And it will explode.’

‘There is always a flip side’, Chua concurred. ‘Governments and corporations will know more about you, and that can be scary.’

The ethical side of the issue

Explaining how these technologies work and making them transparent seems to be crucial. ‘This is an evolving area and human journalists will have to do a lot of explaining on how they are using these technologies,’ said Gibbs. Chua agreed, ‘Still today, Photoshop generates a certain amount of mistrust in newsrooms. It’s the same now with AI. If one of these systems produces, for example, a credibility score between 75 and 85 percent for a given story, a journalist does not need to know statistics to understand that result. We will reach a point of probabilistic truth. People will have to get used to that’.

Technologies will continue to empower journalists, says Mar Gonzalez-Franco

‘I think we should talk more about how we use these tools when we talk about predictive results’, said Gibbs. Gonzalez-Franco agreed that computational correlations that enable predictions about human behaviour are inevitable. But predictions can be wrong, and that is a big danger for journalists. The same issue exists with content generation: computers can generate content that looks real even when it isn’t. Using tools to verify that is going to become increasingly difficult.

‘At this stage we can use these tools to make two different sports stories, one from the winner’s side, another from the loser’s side. But should we do that?’, Gibbs challenged the audience. ‘This is the kind of question we have to address’, she concluded.

‘How should journalists better serve the public, that’s what we should ask’, said Chua. ‘Journalism is also a product of the available technology. This is the time and place to think about these fundamental issues. How will the technology of journalism evolve with all these tools that are emerging?’

A glimmer of hope

In the end, is there hope to counterbalance the risks associated with the adoption of AI in newsrooms? The session ended on a positive note. Gonzalez-Franco was optimistic: ‘Technologies will continue to empower everybody, and journalists too. For every newsroom, big or small. It will have a big impact on the personalisation of news but also on the business of ads. We are all going to be more empowered, I think.’

Gibbs also argued that technology will allow journalists to do better and richer work in newsrooms. ‘There is a direct correlation between automating some news coverage and being able to improve the investigative journalism happening in newsrooms. There are multiple examples of that. I am confident that, as these innovations scale, journalists will be able to do more journalism.’

‘Of course there are people who will misuse these technologies, and therefore we have to be vigilant’, Chua concluded. ‘But there is potential to do so much more. Historically, journalism has failed to adequately serve some audiences and sectors of society. Technology will allow journalism to reach underserved communities and audiences that could not be served by the mass media.’

Technology is neutral. What is not neutral are the uses and misuses it can trigger. This discussion about the pros and cons of AI in the newsroom reinforces that idea. The tools are there (and will continue to be developed) to allow journalists to do a better job, whether by eliminating inefficiencies or by augmenting capabilities. That work should be done, even if some actors exploit those tools for more dubious ends. It is an arms race, to use an expression repeated frequently in this session.