Five ways AI can help journalism

Starting from misinformation and media literacy

Martina Andretta
Hacks/Hackers London
Dec 13, 2019 · 5 min read


Jarno M. Koponen, Head of AI and Personalisation at Yle Uutiset News Lab, during his talk at Hacks/Hackers London (Photo by A. Mascellino)

Many newsrooms and news agencies employ some degree of automation in newsgathering, production or distribution. Some organisations use tools to mine data, produce reports and transcribe speech to text, making their work easier. Although some people in the industry are still reluctant to adopt AI, worried about being made redundant by an algorithm that writes articles and even poems, new technologies offer real possibilities for journalists.

As Polis director Prof. Charlie Beckett says, “the robots are not going to take our jobs; sadly, I don’t think AI is going to somehow save journalism either”.

Increase journalists’ credibility

On November 18, LSE media think tank Polis and the Google News Initiative published a global survey on how 71 newsrooms across 32 countries use AI. Hacks/Hackers London hosted the presentation of the survey’s results, together with a discussion on what role AI will play in the future of journalism.

Beckett said AI could help journalists prove their value in the era of “fake news”:

“We all know that algorithms, AI, technologies are actually quite responsible for the accelerated spread of so-called ‘fake news’ and disinformation, and yet we also know that the same technologies can be used in quite specific ways — such as automated fact-checking — to counter it.”

— Charlie Beckett, director of Polis at LSE

“Generally, what is interesting is how throughout the newsroom AI can help build the credibility of journalism and journalists,” he said.

Polis’ Journalism AI report includes several ways in which journalists are building trust with their audiences, such as claim detection or bias identification.

One of the respondents to the study explains: “We use machine learning to help us separate claims from other sentences. This helps our fact-checkers in deciding what to check each day. Robochecking is automatically checking claims against databases of information. We use the Office for National Statistics data to check claims in real-time. For example, if someone says ‘Employment has fallen by 10 per cent since 2016’, our system finds the correct numbers and generates a graph, and answer, in real-time.”
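Neither the quote nor the report goes into implementation detail, but the claim-detection step it describes is, at heart, a sentence classifier. A minimal, hypothetical sketch in Python, with made-up training sentences standing in for a real labelled dataset, could look like this:

```python
# Minimal, hypothetical sketch of the claim-detection step described above:
# a binary sentence classifier that separates checkable claims from other
# sentences. The training examples and labels are illustrative placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# 1 = sentence contains a checkable factual claim, 0 = it does not.
sentences = [
    "Employment has fallen by 10 per cent since 2016",
    "Crime rose by 5 per cent last year",
    "I believe we should all work together",
    "What a wonderful day this has been",
]
labels = [1, 1, 0, 0]

# TF-IDF features plus logistic regression: a simple, transparent baseline.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(sentences, labels)

# Flag new sentences that look like checkable claims for human fact-checkers.
new_sentences = ["Unemployment is at its lowest level since 1975", "We must do better"]
for sentence, is_claim in zip(new_sentences, model.predict(new_sentences)):
    print(f"{'CLAIM' if is_claim else 'other'}: {sentence}")
```

A production system would of course need far more training data, a stronger model and the downstream “robochecking” step that matches each flagged claim against official statistics.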

Watch Charlie Beckett’s talk at Hacks/Hackers London

Raise public awareness

Journalists could use AI to develop tools that show how misinformation works. An example of this is Troll Factory, a game by the Finnish national broadcaster Yle, which focuses on how misinformation is spread via social media.

Troll Factory shows how “fake news” is used to influence opinions, explains Jarno M. Koponen, Head of AI and Personalisation at Yle Uutiset News Lab: “It promotes digital media literacy by putting you in the centre of the action.”

“You become an internet troll. In a game-like environment, you can start to spread fake news, conspiracy theories, use botnets, targeted advertising, and malicious memes to do this,” he added.

Koponen says that “the game itself is just the tip of the iceberg”, as the team behind it decided to use real-life examples of content that is being spread over the internet.

The results are tangible: 70% of those who start the game also complete it, and users spend an average of six minutes playing. The game was created to show people how misinformation spreads, and it was built by “bringing journalists, designers and data scientists into the same room”.

“If we do not truly redefine our core business, redefine who we are, we cannot really have a say in a world that is affected by AI and machine learning”

— Jarno M. Koponen, Head of AI and Personalisation at Yle Uutiset News Lab

However, he warns that machine learning must always be treated as a means, not an end: “It’s not AI-first, it’s journalism-first.”

If given the right tools, journalists could come up with interactive, creative solutions that show, in the simplest possible way, why misinformation is a problem for society as a whole and why newsrooms need the public’s support.

Watch Jarno M. Koponen’s talk at Hacks/Hackers London

Broaden journalists’ skill set

Introducing AI in newsrooms means introducing new tools, and the skill set they require is quite different from that of traditional journalists. Many reporters haven’t been trained in coding or in working with data.

Knowledge and expertise are lacking: “You cannot clean data if you don’t know what the data is about […], or what is the angle of the data that is relevant, or that we all should care [about],” said Claudia Quinonez, Head of News Automation at Bloomberg, during a panel at Hacks/Hackers London.

Unsurprisingly, respondents to the report identified the biggest challenges to adopting AI as a lack of resources and a lack of knowledge or skills.

Watch the panel “AI and Future of News” at Hacks/Hackers London

News personalisation

As the Journalism AI report points out, news personalisation can reinforce confirmation bias, the tendency to engage with content that supports one’s existing beliefs or theories.

Many respondents to the report highlighted this concern, warning that algorithms may end up doing a disservice to the audience: “There are currently many cases of bias in journalism that come with the use of AI,” one anonymous respondent said. “Poorly trained algorithms may do more harm to journalism.”

However, the report also notes that, when applied to news personalisation, “well-managed AI can help counter misinformation, providing a balanced mix of sources and stories”.
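The report does not spell out what “a balanced mix of sources” would look like in practice. One simple, hypothetical interpretation is a re-ranking step that interleaves a user’s recommended stories so that no single outlet dominates the top of the feed; the balance_by_source function below is an illustrative sketch, not a method taken from the report:

```python
# One possible reading of "a balanced mix of sources": interleave a ranked
# recommendation list so that no single outlet dominates the top of a feed.
# Illustrative sketch only; not a method described in the Journalism AI report.
from collections import defaultdict
from itertools import zip_longest

def balance_by_source(ranked_articles):
    """Round-robin across sources while preserving each source's own ranking.

    ranked_articles: list of (title, source) tuples, best-ranked first.
    """
    by_source = defaultdict(list)
    for article in ranked_articles:
        by_source[article[1]].append(article)

    balanced = []
    for round_items in zip_longest(*by_source.values()):
        balanced.extend(item for item in round_items if item is not None)
    return balanced

feed = [
    ("Story A", "Outlet 1"),
    ("Story B", "Outlet 1"),
    ("Story C", "Outlet 1"),
    ("Story D", "Outlet 2"),
    ("Story E", "Outlet 3"),
]
print(balance_by_source(feed))
# [('Story A', 'Outlet 1'), ('Story D', 'Outlet 2'), ('Story E', 'Outlet 3'),
#  ('Story B', 'Outlet 1'), ('Story C', 'Outlet 1')]
```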

For instance, looking beyond the organisations that took part in the survey, Nobias, a New York-based startup, says it uses AI to track political bias in online content. Its browser extension tracks bias at the article level, provides information on the credibility of the source, and analyses the bias of outbound links on the page.
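Nobias has not published how its extension works under the hood, so the sketch below is only a rough illustration of the outbound-link step: it collects the external links on an article page and looks each linked domain up in a made-up ratings table. SOURCE_RATINGS and outbound_link_report are hypothetical names, not part of any real API:

```python
# Hypothetical illustration of analysing outbound links: gather the external
# links on an article page and look their domains up in a (made-up) table of
# source ratings. Nothing here reflects Nobias's actual implementation.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

# Placeholder ratings keyed by domain; a real system would rely on a large,
# curated dataset and trained models rather than a lookup table.
SOURCE_RATINGS = {
    "example.com": {"bias": "centre", "credibility": "high"},
    "example.org": {"bias": "left", "credibility": "medium"},
}

class LinkCollector(HTMLParser):
    """Gather the href of every <a> tag on the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def outbound_link_report(article_url: str) -> dict:
    """Return the rating (if known) for each external domain linked from the page."""
    html = urlopen(article_url).read().decode("utf-8", errors="ignore")
    parser = LinkCollector()
    parser.feed(html)

    article_domain = urlparse(article_url).netloc
    report = {}
    for href in parser.links:
        domain = urlparse(urljoin(article_url, href)).netloc
        if domain and domain != article_domain:
            report[domain] = SOURCE_RATINGS.get(
                domain, {"bias": "unknown", "credibility": "unknown"}
            )
    return report

if __name__ == "__main__":
    print(outbound_link_report("https://example.com/"))
```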

Last year, Google updated its Newsstand app to deliver readers a mix of “personalised” news and editorially selected stories of the day.

Social media verification

Full Fact, a UK independent fact-checking charity and one of the third-party organisations involved in Facebook’s initiative to fight disinformation, is employing machine-learning-powered tools for automated fact-checking.

For the 2019 UK election and the 2020 US presidential election, Facebook has set up a dedicated operations centre “to monitor activity, remove fake accounts and reduce the reach of articles that have been debunked by fact-checkers”.

***

Want to join Hacks/Hackers London? Check our website or subscribe to our newsletter for future events and info about our guest speakers. You can also watch the talks from previous meetups on our YouTube channel and follow us on Twitter and LinkedIn.

Are you interested in volunteering and writing for Hacks/Hackers London? Get in touch here.

If you want to know more about Journalism AI, do not hesitate to get in touch with Mattia Peretti at M.Peretti@lse.ac.uk

Martina Andretta, Hacks/Hackers London
Journalist and news producer interested in digital storytelling, politics and innovation. https://www.linkedin.com/in/martinaandretta/