Credit: VRT NWS

10 things about AI every newsroom should know

Preparing your newsroom for the artificial intelligence revolution

10 min read · Dec 7, 2018


Not a day goes by without a new headline promising that artificial intelligence will either find a cure for cancer, steal our jobs or perhaps eradicate the whole planet. Still, many journalists have little knowledge of what AI actually means, let alone what it can do for a newsroom. Is the sense of optimism or fear correct, or is it misplaced?

The proposal I'm working on as a John S. Knight Journalism Fellow at Stanford University is to leverage AI to detect deepfake video. Part of my strategy was to spend the first quarter of my fellowship 'demystifying' AI. I took a hands-on approach by joining classes at the Computer Science department. When I heard that I would be able to build an artificial deep neural network by the end of one of the courses, I felt a shiver of excitement down my spine.

First steps in AI at Stanford University

AI in its most basic form is a system that makes autonomous decisions, performing tasks that mimic acts of human intelligence like solving problems, understanding language or recognizing sounds and objects. To perform their specific tasks, computers need a set of step-by-step instructions that tell them what to do: an algorithm. And data is the basic ingredient of every AI system, because data is what feeds the algorithm.

Most journalists are probably not aware, but AI is already affecting the news value chain, from news gathering to content processing and distribution.

Together with other JSK Fellows interested in AI, I've created a working group to explore the possibilities and challenges of AI in newsrooms and to learn about specific tools. We've invited several experts to the "JSK Garage," our seminar room at Stanford. Below are some takeaways from my first months of research.

1. AI is already automating the newsroom

By using machine intelligence in the newsroom, we can increase the efficiency of our creation processes. Since journalists have to serve different audiences on different platforms, it's not unusual to hear them complain about the repetitive, monotonous tasks they have to perform. AI can take many of these tasks off their hands.

A lot of media companies are already using Natural Language Generation (NLG), for example, to turn structured data into a written story that is often indistinguishable from one written by a human author. The Washington Post is using Heliograf and China’s Toutiao has developed Xiaomingbot.

Other news organizations like Bloomberg and the Associated Press are using Wordsmith by Automated Insights, a tool that mines data and is capable of producing reports about sports results and company earnings. AI-generated smart templates help them produce this "commodity news" on a much larger scale than before: the AP, for example, now publishes roughly 12 times as many automated earnings stories as its reporters used to write by hand.
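
To make the idea of "smart templates" concrete, here is a minimal sketch of how structured data can be turned into a readable sentence. The data fields, figures and phrasing are invented for illustration; real systems such as Wordsmith, Heliograf or Xiaomingbot are far more sophisticated.

```python
# Minimal sketch of template-based news generation from structured data.
# The company, figures and wording are invented; real NLG systems use far
# richer templates, variation and editorial rules.

earnings = {
    "company": "Acme Corp",
    "quarter": "Q3",
    "revenue_musd": 120.4,
    "revenue_change_pct": 8.2,
}

def earnings_story(d):
    # Choose a verb depending on whether revenue went up or down.
    direction = "rose" if d["revenue_change_pct"] >= 0 else "fell"
    return (
        f"{d['company']} reported {d['quarter']} revenue of "
        f"${d['revenue_musd']:.1f} million, which {direction} "
        f"{abs(d['revenue_change_pct']):.1f}% compared with a year earlier."
    )

print(earnings_story(earnings))
# Acme Corp reported Q3 revenue of $120.4 million, which rose 8.2% compared with a year earlier.
```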

2. AI is augmenting journalists

There is a big concern that using machine intelligence will demote the role of journalists to that of database managers. Experiments and implementations of AI in large news organizations like the AP show the opposite is true, freeing up as much as 20 percent of journalists' time. This way, they can focus on content and spend more time on their core expertise: reporting.

Many examples of augmentation are already available. Speech-to-text transcription tools like Trint or Recordly can make our lives much easier by saving us the hours we would otherwise spend transcribing interview recordings. Companies like Clarifai or Vidrovr use computer vision to automatically recognize the content of photos, tag it and find similar concepts, speeding up the workflow of image editors.
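
As a rough illustration of what such automatic tagging looks like under the hood, the sketch below runs a generic pretrained image-classification model over a photo and keeps the confident labels as tags. This is not Clarifai's or Vidrovr's actual API; the model choice and file name are assumptions made for the example.

```python
# Sketch of automatic photo tagging with a generic pretrained model.
# Not the actual Clarifai or Vidrovr APIs; purely illustrative.
from transformers import pipeline  # pip install transformers torch pillow

# Load an off-the-shelf image-classification model.
tagger = pipeline("image-classification", model="google/vit-base-patch16-224")

# "press_conference.jpg" is a hypothetical local photo from the archive.
predictions = tagger("press_conference.jpg")

# Keep only reasonably confident labels as searchable tags.
tags = [p["label"] for p in predictions if p["score"] > 0.2]
print(tags)
```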

Cortico, a nonprofit working with the MIT Media Lab, is building a system that brings under-heard community voices, perspectives and stories to the center of a healthier public dialogue. Journalists can use it to produce stories that highlight common ground between citizens with differing political views. Another example of how AI can help us do more inclusive and diverse reporting comes from The Financial Times, which uses a bot that warns the newsroom when, for example, it has been quoting too many male voices.

Demonstration of Graphext in the JSK Garage

The production of video news packages can also be automated by AI. The Israeli company Wibbitz has created a text-to-video platform that uses image recognition to automatically match footage to a text, speeding up the rough cuts that human editors can later refine. Again, this tool augments rather than replaces traditional video production. Tools like Newswhip or Graphext can help us find news topics by applying machine learning to social media data. Reuters' News Tracer analyzes tweets in real time and has already surfaced several stories before other media reported them.

3. AI is creating new forms of investigative reporting

AI can be leveraged to power new forms of reporting by analyzing data and identifying patterns at a much larger scale than ever before. In a corruption investigation, for example, it can take investigative journalists days, months, or even years to sift through millions of documents.

With Natural Language Processing (NLP), they can comb through those documents and discover the entities in the data and the relationships among them. NLP effectively plays the role of a magnifying glass, revealing things that are invisible to the naked eye. This way, AI can surface stories that have not been told before.
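
To give a flavour of how this works in practice, here is a minimal sketch using the open-source spaCy library to pull people and organizations out of a small set of documents and count which ones appear together, a crude proxy for a relationship. The documents are invented, and a real investigation would involve far more cleaning, deduplication and human judgment.

```python
# Sketch of entity extraction for document-heavy investigations with spaCy.
# The documents are invented; a real leak would contain millions of files.
from collections import Counter
from itertools import combinations

import spacy  # pip install spacy; python -m spacy download en_core_web_sm

nlp = spacy.load("en_core_web_sm")

documents = [
    "Acme Holdings wired two million dollars to a shell company owned by John Doe.",
    "John Doe met officials from Acme Holdings in Geneva last March.",
]

entity_counts = Counter()
co_occurrence = Counter()

for text in documents:
    doc = nlp(text)
    # Keep the people and organizations mentioned in this document.
    names = {ent.text for ent in doc.ents if ent.label_ in ("PERSON", "ORG")}
    entity_counts.update(names)
    # Entities appearing in the same document hint at a relationship.
    co_occurrence.update(combinations(sorted(names), 2))

print(entity_counts.most_common(5))
print(co_occurrence.most_common(5))
```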

AI itself could also become the subject of new reporting. As algorithms increasingly influence our lives, investigating them will help us surface new stories.

News organizations such as ProPublica have been reporting for years on algorithms and how they make decisions that affect our lives, and the newly founded The Markup is taking up the same beat.

Journalists have already uncovered bias and discrimination in algorithms that calculate car insurance premiums, screen candidates for job openings, or even help sentence criminal defendants. Algorithmic accountability reporting will become a new and important branch of investigative reporting.

4. AI is helping to verify and fact-check

As trust in news has been eroded by disinformation and misinformation distributed on social media, AI can offer a solution to detect and diminish it. Journalists can leverage AI for fact checking and analysis of text, pictures and video.

Think of Chequeabot, a tool made in Argentina that automatically finds fact-checkable claims in news text and sends them to the newsroom. Similar automated fact-checking applications are Claimbuster and Factmata. The UK-based start-up Logically uses a machine-learning algorithm to combat fake news.
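
As a rough illustration of what "finding fact-checkable claims" means, the toy sketch below flags sentences containing figures or percentages as candidates for a human fact-checker. It is a deliberately naive heuristic, not how Chequeabot, Claimbuster or Factmata actually work; those systems rely on trained models.

```python
# Toy heuristic for spotting candidate fact-checkable claims in a text.
# Real tools such as Chequeabot or Claimbuster use trained models, not this rule.
import re

def candidate_claims(text):
    # Split the text into sentences (very naively).
    sentences = re.split(r"(?<=[.!?])\s+", text)
    # Sentences with figures or percentages often carry checkable claims.
    pattern = re.compile(r"\d[\d,.]*\s*(%|percent|million|billion)?", re.I)
    return [s for s in sentences if pattern.search(s)]

speech = (
    "Unemployment fell to 3.7 percent last year. "
    "Our plan is the best plan ever. "
    "We invested 2 billion dollars in new schools."
)

for claim in candidate_claims(speech):
    print("CHECK:", claim)
```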

The field has come a long way, but it still faces significant difficulties. The verification of video is an especially big challenge.

There is a sort of arms race between those who deploy AI to manipulate video for their own agenda and those who use it to detect and expose such manipulation. Promising tools like Amber, Serelay and Truepic, presented during a videoconference in our JSK Garage, can help journalists authenticate photos and videos.

5. AI is creating a personalized user experience

AI can help serve content tailored to each user, depending on who they are, where they are, or what mood they're in. By creating countless versions of an article, it can help newsrooms develop a more engaged audience.

It can also improve the quality of dialogue with our audiences. Chatbots, like those created by Quartz Bot Studio, allow users to text questions about news events, people or places. They immediately receive an answer back.

The Wall Street Journal's R&D chief Francesco Marconi showed us how his newsroom is using several interesting AI tools to personalize the user experience. A "listening box" developed with MIT, for example, captures what the audience has to say and helps journalists stay more in tune with their readers.

Presentation of NewsChat at Stanford University

Together with computer science students in our Exploring Computational Journalism class, I've been developing a more natural speech-interaction skill for Amazon's Alexa that helps users navigate the news in a more personalized way. As Alexa's synthetic voice becomes more natural in the future, interacting with the news will become more pleasant, which should increase news consumption. Read more about this project here.

6. AI should be transparent to build trust

It's essential that journalists and news consumers are always told whether a story was authored by a machine or by a human. Newsrooms must be open and honest about this. To work properly, AI systems rely on our trust. And to build trust in AI, it's crucial that it is transparent and can explain how it made its decisions.

The data itself should also be transparent. AI is not always able to explain why and how it came up with a certain result. There is a growing concern that AI systems are becoming too secretive, making decisions inside a "black box" that gives us no clue how they were reached.

Skepticism is an essential trait in every journalist, and it will become even more important when we leverage AI.

Just as verifying the reliability of a news source has become second nature, we'll have to verify the reliability of any smart machine system. No doubt there will be hiccups when introducing AI, but constant monitoring can diminish mistakes.

To build more trust in AI, journalists and users should be allowed to experiment with the technology, for example by adjusting the parameters of an algorithm to see how the results change. The Wall Street Journal is already testing this as a way to audit AI.
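
A hedged sketch of what such an experiment could look like: a toy recommendation score with a single tunable weight, where turning the "recency versus popularity" dial visibly reshuffles which story comes out on top. The scoring formula and the stories are invented for illustration and are not the Journal's actual system.

```python
# Toy audit of a ranking algorithm: adjust one parameter and watch the output change.
# The scoring formula and the stories are invented for illustration only.

stories = [
    {"title": "City budget vote",     "hours_old": 2,  "clicks": 120},
    {"title": "Celebrity scandal",    "hours_old": 30, "clicks": 9000},
    {"title": "Local school closure", "hours_old": 5,  "clicks": 300},
]

def rank(items, recency_weight):
    # A higher recency_weight favours fresh stories over heavily clicked ones.
    def score(s):
        freshness = 1 / (1 + s["hours_old"])
        popularity = s["clicks"] / 10000
        return recency_weight * freshness + (1 - recency_weight) * popularity
    return sorted(items, key=score, reverse=True)

for w in (0.1, 0.9):
    top = rank(stories, recency_weight=w)[0]["title"]
    print(f"recency_weight={w}: top story is '{top}'")
# recency_weight=0.1: top story is 'Celebrity scandal'
# recency_weight=0.9: top story is 'City budget vote'
```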

7. AI is creating ethical implications

AI will not change the fundamentals of journalism, I believe, but it cannot succeed in a newsroom environment if we don't apply the same journalistic and editorial standards that have guided us for decades.

Algorithms are prone to bias and can make mistakes, just like the humans who created them. But unlike humans, algorithms cannot be held legally accountable. Therefore, it is important that human accountability is embedded in all stages of the content value chain.

As newsrooms implement AI, I’m convinced they should install an ethics board that can develop an ethical checklist for using AI in a newsroom environment.

8. AI is redefining copyright rules

Copyright laws are being challenged by the rise of AI in the newsroom. AI learns from human-created expressive works like articles or videos, each with specific rights owners, in order to create its own output. This can create new conflicts, as it may violate the copyright of traditional media outlets. The legality of such content will become an important issue.

The San Francisco and London-based startup Knowhere uses machine learning to write “unbiased” coverage of news stories. The algorithm looks at hundreds of news sources before writing its own “impartial” version of the story. When presenting the tool in the JSK Garage last November, co-founder and editor-in-chief Nathaniel Barling acknowledged that this copyright issue has to be addressed.

9. AI is forcing journalists to refocus and retrain

Because of the rise of AI in newsrooms, the role of journalists will change. We will become more like gatekeepers. But before we can do this, newsroom management will need to invest more in training its journalists. That is the only way we can become more comfortable operating in a technology-driven environment.

The more we, journalists, know about AI, the more effectively we can use it. Embracing change will help us reap the benefits. Lifelong learning and reeducation will become more important than ever.

There will also be a need to foster closer collaboration between news organizations and the technology industry. Jeff Bezos, founder of Amazon and owner of The Washington Post, understood this right from the start. He hired dozens of data scientists to work side by side with journalists in the newsroom. Collaboration is crucial in an AI-assisted newsroom. Journalists should be encouraged to experiment and share best practices with colleagues, even when failing.

AI will also create new jobs, such as automation editors, who will be needed to maintain and supervise the systems, watch for errors and correct them. Academic training of new journalists will involve teaching more coding skills.

10. AI is not the way to cut costs and jobs

In an era of turbulent news markets, there is a fear that some senior newsroom managers might look at AI as a way to cut operating costs and work with fewer journalists. That may be tempting, but I'm afraid they would be making a big mistake.

As AI is integrated into the reporting process, it will require new investments — not just in technology, but also in people.

Journalists should play an even bigger role in putting the pieces together, verifying facts and adding a human touch to stories. Without journalists as gatekeepers with strong news judgment, AI will be doomed to fail.

What’s next?

AI is already changing journalism in ways few people could have predicted a while ago. Having an arsenal of AI-powered tools at our disposal can save time and empower us as journalists. AI is not the new holy grail, however.

We shouldn’t idealize the future of AI in news organizations, nor deny that there will be difficulties in implementing this new technology. Great, innovative ideas are often greeted with skepticism. Change is hard.

Unfortunately, as news organizations grapple with budget cuts and struggling business models, they have been too slow to invest in AI. Meanwhile, tech giants like Google, Facebook, Microsoft and Apple are investing massively in AI startups. News organizations must start developing their own AI strategy sooner rather than later. If not, they risk being swept aside while the tech giants take further control of the information flow.

As AI becomes the backbone of the future newsroom, I believe we can gain a lot from collaborating and learning from each other's best practices. If your newsroom is already implementing AI, I would love to hear from you. Reach me on Twitter @tomvandeweghe or send me a note at tom.vandeweghe@stanford.edu.


John S. Knight Fellow Stanford University | Research AI, deepfake & Blockchain | Foreign correspondent VRT | Former Bureau Chief USA & China | author | speaker