As AI takes over, how extreme are the risks to journalism?

By: Seamus Bozeman

Samuel Valencia
Spotlight
7 min read · Jun 24, 2024


Illustration of a robot hand holding items relevant to journalists, by Dante Estrada

On the eve of the Writers Guild strike in spring 2023, John Horn, an entertainment correspondent with NPR’s show 1A, gathered a list of questions to ask screenwriter and director Nicole Holofcener in an interview.

“You seem to make movies a lot about personal loss. Why is this an important thing to you?” he asked.

“Your films often deal with complex relationships and human emotions. The film has a beautiful and evocative score. How did you work with the composer to create the right tone and mood for the film?”

The only catch was that every single one of these questions had been written by an artificial intelligence (AI), prompted by Horn and read aloud by him. Holofcener said many were so good that they fooled her.

The AI that wrote the questions will never be able to drive to a source’s house, get out of the car, and place a tape recorder on the coffee table to record the interview it wrote the questions for. That raw human element would be lost if AI fully took over the jobs of journalists. Horn also points out that while an AI can write very good questions, it cannot have a “hunch” about what the story will be, decide what the next question should be, or even pick the best parts of an interview to use in a given story.

Before ChatGPT and similar services came out, industries including journalism faced far less of a threat of being taken over. Now the threat of AI, coupled with shrinking advertising revenue, hedge-fund ownership, and mass layoffs, has placed increasing strain on the future of journalism and on how much of a role humans will play in it.

“To think about media history in general, every new technology creates this sort of panic, sometimes called a moral panic,” said Elizabeth Blakey, a cultural sociologist and lawyer who teaches media history at Cal State Northridge. “It really means it’s just a fear of the new. So there’s some people that think AI is going to change everything for the better and some people think that it’s going to drive us off a cliff.”

The implementation of AI in journalism and its ever-increasing influence will make journalists and the work they do even more important, according to a report from Nieman Lab in 2023.

“Journalists will become even more essential to society as AI enters the mainstream, where we will help set standards, track potential abuses, and bring our ethics and standards to the technology,” the report said. “And AI will surely shake the world in ways we can’t yet imagine.”

AI’s increasing prevalence in journalism may also feed continued distrust in the media. According to Horn, it may not worsen that mistrust, but it certainly won’t make it any better.

“So in a way it’s not that they’re independent of bias,” Horn said. “They’re just incorporating those biases into the way they think and ask questions or write stories. But to me the real danger is not human choices. There’s something to be said about how a human mind makes decisions and weighs facts [and] opinions [and] is able to come up generally with an objective way of telling the story. You don’t know what biases an AI program might have so the possibility for accelerating mistrust in journalism is certainly present. I don’t know if it accelerates it, but it certainly doesn’t stop it.”

In an experiment during our interview, Horn typed in the prompt “questions for student Seamus Bozeman.” The chatbot spat out a version of a letter of recommendation I’d received from a professor just a few months earlier, showing how far AI’s large language models (which predict and generate the text the bots spit out) and their analysis of the internet have come with each new version and competing model.

“One of the most striking aspects of Seamus’ approach to learning is his willingness to challenge conventional wisdom and explore new ideas,” said The Good AI, a competitor to ChatGPT. “Whether it be in the classroom or during extracurricular activities, Seamus is always eager to engage with complex concepts and push the boundaries of his own understanding.”

Further research turned up little about the backstory of The Good AI or about what information from the internet it uses to train its internal models.

With access to nearly every corner of the internet, AI can do the same for almost anyone with publicly available information for it to feed on. Several publications, including The New York Times, have sued OpenAI for using their sites to train its models without compensation.

In journalism, AI has tried and failed to write stories that could fool human readers. Many of the articles published by Gannett and later removed in summer 2023 were full of easily spotted errors and rudimentary sentences based on box scores. The first sentence of one of many examples read: “The Worthington Christian [[WINNING_TEAM_MASCOT]] defeated the Westerville North [[LOSING_TEAM_MASCOT]] 2–1 in an Ohio boys soccer game on Saturday.”

Another limit, and a danger, of outsourcing sports coverage to AI writers as Gannett did is that the real elements and emotions of sports are lost, according to Blakey.

“You realize it’s probably written by a machine and you might think, oh it’s easy to automate local sports,” Blakey said. “But local sports is so rich with culture and personality that you’d want to have a reporter who knows the players, who knows the rivalries, who knows the meaning behind things. It’s not just the scores.”

Another notable example, according to Horn, came when Sports Illustrated, a once “revered” publication in sports, posted stories very clearly written by bots under fake bylines. In one case, the author “Drew Ortiz” did not appear to have any publishing history, and the profile picture appeared to come from a site that sells AI-generated headshots, according to reporting from Futurism.

The real challenge comes later, as AI develops further and corrects the mistakes seen in earlier versions. As the models improve and continue to learn from human data and from their own errors, it will become harder to teach AI literacy in journalism and other fields, because the ability to tell human writing from AI writing will rapidly decline, according to Horn.

“Right now you start teaching it, you can spot it if you read closely enough,” Horn said. “But that’s going to be fleeting. And in a couple of years, you can teach all the AI literacy you want and it may be too smart for anybody to detect it.”

The Associated Press (AP) has used AI since 2014 to write quarterly financial reports for publicly traded companies. In 2023, the AP expanded its use of AI to transcription services, harvesting and understanding social media trends, automating headline generation, and other background tasks.

Like the AP, several news organizations across the U.S. and Europe have begun to adopt AI regulations in their own newsrooms, ranging from allowing AI only for very specific uses to banning it outright, according to a report from Nieman Lab.

However, a Muck Rack report on the state of journalism in 2024 said that journalists “are slow to embrace generative AI” and that 31 percent do not intend to use the technology at all in their reporting. The 48 percent who are using or plan to use AI in their work cited several ways the technology is useful.

“Of the journalists already using or planning to explore the technology, most want to use it for tasks like brainstorming and research assistance,” the report said. “Writing any public-facing copy is a less popular use case.”

By handling background tasks like research assistance, drafting outlines, headline drafts, and early versions of story copy, the uses Muck Rack highlighted in its report, AI assistance allows reporters to focus on the human-oriented work of journalism: interviews, radio, podcasts, and giving stories a human element.

“If AI can save enough money to keep a newspaper publishing or a radio station on the air, that’s not a bad thing,” Horn said. “But is it a good thing? If the AI tool has checks and balances, perhaps. But what if the checks and balances person was laid off? Or there’s simply not enough time? I guess part of me would rather have a newspaper go out of business or a radio station shut down rather than keeping going with untrustworthy content.”

The same Muck Rack report said that almost 60 percent of the newsrooms polled did not have an AI use policy. Forty-five percent of those newsrooms said they would not explore policies to rein in the use of AI and would leave it up to reporters whether to incorporate it into their everyday reporting.

“If I were in a newsroom I would be more concerned about job loss than the perils of AI,” Horn said. “But, we’re kind of in the early stages of just starting to understand what AI can do.”

In one of several efforts to boost collaboration between AI and local news, OpenAI gave more than $5 million to the American Journalism Project (AJP) in July 2023 to “strengthen” cooperation between AI technologies and local media.

Despite the ways AI could improve journalism and bring positive change to a flailing industry, its dangers remain visible and will likely grow, not only by removing humans from the reporting process but by leaving some stories unwritten altogether.

“Our understanding of the world is through interactions, through questioning people, observing things, coming to our own conclusions,” Horn said. “And that’s not only what journalists do, it is what humans do. And if you remove the human part of that, from the journalist or kind of learning question, I think you’re in a really bad place.”
