How do we measure the "real-world" impact of journalism?
Powerful online tools such as Google Analytics and Chartbeat have made it easy to know in detail what people are viewing on a given website, how many of them there are and how long they stay.
This has been invaluable to news organizations, as measuring the online audience with metrics such as page views and “Facebook shares” can inform editorial decisions and help to sell advertising.
But for the growing number of nonprofit news websites, analytics that show only views and referrals are proving inadequate. To make the case for funding through donations, grants and crowdsourcing, organizations such as ProPublica and The Center for Investigative Reporting (CIR) need to prove that their work is having an impact in the real world, making people’s lives better.
Luckily, new tools to measure qualitative impact are starting to emerge. Chalkbeat’s MORI, the Media Impact Project Measurement System from USC Annenberg and the open-source NewsLynx are just some of the efforts in this field that have appeared in the past couple of years.
They all work in different ways, but share some common traits. Most track mentions, social comments and shares, but also related news around a reported piece — like a tailored Google Alert that follows a story for days, or even months.
When set up well, the alerts will show when a different media outlet links to a story and furthers the investigation, or when a new law addressing the reported issue is approved. In a sense, these tools track follow-ups, or make them easier to do. What they all do is combine different signals of impact. This video from USC Annenberg puts it nicely:
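The signal-combining approach these tools share can be sketched in a few lines. The field names and weights below are illustrative assumptions for the sake of the example, not taken from MORI, NewsLynx or any actual tool:

```python
from dataclasses import dataclass

@dataclass
class ImpactSignals:
    """Signals an impact tracker might collect for one story (hypothetical fields)."""
    media_citations: int   # other outlets linking to or furthering the story
    social_shares: int     # shares and comments across social platforms
    policy_mentions: int   # e.g. a new law or regulation addressing the issue

def combined_score(s: ImpactSignals) -> float:
    """Combine signals into one rough number.
    The weights are purely illustrative: a policy change counts for far more
    than a share, a media citation sits somewhere in between."""
    return 1.0 * s.social_shares + 5.0 * s.media_citations + 25.0 * s.policy_mentions

story = ImpactSignals(media_citations=3, social_shares=120, policy_mentions=1)
print(combined_score(story))  # 160.0
```

In practice, as the article notes, choosing which signals to track and how to weigh them is exactly the editorial judgment these tools cannot automate.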
This is promising. But after talking with people involved in the development and trying my hand at some of these tools, it’s safe to say that these qualitative-analytics solutions are clearly in their early stages.
While some functions of these new metrics tools are automatic, for now they all need a great deal of editorial oversight to set the correct keywords and metrics, as well as to type new information into never-ending fields, cells and spreadsheets.
This labor-intensive way to deal with impact tracking can be seen as a bug. But maybe it’s a feature.
Take MORI, which stands for Measure of Our Reporting Impact. Chalkbeat’s solution works directly inside the content management system (CMS) of the education-focused news organization. MORI is a WordPress plug-in that, at a basic level, creates additional fields for a reporter to fill in before and after publishing, such as the story’s theme and intended audience:
(You can learn how MORI works in detail here)
After the article is published, the reporter is encouraged to go back and record what “informed actions” resulted from the story and whether any “civic deliberations” related to it have taken place.
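A minimal sketch of the kind of record MORI asks reporters to maintain might look like this. The field names are paraphrased from the description above, not MORI’s actual schema, and the example story is hypothetical:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class StoryImpactRecord:
    """MORI-style fields a reporter fills in around a story.
    Names paraphrase the article's description; this is not MORI's real schema."""
    headline: str
    theme: str                  # filled in before publishing
    intended_audience: str      # filled in before publishing
    informed_actions: List[str] = field(default_factory=list)      # added after publishing
    civic_deliberations: List[str] = field(default_factory=list)   # added after publishing

# Before publishing, the reporter states theme and audience:
record = StoryImpactRecord(
    headline="District weighs new reading curriculum",  # hypothetical story
    theme="curriculum",
    intended_audience="parents",
)

# Weeks later, the reporter logs a follow-up:
record.informed_actions.append("Parent brought the article to a school board meeting")
print(len(record.informed_actions))  # 1
```

The point of the sketch is the workflow: the before-publication fields force the reporter to think about audience and impact up front, while the after-publication lists only grow if someone keeps checking back.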
“Philosophically, MORI wasn’t hard to sell internally,” says Anika Anand, Chalkbeat’s director of product. “Reporters work here because they are mission-driven. They want to know if their work matters.”
Logistically, there are still challenges, as reporters need to set aside time to add new information to a story and to do follow-ups periodically. But Anand says that thinking about the desired audience and future impact beforehand makes reporters change their approach to stories.
She tells me that a couple of weeks after the tool was launched a reporter asked: “Is this [pitch] MORI-able? Is there impact?” A good sign of a technological solution helping to positively shape behavior.
It is important to note that Chalkbeat is “not advocating for anything”, says Anand. The goal is to provide quality information to foster debates and better decision-making around education issues. So when a parent prints a Chalkbeat article and brings it to a school meeting (a real, recent example), that goal is accomplished.
But to track that type of impact, a reporter needs to keep up a conversation with the community, checking back from time to time. Creating additional fields in the CMS helps, of course, but everything must be aligned: the newsroom culture, the mission-driven organization and reporters genuinely connected with stakeholders in their respective communities.
“The problem is that (mostly) non-profit newsrooms are required to measure ‘impact’. And that means different things to different people.”
That phrase was on the first slide of NewsLynx’s presentation. The comprehensive impact-measuring tool was created by Tow Fellows Michael Keller and Brian Abelson, and made available in early June. In their paper, as well as in many other articles on the subject, there is an enormous effort to define what “impact” is.
For NewsLynx, which concentrates mostly on investigative newsrooms, the signals of impact involve concrete actions, usually from the government, like a corrupt official being fired or new regulations being passed. But what about the nonprofits that just want to “raise awareness” and change the conversation on a given issue?
“Just because a law is passed, it doesn’t mean it is the end. Might just be the beginning,” said Blair Hickman, audience editor at The Marshall Project, during the NewsLynx launch.
“We have a two-fold goal,” she says. “To generate and amplify the conversation around the criminal justice system. And to generate a journalism that has an impact. The audience for those two things can be very different.”
Notice that Hickman separates the “impact” and “generate the conversation” parts of the outcome. That may be because it’s challenging to measure both, at the same time, with the same tools. Or it may be because, well, we don’t know how to read people’s minds.
“A lot of our work is explanatory, so what we care about is what’s happening in readers’ brains,” said Sarah Maslin Nir, the New York Times reporter who wrote the now famous nail salon exposé. “Is it prompting questions like ‘should I boycott?’ ‘Tip more?’”
From anecdotal and comment-section evidence, it is clear that the story had an impact on how people think and behave when going to a nail salon in New York City. The story also had clear outcomes, within a matter of days, as it prompted actions from both Gov. Cuomo and Mayor de Blasio.
But Maslin Nir said that thinking about the impact before reporting a story is a “slippery slope.” In a classic “a journalist writes what needs to be written” fashion, she underscored the difference between nonprofit, mission-driven newsrooms and businesses like the New York Times. “I don’t understand getting so deep in the mind of the reader. Why? What do you find? Is it going to change your journalism? That can be dangerous.”
At the same time, she acknowledged that the “impact” signals — like putting an issue on politicians’ agendas — are exactly what the Pulitzer committee takes into account, particularly in the Public Service category.
It’s safe to say that no journalist would mind a Pulitzer on his or her desk. So why the resistance to thinking about impact-driven stories?
“Within the field itself, if reporters crow too much about the success of their stories in regard to ‘impact’ and ‘change,’ they are criticized and professionally labeled as a shill for a cause, an ‘advocate,’” wrote Charles Lewis and Hilary Niles in an important 2013 paper about impact measurement.
This is probably a topic for another time, but advocacy is still a dirty word in journalism, and maybe it shouldn’t be, as Jeff Jarvis, Josh Stearns, Shawn Burns, Jan Schaffer and so many others have argued recently.
Getting back to the start, developing a good tool to measure “journalistic success” will largely depend on how our definitions of engagement, impact or even the purpose of journalism evolve. It is only natural, then, that no single tool will work for every newsroom. The latest efforts show that more options are emerging, and that they can be tailored to specific needs.
But before creating impact-spreadsheets or incorporating dashboards and plug-ins to follow a story, one thing should be clear.
Maybe we should start here.
“Setting organizational goals is something ubiquitous in the philanthropy world. It has made its way into the [nonprofit] media,” said NewsLynx’s Michael Keller.
For philanthropy, he notes, one can set clear, numeric goals, like lowering the number of polio cases in a country. “You literally just need to count, and control trials. But when it comes to measuring media impact, the outcome of an investigation, this is a lot harder,” said Keller.
Hard, but not impossible. The first step is to know exactly what to “count.” There is a “crucial intersection between successful impact measurement and stated organizational goals,” Keller writes in his paper. In other words, the better we know what to look for, the greater the chance of finding it.
It is worth keeping in mind, as ProPublica’s Richard Tofel pointed out in a seminal paper on the subject, that measuring something as amorphous as “impact” will always be an imperfect science. “While we should insist that impact itself be tracked with rigor and described with the greatest precision possible, we must not fool ourselves into thinking that it is subject to mathematical proof or even, in some cases, statistical reliability,” he wrote.
It’s a thorny issue, but one that the for-profit news sector is likely to start engaging with soon too, says Lindsay Green-Barber, Media Impact Analyst at The CIR. “It’s tough, because people don’t want to pay for news,” she says. In the nonprofit news world, “It’s the same challenge, the sustainability model….I don’t think anyone has cracked that nut yet.”
Green-Barber thinks that in the last couple of years there’s been a renewed interest in developing qualitative metrics in many newsrooms. There is now “a shift of awareness, of understanding impact based in organizational goals. There is a sense of opportunity.”
Recently, a lot of newsrooms have been adding “growth editor” positions. But “growth,” in this case, usually means “growing audience numbers.” There is much more that can be measured, and a lot more to be achieved with good journalism. We just need to decide what our goals are.
- A version of this story appeared in an assignment for Indrani Sen’s reporting class at CUNY J-School.