Three source diversity audits, three lessons learned (at least!)

Eric Garcia McKinley
The Impact Architects
Sep 14, 2021

Over the past year, Impact Architects has worked on three substantial source diversity audits, with more on the way. These projects, coupled with our previous experience as staff inside newsrooms at the Center for Investigative Reporting and Minnesota Public Radio, have taught us a lot about effective methodologies and uses of the audits, as well as how to ensure these audits contribute to longer-term culture and practice change inside organizations.

As we refine our processes and look ahead to continue this important work, we’re taking a moment to consider and share some of the lessons we’ve learned along the way.

At IA, our audits have three main features.

  1. Our audits have been retroactive. We have looked backward in order to create baselines and benchmarks for the newsroom we’re working with.
  2. We create representative samples of content. There’s no need to conduct a comprehensive analysis when a solid, randomized sample will do.
  3. Lastly, we code for multiple variables. This feature most distinguishes our approach from many of the other publicly available source diversity audits. We generally include eight or nine variables for each source we code, and we’ve even developed new variables based on newsrooms’ specific questions, needs, and goals.
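
Point 2 above, drawing a randomized sample rather than coding every story, can be sketched in a few lines of Python. The archive size, sample size, and story IDs below are all hypothetical, and this is only one reasonable way to do it:

```python
import random

def draw_sample(story_ids, sample_size, seed=42):
    """Draw a reproducible random sample of stories to code.

    story_ids: list of unique story identifiers (hypothetical).
    sample_size: number of stories to include in the audit.
    """
    rng = random.Random(seed)  # fixed seed so the sample can be re-drawn later
    return rng.sample(story_ids, sample_size)

# Example: sample 200 stories out of a hypothetical archive of 2,400
archive = [f"story-{n:04d}" for n in range(2400)]
audit_sample = draw_sample(archive, 200)
```

Fixing the random seed matters in practice: it lets a second coder, or the newsroom itself, regenerate exactly the same sample when checking the work.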

Through our first three large-scale audits, we’ve learned (at least) three big lessons.

Lesson 1: Analyze the intersections of variables — not just the top-lines.

A simple breakdown of, for example, gender distribution can tell a newsroom a lot about its sourcing practices. One of the big takeaways from our work with KQED — which included the newsroom and podcasts — was that its gender distribution was split almost exactly 50/50. Out of the 1,635 records we coded, we identified 50.6% as men and 48.9% as women. This was a highlight of the audit and something the station could rightly consider a success, overall.

However, a more nuanced picture emerged once we looked at how gender distribution intersects with the other variables we coded. For instance, the four racial categories with the most records were white, Black, Hispanic/Latino, and Asian. The gender distribution within these groups did not reflect the overall numbers: each population of sources was either disproportionately men (the white and Asian sources) or disproportionately women (the Black and Hispanic/Latino sources). The overall numbers were telling regarding overall gender equity, but the intersection of race and gender revealed imbalances that raise questions about what stories the newsroom might be missing.

Because we code for so many variables, we’ve been able to unearth these types of insights for many other intersections, such as gender and profession, race and geography, and age group and story topic.
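
This kind of cross-tabulation is simple to sketch in Python. The records below are invented for illustration — they are not KQED's data — but they show how a balanced top line can hide skewed intersections:

```python
from collections import Counter

# Hypothetical coded records: one (race, gender) pair per source.
records = [
    ("White", "Man"), ("White", "Man"), ("White", "Woman"),
    ("Black", "Woman"), ("Black", "Woman"), ("Black", "Man"),
    ("Asian", "Man"), ("Hispanic/Latino", "Woman"),
]

# Top-line gender split: looks balanced on its own (4 men, 4 women).
topline = Counter(gender for _, gender in records)

# Cross-tabulation: counts for each (race, gender) combination.
crosstab = Counter(records)

for (race, gender), count in sorted(crosstab.items()):
    print(f"{race:16s} {gender:6s} {count}")
```

Here the top line is an even 4/4 split, but the cross-tab shows the white sources skewing men and the Black sources skewing women — the same pattern, in miniature, as the audit finding above.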

Lesson 2: Spotlights and opportunities, rather than successes and failures.

In our analyses and narrative reports, we avoid the language of “success” and “failure.” Instead, we “spotlight” results where newsrooms’ practices appear to be effectively bringing diverse perspectives into content, and we highlight “opportunities” where there are potential blind spots in sourcing, and therefore room for new stories to be told.

We’ve noticed that we often find opportunities in spotlights, and spotlights in opportunities, demonstrating why it’s so important to analyze audit results in depth to get the most out of them.

For instance, we’re currently completing an audit for KUOW, a public radio affiliate in Seattle, Washington. One of the most obvious spotlights in our analysis was that the percentage of Black sources in KUOW content was more than double the Black share of local population estimates for Seattle. This is a clear spotlight because it shows the station is equitably including the voices and perspectives of Black residents. However, an opportunity emerged when we drilled down to identify the topics Black sources were associated with: 75% of all Black sources appeared in just two topic areas, Arts/Culture/Sports and Racial Justice. While representation of Black sources was high overall, it was limited in scope.
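
A quick Python sketch of this kind of drill-down, using invented topic counts (not KUOW's actual data) chosen to mirror the 75% pattern described above:

```python
from collections import Counter

# Hypothetical topic codes for one demographic group's sources.
topics = (["Arts/Culture/Sports"] * 9 + ["Racial Justice"] * 6
          + ["Politics"] * 3 + ["Health"] * 2)

counts = Counter(topics)
top_two = counts.most_common(2)           # the two most frequent topics
share = sum(n for _, n in top_two) / len(topics)

# 15 of 20 sources fall in the top two topics: 75% concentration.
print(f"{share:.0%} of sources appear in: {[t for t, _ in top_two]}")
```

A concentration figure like this is a simple way to turn "limited in scope" from an impression into a number a newsroom can track over time.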

Lesson 3: Tracking for profession is easier — and more valuable — than it seems.

“Profession” is one of the variables we have tracked in our audits, and it’s among the most analytically useful. It’s also fairly easy to track. For journalists, identifying a source’s profession in a story positions the person in society, which in turn orients the audience and serves as a proxy for characteristics that are much harder to capture: namely, social and economic status.

In general, we use 10 categories when coding for profession (the list grows, shrinks, and adapts based on experience and client need). They are:

  • Academic (fields other than law and medicine)
  • Artist (including authors)
  • Business employee/spokesperson
  • Community/advocacy organization representative
  • Elected government official
  • Journalist
  • Lawyer
  • Medical professional
  • Non-elected government official
  • Student

A good example of how this category has proven useful in an audit comes from another west coast public media station, CapRadio in Sacramento, California. (Side note: We’d love to work with public media stations from other parts of the country!) Of the 503 unique sources we coded, about 15% were community/advocacy organization representatives and another 15% were business employees/spokespeople. These were the two highest percentages. To us, that demonstrated an emphasis on “on-the-ground” experts rather than institutional experts. This sets up the newsroom to ask what strategies and tactics are in place that led to this result, and how those might be used or repurposed to address other opportunities that might have been found in the audit report.

Conclusion

The common feature in our lessons learned is that drilling down below top-line findings to second and even third levels of granularity provides both critical insights and opportunities. These are often the insights that spark the most robust conversations when newsrooms are deciding how to develop plans for action based on the audit results.

As we continue to work with more news organizations with varying audiences, business models, and platforms, our source and content audit processes are evolving. We’re currently improving our data presentation, including interactive databases for newsrooms to dig into the data themselves, and refining our methodology based on what we’ve learned. As we continue this work and expand it beyond public media, we’ll surely have more and newer lessons. We’ll tell you about those, too.
