Cracking the Code: A Science Media-Research Collaboration

Reporting and dissemination: Reaching audiences

This article is one of a multipart series exploring the unique media practitioner-academic research collaboration of Cracking the Code: Influencing Millennial Science Engagement (CTC), a three-year Advancing Informal STEM Learning (AISL) research project funded by the National Science Foundation (NSF) that brings together KQED, a public media company serving the San Francisco Bay Area, and Texas Tech and Yale universities. KQED has the largest science reporting unit in the West, focusing on science news and features, including its YouTube series Deep Look.

The author, Scott Burg, is a Senior Research Principal with Rockman et al.

Cracking the Code (CTC) has probably generated more research reports than most NSF-AISL studies. Over three years, the teams from KQED, Texas Tech and Yale universities, and Rockman et al (the project evaluators) have published close to 30 studies and blog posts addressing topics such as science identity, women’s science viewing preferences, video audience and gender disparity, the use of science news headlines to maximize viewer engagement, two national surveys on millennial science engagement, and a process evaluation of this unique academic-practitioner collaboration.

This same collaboration also published and disseminated numerous reports and blog posts on findings from an NSF-funded RAPID grant exploring the best methods and practices for informing millennials and younger audiences about the science of virus transmission and protection, and for disaster and crisis reporting. Conducting and reporting on this wide swath of research was no easy task, especially during a pandemic. It was a monumental achievement considering the substantial differences in expectations between science media practitioners and science communication researchers about how research is conducted and reported.

Differences in reporting

Throughout both the CTC and RAPID projects, the methods and expectations of report writing and dissemination were a substantive issue for both project teams. KQED is accustomed to reporting that is deadline-driven and action-oriented. The research team comes from an environment where reporting is predicated on peer review and careful, often very time-consuming data analysis.

One of the key objectives of CTC was to inform project participants, stakeholders, and the general public of progress and research findings through regular, ongoing reports. KQED, familiar with this deadline-driven mode of dissemination, developed a project website from which reports would be published. KQED also planned to disseminate project reports over social media platforms such as Twitter and Facebook. Still, publishing research reports on this scale was something very different for KQED.

These project reports are really about two things. One is it demonstrates that the NSF is looking for intellectual merit, and how this work is actually going to impact what other people are doing in the field. Publishing these reports is about our broader impact. What we’re doing is completely different and new for this particular type of a project, where we’re actually letting the field understand our work as we’re doing it. That’s what I find really exciting and dynamic because that means that someone else could be taking the results and actually applying it to their work and studying it along with us. — KQED science staff

KQED’s interest in reporting to public and practitioner audiences (in more of a real-time mode) required a delicate balance and negotiation over how results were defined from both an applied and a research perspective, and how those results would be interpreted. This was very different from publishing in academia, which is generally geared toward professional academic audiences.

At times, KQED staff felt that reports generated by the research team were too dense and complex, making it difficult to distill a study’s key takeaways. Periodically, there were disagreements over what a study’s key takeaways actually were. Some of the writing was “overly academic,” in a style and format that KQED staff were not experienced with.

In this context, it’s important to note that science journalists are often in the position of translating complex scientific topics and research for the public. They are especially keen to simplify complex ideas and make them more accessible. One suggestion for facilitating the report review process was to include a top-line summary of findings with each completed report.

There were times when it was easy to get confused reading the reports. There wasn’t a roadmap for the report shared with us in advance that said, here are the highlights of what we’re going to talk about in the report. I think that before a report even begins to be started, there should be an agreement as to what the key points are. — KQED science staff

From KQED’s perspective, it was their role to vet and decipher research reports for the public.

I really feel that our (KQED) role in the reports once you’ve written it up is to make sure that it’s something that our field would understand. — KQED staff

Achieving this objective means publishing data that may not be conclusive, but that is indicative of an exciting trend or methodology. This approach was at odds with the more finite, conclusive, and validated approach of academic research publications.

In reporting to other media practitioners and the public, we can say that while we may not have come to a final conclusion about something, we made some assumptions and maybe they were wrong. That way practitioners can learn from that particular experience. Whether they actually then take the data and try to replicate it and do something is out of our control, but they’re observing a process which to me is just as valuable as whatever the outcome is. — KQED staff

Members of the research team were not as comfortable with this approach to reporting. They could not understand why the project would want to publish “partial” findings.

From the research side it’s like, “Here’s what we wanted to do, here’s what we did. Here’s what we found. Bye.” I think the idea of sharing is good but what we should do is share what we do in the normal course rather than produce something that we would showcase to others as, “Oh here’s something that you can learn and use in your practice.” I think that we’re not quite there yet. — Research team

The research team heard these concerns and tried a number of different methods to satisfy KQED’s need for more regular public-facing reports without overly impairing the more formal methods necessary to carry out their work. Members of the research team acknowledged that the type of research they were conducting looks at trends and, as such, takes a long time to think about and interpret. One researcher pointed out the difficulty of rushing to conclusions given the type of research being conducted, while at the same time being slowed by the process of preparing reports to meet KQED’s production specifications.

One of the important things in our work is not releasing information until we know what it means. The type of research that we’re doing takes a long time to think about and interpret. We’re trying to understand a phenomenon that is not necessarily something that’s occurring at one moment in time, but something that’s underlying a longer period, like a gender gap and engagement with digital video about science. But then once we know what it meant, it sometimes still took months to get something through the KQED report process. I felt like it was like a 12 step process of getting anything that we reported out to the team into a publishable report format, where we had copy editors and designers. There were like 25 different hands on it before it was even shareable. — Research team

Solutions and a model for NSF

While not fully resolving these differences in reporting methods and expectations, the project teams developed methods for satisfying KQED’s interest in the interpretation of complex results and the researchers’ interest in allowing the research process to play out before releasing findings prematurely. All formal CTC research reports are now accompanied by a blog article, written jointly by KQED and the researchers and shared on social media, that distills research findings for public and practitioner audiences. Draft reports are vetted and reviewed by all members of the CTC project team to ensure accuracy and the inclusion of perspectives from both researchers and practitioners.

This past summer and fall, CTC team members also jointly presented a series of webinars on different facets of the research and evaluation to project partners, science communication research and practitioner stakeholders, and KQED science and news staff. Webinars for the public are planned for the future.

This process is rare among NSF projects, where too often research findings are not accessible (physically or cognitively) to the audiences most affected by the research itself. Most federally funded research reporting is geared toward professional audiences through peer-reviewed publications or conference presentations. CTC demonstrates how diverse teams can work together to change how research is conducted, interpreted, and disseminated. It is a model that other NSF-funded and federally funded projects should pay attention to, especially given the increasing focus on more equitable inclusion of diverse audiences in the interpretation and dissemination of results.

Project participants also noted the unique advantages that this collaboration provides over more “traditional” industry reporting or research on audience engagement.

From someone who spends a lot of time in engagement it’s really exciting to not be across the table from a tech company. It’s really empowering and hopeful for me as a journalist to be able to say, “Oh someone can tell me about engagement and it’s not going to be someone from Google or Facebook.” At a convention I can collaborate with researchers and they can give me useful information too. That’s really powerful. — KQED staff
