Cracking the Code: A Science Media and Research Collaboration

This article is one of a multipart series exploring the unique media practitioner-academic research collaboration of Cracking the Code: Influencing Millennial Science Engagement (CTC), a three-year Advancing Informal STEM Learning (AISL) research project funded by the National Science Foundation (NSF) and conducted by KQED, a public media company serving the San Francisco Bay Area, in partnership with Texas Tech and Yale universities. KQED has the largest science reporting unit in the West, focusing on science news and features, including its YouTube series Deep Look.

The author, Scott Burg, is a Senior Research Principal with Rockman et al.

For KQED science staff, the experience of Cracking the Code (CTC) reinforced the reality that audience research takes time. They came to appreciate that while audience research may not always yield definitive conclusions, the process itself builds a body of knowledge that deepens understanding of how to reach missing audiences, and it often points to where further audience research is needed.

Even with this appreciation of the science communication research process, KQED science staff still wished they had more time to work with the research team to develop methods for translating findings into actionable steps that could inform the day-to-day realities of science production and news reporting. During the early stages of the project, many felt that the “research process” could not keep up with the demands of KQED’s various internal production and reporting time frames. Even after the first few comprehensive and productive rounds of testing, KQED staff felt that they weren’t getting the research results fast enough.

On the flip side, in order to keep up with the project’s ambitious testing timelines, both the Deep Look and science news teams moved from one research study to the next with little time for discussion in between. There were few meetings between the Deep Look and science news teams, or between KQED and the researchers, to fully explore research takeaways.

In that first year we did a massive amount of work, and in a really short time came up with some really strong conclusions about how to actually reach millennials and younger adults. We should have focused on the second year actually using those results to see if they actually worked or not. In other words, the project design wasn’t as effective as it could have been. — KQED science staff

Some KQED staff wondered whether the project may have been too ambitious in the number and duration of its testing cycles. Others wondered if the project ultimately overdelivered. What both Deep Look and science news staff did agree on was that there should have been more opportunities for group reflection and collaboration between their respective teams.

Staff noted that they would have benefited if findings, key takeaways, outcomes, and process issues had been shared and discussed more frequently. For practitioners in these types of collaborations, it is not enough to reflect only periodically on how research affects practice. Instead, all participants need to be immersed in a continual reflective practice that strengthens the connections between the theoretical and applied elements of the research. Reflection in action links academics and practitioners more concretely.

There were times early on I wished that we (Deep Look and science news) could have connected. We’re connected more with this idea of the sense of awe and wonder, because that is an important part of both of our work. I feel like I made several overtures to science news that their research had relevance for us (Deep Look). Maybe we should tie into that. That’s a big part of what we’re doing, but it just didn’t click. — KQED science staff

Staff did acknowledge, however, that conflicts with non-CTC job responsibilities, a lack of adequate backfill staff, and pandemic-related restrictions limited opportunities for group reflection activities.

Like KQED, members of the research team also had concerns about testing cycle time frames and the sequencing of specific testing experiments. One member of the research team suggested that KQED should not have expected CTC research to yield immediately actionable outcomes. From this perspective, the goal of CTC research was not to understand a phenomenon occurring at a single moment in time, but rather to identify trends over a longer period, such as understanding a gender gap or the level of engagement with digital science video. Once these longer-term trends were more fully understood, more actionable research could then be conducted.

I think the problem (with the expectation of immediately actionable outcomes) is kind of what we saw, where in some cases the health-emphasis titles seemed to work and in other cases they didn’t. Ultimately we still didn’t really understand why. It’s going to take a longer time period with more data to really understand what’s happening in order to create that actionable research. — Research team

Despite these issues, the project generated a tremendous number of research findings over three years, substantially more than most NSF-funded projects. Some staff suggested that modifying the project design could be one way to address concerns about actionable research and inadequate reflective time. Had the project built its body of knowledge during the first half, rather than continuing to run tests over the full duration, more time could have been spent reflecting on that knowledge and carefully assessing how it could be applied to the work at KQED.

Collaboration involves not only conducting the research itself, but taking time to jointly process and contextualize. Rigorously reflecting not only on the application of research, but on experiences gained from actions and process can provide a strong basis for improved planning of future studies.

I think it comes back to maybe having more of that reflective time in general. We started off with an assumption that each of these groups (Deep Look and science news) had a hypothesis that they wanted to tackle. All of those questions were big questions. All those questions were really important to answer, but there wasn’t this ongoing conversation about how we could actually execute on what we learned. We didn’t have adequate time to process. — KQED science staff
