Cracking the Code: A Science Media-Research Collaboration

Learning about science media, journalism and the NSF

This article is part of a multipart series exploring the unique media practitioner-academic research collaboration behind Cracking the Code: Influencing Millennial Science Engagement (CTC), a three-year Advancing Informal STEM Learning (AISL) research project funded by the National Science Foundation (NSF) and conducted by KQED, a public media company serving the San Francisco Bay Area, in partnership with Texas Tech and Yale universities. KQED has the largest science reporting unit in the West, focusing on science news and features, including the YouTube series Deep Look.

The author, Scott Burg, is a Senior Research Principal with Rockman et al.

Applying academic research to a media environment

Cracking the Code (CTC) was the first opportunity for many on the Texas Tech University research team to work directly with science media practitioners on a formal study, let alone collaborate on an NSF grant. Over the project’s three-year duration, researchers learned firsthand the kinds of issues that science media content producers and news reporters experienced on a day-to-day basis. They gained a better understanding of how the newsroom worked and how science producers handled digital media.

The research team acknowledged how useful it was to hear directly from media professionals what they were interested in learning, and why. Coming from an environment where most of their information about media came from published literature, the researchers appreciated the advantages that direct exposure to the realities of producing science media could provide.

I think one thing that the research community in general could do is learn more from what the people who are on the ground actually want to know. It’s helpful to see where the literature is lacking in things that people who presumably use that literature need it for. — Research team member

The research team came into the project with twin concerns: first, that the media “communicators” would rely too much on the researchers (“just tell us what the answer is”); second, that if challenged by research findings, the KQED team would not actually revise what they already believed. Over time, the researchers realized that, for the most part, these concerns were unfounded.

My surprise has been that the kind of prodding necessary to overcome those two dynamics has been less effortful than I expected. People here have been in a sense very scientific about approaching these issues. It hasn’t been so hard to make people be reflective on competing accounts about the things that are going on. — Research team

The research team acknowledged that their research methods were a bit more formal and less time-sensitive than KQED’s. They commented that having more exacting benchmarks (similar to production or news reporting timelines) was helpful in keeping them on schedule. Understanding KQED’s workflow demands was also instrumental in helping the research team recalibrate their own workflow.

Benchmarks we’re used to in academics are much more general. We are opportunistic and we do things that can be done when we can do them. There’s an academic leisure of greed; the sense of time is basically just indefinite. We know that’s not the case at KQED, and have adjusted accordingly. Now we know where we’re at if we get behind, and we know why, and I think that that’s fine. — Research team

Members of the research team noted that this project was more of a pure collaboration than others they had worked on, unlike projects where the media partner provides funding and simply wants the research team to provide answers.

While the nature of the collaboration was appreciated by the research team, it did create some issues early on in adapting to the needs and demands of a nonacademic setting.

I think the approach is great but one of the places where it really caused a lot of difficulty is the difference in working style. A lot of academics or researchers aren’t involved in a lot of applied or program evaluation. Not only is the time thing a little bit weird but just the idea of deliverables and having a constant schedule of things that need to be turned in, it’s just not the way most academics work. — Research team

Researchers developed a tremendous appreciation of the size and scope of KQED’s science operation, and the amount of time and effort necessary to produce even short segments such as Deep Look. They acknowledged developing a greater sensitivity to the complexity of scheduling, and the importance of more carefully aligning delivery of their own tasks (e.g., developing surveys, conducting data analysis) to meet KQED’s more exacting workflow demands.

They acknowledged the importance of face-to-face (or Zoom) meetings with KQED science staff as a way of establishing a more “connected engagement.” One researcher described these meetings as instrumental in developing a “greater kinship in habits of thought” and in enhancing the quality of the working relationship between the researchers and KQED.

Research team members took particular care to understand differences in terminology between themselves and the KQED science staff, which, if not fully explored, could negatively affect expectations or take the research in a direction that might not meet KQED’s needs. During project meetings, researchers were careful to ask questions and solicit input on every facet of the study, no matter how minor. These in-depth inquiries helped deepen both sides’ interest, learning and engagement.

I think the language (between practitioners and researchers) is different. KQED has provided everything I’ve asked for, for example trying to understand what ‘thumping’ they were referring to. I asked them to recommend videos and articles that would help me get up to speed on their work. I’ve just found them very engaged in the project, which is great to see as a researcher to know that practitioners really do want to understand the work that we’re engaging in. — Researcher

Research team members were surprised by the level of detail the KQED team brought to the choice of specific words or phrases during survey development. As a result, some researchers commented that they became more intentional about the language and types of scales they used when developing instruments for other research studies.

As KQED staff responded to the multitude of external and internal changes impacting their work, the research team demonstrated an increased willingness to modify the study design and methodology to more effectively address KQED’s interests and concerns. During project meetings in late summer 2020, there was a palpably heightened level of engagement and discussion between KQED staff and the research team regarding research design, instrumentation and content focus. The exchange of ideas became more authentically collaborative, with the research team listening to and embracing suggestions from the media practitioner perspective.

An NSF Primer

For some members of the research team, this was their first time working on an NSF-funded project. Unlike KQED, which was able to provide backfill for its staff working on CTC, those in academia had no such luxury. Integrating CTC activities with day-to-day academic responsibilities was a learning experience for some on the research team.

I don’t get to carve out a section of my time to devote to this project on top of the rest of the stuff (teaching classes, publishing, conference presentations) I’m doing and that can be a bit challenging, so I have underestimated how much of my time I needed for this. — Research team

One researcher suggested that NSF might allow funds in the project budget to “buy out” some of her teaching time, or that she could negotiate with her dean to free up more time to work on the grant.

A number of researchers commented that they interacted with other team members more frequently on this NSF grant than on other funded research projects they had worked on.

This is very different from other collaborations I’ve done with other researchers. Usually there’s just two or three of us, and sometimes we go two or three months without speaking with each other. It isn’t the case that we have the kind of consistent schedule interaction that we do with this one. I think it takes a bigger chunk of time than maybe our institutions are used to. — Research team

Despite the learning curve, the research team’s breadth of experience, sophistication and interest in conducting applied research continued to grow through the CTC collaboration. In the project’s second year, research staff assisted KQED in developing an NSF/AISL proposal to conduct further research with Deep Look staff.

It’s exciting that I have experience seeing how a grant works all the way from writing the proposal, through developing and refining research protocols, conducting data collection, analysis and report writing. Being able to work with media professionals too was great and seeing how we as researchers can work with people in the field, whether that is media or another nonprofit agency. — Research team

During the same period, Dr. Asheley Landrum, CTC’s co-PI and research lead of the Texas Tech team, solicited KQED’s involvement in developing a Faculty Early Career Development (CAREER) grant proposal to NSF. Had the proposal been funded, Texas Tech would have been the lead institution.

A strengthened collaboration

Another notable example of the research team’s growing trust in the practitioner perspective was the inclusion of both the CTC co-PI from KQED (Sue Ellen McCann) and the project evaluator (Scott Burg) in interviews of final candidates for the Texas Tech research team’s postdoctoral assistant position. Normally, the interview process for an academic research position is strictly limited to staff or faculty from the hiring institution. With the permission of the others serving on the interview committee, McCann and Burg had the opportunity to question each of the three candidates and to provide input on the final selection.

I really wanted to bring Sue Ellen (co-PI) in and have her input (during the interview process), especially since this person’s going to become a member of our team. I trust her judgment a lot with people. I also thought it would be helpful for us and the interviewees to include individuals who have insight on the dynamic of the KQED-Texas Tech collaboration. — Texas Tech staff

The collaboration provided researchers an opportunity to experience how their work could impact media practice and inform future research.

My biggest takeaway has been understanding that our research work gets to an actual person in the field at some point. Hearing them (KQED) give updates about projects and events that they have coming up helped me visualize what I’m doing and what I’m researching, how that can benefit people and offer ideas for future research as well. — Research team

Everyone we spoke with on the research team was eager to pursue future collaborations with media practitioners. They found the process educational and engaging. For many, the collaboration helped them think about their own work in new and different ways. It also raised as many questions as it answered. Looking ahead, the researchers are still pondering how to apply what they know from decades of academic study to the pressing questions of science media.

How do you walk that line between the theories that have been developed and how to test those in a real environment? I don’t think I’ve figured that out yet. — Research team

These new insights helped the research team contextualize and focus study findings in a way that had more meaning and applicability to KQED. The experience also informed how these kinds of collaborations might be conducted in the future.

An important role of science communication is trying to help people better understand the science of issues that are extremely relevant to everyday life. Journalists and other types of media makers put that information out to the world. In our studies we’re not paying attention to what problems they actually have or what issues they’re actually seeing. Instead, we focus on doing these small tests within the context of theories that may or may not be generalizable or useful. Working with KQED has really helped me better understand some of the problems that they face. — Research team
