Cracking the Code: A Science Media — Research Collaboration

Looking at research through different lenses

This article is one of a multipart series exploring the unique media practitioner–academic research collaboration of Cracking the Code: Influencing Millennial Science Engagement (CTC), a three-year Advancing Informal STEM Learning (AISL) research project funded by the National Science Foundation (NSF) that brings together KQED, a public media company serving the San Francisco Bay Area, with researchers from Texas Tech and Yale universities. KQED has the largest science reporting unit in the West, focusing on science news and features, including its YouTube series Deep Look.

The author, Scott Burg, is a Senior Research Principal with Rockman et al.

In Organizational Learning II, Chris Argyris and Donald Schön (1996) discuss the problematic aspects of practitioner–academic collaboration in a chapter entitled ‘Turning the Researcher–Practitioner Relationship on its Head’. They start by noting that academic research and practitioner inquiry operate from two different points of view. While both the researcher and practitioner are concerned with causal inference, the academic researcher wants to identify generalizable rules that lead to probable predictions. The development of such rules often requires experimental or quasi-experimental design. Complex quantitative and qualitative analytical techniques are often used to isolate key variables that influence outcomes. In an academic context where inquiry is valued in and of itself, research is often open ended, iterative, and ongoing.

The practitioner, on the other hand, is more often than not trying to solve a particular problem in a particular setting. General rules or laws rarely provide a useful guide to action. In science media and news settings, for instance, research (inquiry) is typically time-specific and valued only to the extent that it produces results that can be acted upon or put into practice.

Identifying differences

The Cracking the Code collaboration began with the shared goal of exploring how KQED could adapt and expand upon existing research to understand the role of science identity and curiosity in millennial engagement and interest in science media. Research hypotheses were to be tested through two of KQED’s primary science products, Science News and Deep Look (a natural history video series developed specifically for YouTube that focuses on the small world of biology). Perceptual and practical differences between KQED and the research team, regarding both the definition of research and the methods used to conduct research activities, influenced the project’s early collaboration efforts. In rethinking the research approach, both parties continually had to spend time ‘unpacking’ what the other meant.

While a general research plan had been outlined in the original Cracking the Code proposal, once the formal project commenced it became apparent to both KQED and the research team that much of what they had proposed (length/duration of testing cycles, research focus areas, research methodologies) had to be revisited. This reassessment was driven in part by each group’s somewhat conflicting (and uninformed) assumptions about the other’s research methods, unfamiliarity with working in this type of collaboration, shifting research priorities, and organizational changes (e.g. restructuring) that had occurred during the 6–8 month period after the proposal was originally submitted.

At the outset of the project the research team stressed an incremental study approach. They believed that findings and methods applied in an iterative fashion would allow each study to build on the last. In this way, each study would provide a foundation and serve as a catalyst for determining what to study next.

I think the hopes and expectations coming into the project were just a little bit different. I think there is a difference between market research (that KQED is more familiar with) and theoretically based academic research. The theoretically based academic research tries to understand the reason for problems. That process can take a longer time. Once you understand the reason that a problem exists, then you can start the next step, which is designing interventions. Once you design the interventions, then you can test interventions in specific places. — Research team

KQED believed that each study had to be discrete and concrete. KQED’s organizational experience with design-based thinking methods influenced their sentiment that it was important to move quickly to get a full testing experience ‘under their belt’, and to learn from that experience in order to move the project forward. They anticipated research opportunities that would provide answers that could be immediately applied to their work.

I thought we were going to do some quick tests on social media and then come away with some quick answers like ‘Hey, this is what you should do to reach this type of audience.’ — KQED staff

Researchers felt that KQED’s Science Engagement staff needed to align their methods of studying audiences more closely with those of social scientists. Conversely, KQED’s Engagement staff felt that researchers were overlooking the potential of applying social media analytic methodologies (methods that they and the Deep Look team were very familiar with) to better understand audiences’ interests and preferences.

What is actionable research?

Differences over what constituted ‘actionable research’ surfaced throughout the project. Research team members were more interested in identifying longer-term trends that could be applied over time. They felt that the kinds of analytics KQED employed were more aligned with market research than with a more insightful understanding of audience interests and trends.

One of the things that we wanted to do was be able to have actionable research. One of the things KQED wanted was to employ new methodologies for things that could enhance their ability to do on-demand research, but that really is market research. That’s like A/B testing headlines. A/B testing headlines isn’t going to give us enough information, and it’s going to take a longer time period with more data to really understand what’s happening in order to create that actionable research. The goal in that case is not to have a new headline every time you’re trying something different, but to conduct research that is driving the choices that you make. — Research team

Staff from both Science News and Deep Look were hoping that research would be more aligned with media practices, and that whatever findings came out of the research could be tested with audiences sooner rather than later. For instance, members of the Science News team were eager to test findings on the use of graphics with audiences, as well as some of the methods they had developed internally to engage audiences.

We (news) produced material about how to wash hands, and we could have tested that. We could have looked at what we produced on masks and tested that. We didn’t actually produce that particular graphic, and it seems like it was a really good graphic. If the graphic works, and it’s interactive, you have to actually engage with making it work better, or testing it against an explainer or something like that. Those were the kinds of ideas that we had originally been looking for in this grant. We received a lot of money to produce stuff. I want to test it. — KQED science news team

Finding common ground

By applying both quantitative survey methodologies and digital analytic methods across a series of complementary studies, members of both project teams came over time to appreciate and understand the costs and benefits of their respective approaches to research. Researchers acknowledged that science communication research is not an exact science, and that even findings that are statistically significant in one study may not be replicable in another. KQED’s engagement team also noted that social media analytics can be flawed, especially given uncertainty about data quality and the unstructured nature of data from particular sources.

I think Asheley (research co-PI) helped us recognize the limitations of the world that we (KQED) operated in, which is the stats that we see on social media. I think we kind of showed her our world, how certain findings don’t always match up with the surveys. — KQED staff

In science communication research we’re not paying attention to what problems they (media practitioners) actually have or what issues they’re actually seeing. Instead, we focus on, you know, doing these small tests within the context of theories that may or may not be generalizable or useful. — Research team

The teams also came to recognize that they had more in common than they thought.

I think the collegial part of it is we actually share a lot of common practices. While we didn’t necessarily understand each other’s language, or know the history of the research that’s been done, or past practices or past results, we understand that it is a process on both sides. I think journalism is very similar. It has a very similar process. You start with a question, you investigate what’s known about that particular question, you interview people, you evaluate, you write things up (or produce things using journalistic practices) and you wind up with a product. There are ways in which the research ‘process’ is also a process in journalism. — KQED staff

What both teams learned is that a necessary condition for good academic–practitioner collaboration is the recognition that everyone needs to learn how to learn together. It also means recognizing that, insofar as research is concerned, better information is not enough: who is consulted, and how information is collected, presented, and reviewed, will strongly influence whether learning leads to any change.
