Qualitative Data Handling and the Need for a Triangulated Approach

Much has been reported about the need for wider reach and better impact in UK research outcomes and dissemination. This stems from a renewed focus on the sort of impact that can influence public engagement and culture (Lord Nicholas Stern's review, 28 July 2016). Two other areas of infrastructure play a crucial role in such engagement and cultural evolution: media communications and government and organisational data handling.

With these three areas in mind, let's take a closer look at why accuracy is necessary for an authentic approach if we are to influence public engagement. To be clear, when I refer to accuracy I mean the way in which information is managed, e.g. sourced, handled, cross-checked and disseminated. Equally important are the order in which those actions are carried out, and how, since both can determine the authenticity of information in a triangulated approach.

Perhaps the easiest way to describe what I mean by triangulation is to imagine two people who observe a car accident and re-tell events to a third party. Depending on whether they tell their story together or separately, either the accounts merge so that one voice leads, or two very different ways of seeing emerge. However, if the driver is able to add his or her voice to the narrative, then we have a triangulated account from which to judge what happened. This three-way communication is more valid than a two-way cross-check, particularly when the primary actor or source is included.
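To make the analogy concrete, here is a minimal Python sketch (all names and claims are hypothetical) of the difference between a two-way cross-check and a three-way, triangulated one that includes the primary source:

```python
# Hypothetical accounts: each source asserts a set of claims.
accounts = {
    "witness_1": {"car ran the red light", "driver was speeding"},
    "witness_2": {"car ran the red light", "driver was speeding"},
    "driver":    {"car ran the red light"},
}

def two_way_check(a, b):
    """Cross-check between two secondary sources only."""
    return accounts[a] & accounts[b]

def triangulated(a, b, primary):
    """Three-way check that also requires the primary actor's voice."""
    return accounts[a] & accounts[b] & accounts[primary]

print(two_way_check("witness_1", "witness_2"))
# {'car ran the red light', 'driver was speeding'} -- agreed, but uncorroborated
print(triangulated("witness_1", "witness_2", "driver"))
# {'car ran the red light'} -- only this claim survives the three-way check
```

The two witnesses agree on two claims, but only the claim the driver also attests survives the triangulated check.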

Often, when top-down research funders financially support university research, the bids are itemised as to how the work will be carried out and costed, depending on the domain. The work is then disseminated (again, depending on the domain) via end points that constitute a research paper and a conference presentation. These two end-point outcomes act as a boundary to the specific workflow. Each piece of documentation, the paper and the presentation, is considered the researcher's representational primary source, outlining workflow, rationale and outcomes.

On a related note, this timely Nature article highlights a need for research to adopt a bottom-up approach, one that stems from an identified purpose for impact 'with' the social milieu; the inference is a call to move away from research-driven agendas that rely solely on theoretical implications, which can often remain rhetorical. Meanwhile, a recent blog post by Professor Mark Reed, Newcastle University, UK, focuses on innovation and concurs with the need for effective communication to achieve wider impact.

By comparison, media communications also depend on sources to inform what is reported and how events unfold. Publishing involves the source, handling that source and releasing information, but what is the role of cross-checking for accuracy? And how does this bear on authenticity? We know reporters interview one-to-one, use film and audio footage, or draw on a given data set to relay a story. We also know that being in situ as a reporter, and including one's own voice, can add weight to the authenticity of a story. Yet there is a lack of triangulation when media communications rely on only two voices.

Let me explain.

Thinking back to the research paper and the presentation, say we have a paper written by a group of university researchers, together with their presentation. It is a qualitative study, a smaller chunk drawn from a big-data sample (a quantitative study of the numbers). The researchers are delighted with the findings: they have examined the interview data and coded the voices and input from participants, and lo and behold the coding tallies nicely with the numbers. The project has numbers, participant voices, and researcher agreement. Three elements for an authentic triangulated approach? Yes, provided members and stakeholders do not disagree with the findings, their accuracy and so on. Such member checks provide for triangulation, alongside the use of triangulated methods across the study.
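As a rough illustration, here is a minimal Python sketch (hypothetical data and thresholds throughout) of that cross-check: qualitative codes are tallied only where members have confirmed them, then compared against the theme proportions from the larger quantitative sample:

```python
from collections import Counter

# Hypothetical coded interview data: each participant's response is tagged
# with the theme a researcher assigned during qualitative coding.
coded_interviews = {
    "p01": "supports_change", "p02": "supports_change", "p03": "neutral",
    "p04": "opposes_change",  "p05": "supports_change", "p06": "neutral",
}

# Hypothetical theme proportions from the larger quantitative (big-data) sample.
survey_proportions = {"supports_change": 0.55, "neutral": 0.30,
                      "opposes_change": 0.15}

# Member checks: participants confirm (or dispute) how they were coded.
member_confirms = {"p01": True, "p02": True, "p03": True,
                   "p04": True, "p05": False, "p06": True}

def triangulate(interviews, survey, confirms, tolerance=0.2):
    """Compare member-confirmed qualitative codes against the survey numbers."""
    confirmed = [theme for pid, theme in interviews.items() if confirms[pid]]
    counts = Counter(confirmed)
    total = sum(counts.values())
    report = {}
    for theme, expected in survey.items():
        observed = counts[theme] / total if total else 0.0
        report[theme] = {"qualitative": round(observed, 2),
                         "quantitative": expected,
                         "agrees": abs(observed - expected) <= tolerance}
    return report

for theme, row in triangulate(coded_interviews, survey_proportions,
                              member_confirms).items():
    print(theme, row)
```

The member-check step means a disputed code is excluded before the qualitative and quantitative strands are compared, so all three elements (numbers, participant voices, researcher agreement) must hold at once.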

Turning to the media communications framework, we can short-cut to the main point by posing the question: when does primary-source data become second-hand news? This is a simple way of asking: once the audio recording or film footage (or whatever the medium) leaves its context, how valid is that primary source? And, more to the point, whose voices represent the journey across the story, and who seals the accuracy of the overall outcome for authenticity? I raise these questions about accuracy with the BBC in mind, as well as issues surrounding newspaper reports.

In sum, when thinking about university research and the mechanisms involved in media reporting, both of which appear to rely on accuracy in aiming for authenticity, triangulation may prove an enhancement for the future.

With that in mind, it is also interesting to reflect on governance, particularly given that the validity of online polls and follow-up interviews for predictive and current analysis presents a highly questionable framework. I am referring, for example, to government party-leadership polls and census data, which can involve systematic sampling errors and sampling methods that lack robustness. A good example of the potential chaos can be found in this write-up by the former President of YouGov, which even refers to respondents being 're-interviewed'!
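To see why self-selection matters, here is a minimal Python sketch (all figures hypothetical) comparing a simple random sample with an online panel in which supporters are more likely to opt in:

```python
import random

random.seed(42)

# Hypothetical population: 1 = supports the leadership candidate, 0 = does not.
# True support is 40%.
population = [1] * 40_000 + [0] * 60_000
random.shuffle(population)

def estimate(sample):
    return sum(sample) / len(sample)

# Simple random sample: every member is equally likely to be chosen.
srs = random.sample(population, 1_000)

# Biased "online panel": supporters are three times as likely to opt in,
# a crude stand-in for the self-selection behind many online polls.
weights = [3 if person else 1 for person in population]
panel = random.choices(population, weights=weights, k=1_000)

print("true support:           0.400")
print(f"random-sample estimate: {estimate(srs):.3f}")    # close to 0.400
print(f"online-panel estimate:  {estimate(panel):.3f}")  # skews well above 0.400
```

Even with an identical sample size, the self-selecting panel overstates support substantially; re-interviewing the same panel does nothing to remove that systematic skew.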

To conclude and to confirm: I have drawn together three areas of infrastructure that have the potential to influence public engagement and culture. In so doing, I have presented the need for a triangulated approach when handling and disseminating qualitative data and outcomes, in order to enable accuracy and promote authenticity.