Five factors that make for successful research in a pandemic
Simon Pickard, Portfolio Manager for our Research for Health in Humanitarian Crises (R2HC) programme, shares key lessons about how COVID-19 rapid research was successfully conducted and what this means for researchers and funders.
Over these past few weeks, we’ve had the chance to highlight insights from our COVID-19 studies.
The research was conducted in very challenging circumstances. The changing nature of the pandemic meant teams had to react to sudden restrictions that limited access to study populations. They had to adapt to remote ways of working, often operating in areas with limited connectivity. And the programmes that the research was investigating often changed at short notice, creating unexpected confounding factors.
Despite these emerging challenges, a clear majority of the studies succeeded in addressing all or most of their research objectives, and many were able to add new elements to their investigations. The Norwegian Refugee Council-led team, for example, conducted extra surveys to explore vaccine hesitancy among older refugee populations in Lebanon. Some studies, such as that led by Makerere University, have been able to influence COVID-19 response policies, whilst others have been publicised in widely shared reports such as this article about the pandemic in Gaza.
Five Key Lessons
As our COVID-19 cohort’s work comes to an end, we’ve reflected on some key lessons about how this rapid research was successfully conducted and what this means for researchers and funders.
1. Give it extra time, err on the side of realism rather than optimism
An inescapable observation from our COVID-19 cohort was that all studies took longer than originally expected. Our early hopes were that research would be concluded within six months, with findings made available before the end of 2020. In reality, studies usually took an additional three to six months and, in a few cases, substantially longer. Fortunately, findings remained relevant because the emergency response continued through 2021 and into 2022. Many of the reasons for delays were predictable, whilst others, such as newly introduced lockdowns, were more direct consequences of the pandemic. Unsurprisingly, modest study designs lent themselves to rapid evidence production: surveys to understand knowledge and behaviours tended to produce timely insights in the relatively early stages of the pandemic. By contrast, studies that sought to create a tool and test its effectiveness often ran into many more practical difficulties. During the review of proposals, our Funding Committee occasionally judged the proposed timeline for research to be overly ambitious, allowing us to extend the length of studies and the associated budget from the outset. In future we could act more pre-emptively in this way, erring on the side of realism rather than optimism.
2. Respond to emerging need, adapt, and iterate plans
If data collection and analysis takes longer, study teams need to consider how they can build greater flexibility into their research plans. We saw examples where research objectives were adapted, or added to, as the course of the pandemic altered what information was most needed by the humanitarian responders. As funders, we allowed flexibility when study teams asked to adapt their objectives or methods.
For future funding calls we could build in an expectation that research questions might develop iteratively over the course of the grant, and place more emphasis on identifying well-positioned study teams. This is especially relevant for rapid research calls, which give applicants just a few weeks to develop their proposals.
3. Making rapid evidence available along the way
The delays we witnessed in data collection highlight the importance of study teams sharing new data as they go, rather than holding it all back until the final analysis is complete and articles are published.
This was fundamental for this group of studies: multiple teams shared data in meetings, briefing notes and presentations at regular intervals. Doing so required teams to have identified, and be well connected with, their intended audiences from the outset. For subsequent funding calls we have emphasised the need for teams to be 'positioned for impact', meaning already well-networked with the intended users of research findings, and made this a key evaluation criterion.
4. Operational partners are often over-stretched
As well as these relationships with evidence users, operational partners need the capacity themselves to engage with research. Many study teams found that partners were much less available than originally anticipated, either to contribute to studies (as key informant interviewees, for example) or to engage with the findings. We were able to help with the latter through our research impact resources and guidance: suggesting types of products that could be more easily consumed, or facilitating joint engagement events that took less of the operational partners' time.
5. Adapting approaches to data collection
The pandemic brought specific challenges for research teams. Periods when study populations were inaccessible required flexibility in data collection. Surveys needed to be moved online or conducted by phone, often creating practical challenges around contact lists or identifying appropriate mobile providers, and shortened to retain participants' attention. Teams that could draw on multiple data sources reduced their reliance on interviews with busy operational partners. Data collection designed to take place across multiple phases allowed responses from an earlier phase to shape later questions, such as asking about unforeseen issues like COVID-19 scepticism.
The ability of research teams to respond to changed circumstances, adapt study designs and regularly engage their audiences was a key success factor during the pandemic. As research funders, we can play an important role in anticipating this need for flexibility when designing calls and managing grants.
We’ve already reflected on many of these lessons when designing our new research calls. We’re placing more emphasis on study teams who have a record of working together and who can build on an existing research agenda. Study teams should already know the evidence needs of operational stakeholders, and be ‘positioned for impact’ with their target audiences through existing relationships. Understanding the contexts in which data collection will take place, including the needs of communities affected by humanitarian crises, is critical too.
Explore our latest tools, guidance and research on COVID-19.