As You Partner: Adapt Your Approach to Data


Achieving impact at scale often involves various types of partnerships, whether a more tightly controlled relationship (e.g., contracting with a local organization to implement your program) or a more loosely controlled one (e.g., providing resources to an organization so it can incorporate aspects of your methodology into its own work). Regardless, your approach to data must adapt as you pursue these partnerships, given an inevitably diminished level of control over what is collected, how it is collected, and the access you are allowed. One key scaling partnership that many social enterprises pursue is with government entities; in fact, government partners came up most frequently in our interviews. Therefore, while the advice below applies to almost any partnership, the examples focus primarily on government partnerships.

“The tendency in the measurement space globally is to collect lots of information and process information without knowing how it will be used. So one of Pratham’s goals is to understand what data is important and useful, and share that understanding with partners.” — Devyani Pershad, Pratham’s Head of Program Management

Set a shared intention to use data for learning and improvement.

The risk: Many partners, particularly those providing funding or other resources, default to using data for audit or accountability purposes, which can lead to hesitancy to share data and thus stunt opportunities for data-driven improvement.

Hard-Won Advice:

  • Ensure all partners understand and agree upon which data is for accountability, and which is for learning and improvement.
  • Establish that partners will not be penalized for the data they share that is for learning and improvement.
  • Model a culture of learning, particularly from those whom others perceive as holding the most power (e.g., funders, government).

Example: To continuously monitor and bolster the quality of community health worker programs, Last Mile Health co-designed the Implementation Fidelity Initiative (IFI) with Liberia’s government at both the national and county levels. County health teams, NGOs, and national government representatives review the IFI data together quarterly as part of a learning agenda rather than as an audit mechanism that penalizes partners for ‘bad data’. “Understandably, people are worried about being penalized for ‘bad data,’ since this type of data has traditionally been used as an audit mechanism. And when you’re sharing this data with the national government and donor partners, people of course fear having their funding and/or autonomy taken away,” shares Last Mile Health President and COO Lisha McCormick. To ensure that the IFI data was used to promote program improvements and collective knowledge sharing, Last Mile Health, the Liberia Ministry of Health, and donor partners “encouraged county health teams to showcase their challenges, as it would not be held as a mark against them,” McCormick said. This agenda is reinforced at quarterly meetings, where county health teams are encouraged to share stories alongside numbers and where Ministry officials model a learning approach.

Align on what data to collect and how.

The risk: While a social enterprise will understandably create metrics that help drive its own work, those metrics may prove meaningless to partners and could get in the way of identifying a shared agenda. Additionally, any data collected without a partner’s engagement and buy-in may be viewed skeptically. [See also “Incentivize other Systems Actors to Collect and Use Data” in the systems change data article for tips on engaging partners in systems-level data collection.]

Hard-Won Advice:

  • Align with the metrics that matter to your partner, which may just be a different way of analyzing and communicating data that you already collect.

Example: One Acre Fund (1AF) regularly speaks about its impact in terms of dollars: the incremental income gained by a farmer when engaging with 1AF’s model. But the organization recognized that agricultural ministries wanted to understand how partners’ work contributed to the specific targets and objectives in their national strategies. In Rwanda, for example, the government wanted to know how many fruit trees were planted. In Kenya, the government was more interested in the total size of One Acre Fund’s farmer loan portfolio. One Acre Fund thus took on the responsibility of understanding and providing government partners with the specific data relevant to their goals.

  • Be proactive about ensuring partner buy-in to external data so that they are willing to trust and use that data.

Example: Mo Adefeso-Olateju, Managing Director of The TEP Centre in Nigeria, spoke about beginning with the end in mind: knowing that the organization would want to use the data it was collecting to influence its government partners. TEP decided to engage an agency of the government, the National Bureau of Statistics, in developing the sampling methodology and in implementing the research in the field. When TEP went to share the results with other government entities within the country, there was little dissension because those entities had confidence that a fellow agency had been part of the process.¹

  • Create shared data infrastructure elements, whether you build off of government data systems or ensure your own systems are able to connect with them.

Example: Last Mile Health supported the Liberia Ministry of Health in integrating community health into the country’s existing health management information system. MiracleFeet deliberately built its data technology on common platforms (e.g., CommCare and Salesforce) because they were already proven, were developed for the unique contexts in which MiracleFeet works, and could connect to the health management information systems being developed at the federal level in many countries.

“Demonstrate a commitment to the government’s vision. They need to see you as a development partner, and that what you’re doing matches their priorities.” — Colin Christensen, Global Policy Director, One Acre Fund

Promote streamlining and simplicity.

The risk: Each partner may have different expectations of, and experience with, the amount and types of data collected and used, often with a bias toward over-collection.

Hard-Won Advice:

  • Promote lean data practices that focus on timely, relevant, high-quality, simple measures.

Example: Pratham, which works to spread the gospel of “simple” (i.e., lean, relevant, and actionable) data, has faced challenges with some government teams who believe that “more is better” when it comes to data. When working through government systems to collect data, Pratham works to demonstrate the need for simplicity so that data is collected quickly, is visible at all times, and is immediately actionable. Living Goods, which shares its data with national health information systems, often receives pushback from government partners that its data is “incomplete” because it covers only pregnant women and children under five and omits other critical data on HIV, tuberculosis, diabetes, etc. Living Goods explains to government partners that collecting data for services beyond those it provides would only distract community health workers from delivering their core services, but this remains a continuing source of friction. Our interviewees shared numerous tips and considerations for homing in on only the most important data and simplifying collection processes so that they are highly repeatable.

Side note: Deep-dive studies on partnership impact.
In addition to simple, timely day-to-day performance data, many organizations need to undertake deep-dive studies to investigate the impact of implementation with and through partners. While Pratham is working on classroom-level data collection with expansion partners in new countries in Africa, it is concurrently considering discrete studies to better understand the impact of variations on key program elements, such as mentoring. Pershad explains, “While we work to ensure that simple data systems are built into expanded programs, we also need to ask — and collect data to answer — context-specific questions.” (Resources for “lean,” social-enterprise-friendly deep-dive studies include 60 Decibels and the Evaluation Toolkit from the Evidence Lab at the Duke Global Health Institute.)

Notes:
1. Scaling Up Community of Practice’s Education Working Group Webinar, October 10, 2019.

This article was written by Erin Worsham, Kimberly Langsam, and Ellen Martin, and released in June 2020.


The Center for the Advancement of Social Entrepreneurship (CASE) at Duke University leads the authorship for the Scaling Pathways series.