About the evaluation

OTT
TPA landscape scan and evaluation
Jun 28, 2021

The evaluation was undertaken by Southern Hemisphere with OTT Consulting to inform the Hewlett Foundation’s new five-year TPA grantmaking strategy. The lead evaluator was Nana Davies, Senior Consultant at Southern Hemisphere, a South Africa-based consultancy specialising in participatory evaluations of development interventions with a focus on building learning organisations. Nana worked with a team of evaluators from Southern Hemisphere: Cathy Chames, Mark Abrahams, Danielle Lemmons, Brilliant Bhebe, Langton Moyo and Tracey Phillips.

Over the 2015–2020 strategic period, the Hewlett Foundation made over 370 grants to 107 grantees, at the international, regional and national level, in 10 countries and 14 thematic areas.

Balancing the ambition to explore and learn as much as we can with what’s feasible within a reasonable time frame and budget is always the biggest challenge for an evaluation. Prioritising questions and focus areas inevitably involves difficult decisions and saying no to relevant and interesting lines of enquiry. But it’s an important and necessary part of the process.

Here we offer some insights into how these decisions were made, and the resulting scope of the evaluation.

Evaluation questions

We arrived at a final set of eight priority evaluation question areas and eight ‘nice to have’ question areas through a series of one-to-one and group meetings with the foundation’s transparency, participation and accountability (TPA) team, to get a better understanding of where there were gaps in knowledge, and where there was particular interest to learn more or deepen understanding.

We also invited comments from grantees via the TPA team’s Medium Blog, to find out what were interesting questions to others, whether we were duplicating knowledge that exists elsewhere, or whether we were missing something significant. We received a number of thoughtful comments that helped us to refine, nuance and deepen some of the questions and sub-questions.

Our final eight priority question areas were:

  1. Outcomes and contributions
  2. Knowledge production and use
  3. Enabling and constraining factors and risks
  4. Validity of assumptions
  5. Gender equality and social inclusion
  6. Support to grantees
  7. Strategic decision-making
  8. Spatial focus of grantmaking

You can see the full final list of guiding evaluation questions here.

Focus countries and themes

As much as we would have loved to speak to all 107 grantees, it wasn’t feasible, so we had to narrow our sample. To do this, we divided grants into three groups (international, regional and national) and identified a number of focus countries and thematic areas.

At the national level, we reached out to grantees in three countries: Kenya, Senegal and Uganda. Why these three?

  • Firstly, these are three countries in which the foundation has disbursed a significant number of grants, or has been engaged for a long time, and so there is sufficient data for the evaluation to draw upon.
  • Secondly, none of these countries have been the subject of a stand-alone evaluation during the strategic period, and so we would not be duplicating efforts but rather generating new knowledge to contribute to our understanding.
  • Thirdly, we tried to pay attention to context, choosing countries that represent a cross-section of governance and political contexts, as well as both francophone and anglophone African countries.

For grantees working at the international and regional levels, we reached out to those working in nine focus areas. These were selected in consultation with the foundation because they represent areas where the TPA team wants to better understand the contribution its investment is making:

  • budget and expenditure transparency;
  • field learning;
  • legal empowerment;
  • media;
  • service delivery monitoring;
  • multi-theme (grants working across more than one thematic area);
  • natural resource governance/extractives;
  • public procurement/open contracting; and
  • tax.

Our approach

Our starting point for exploring outcomes was the strategy’s theory of change, which we used to trace the intended changes forward through output, outcome and impact levels.

We called this an ‘inside-out approach’ — starting with the foundation’s sphere of control (foundation staff) and the sphere of influence (grantees) to ask what changes they intended to bring about, and the extent to which they think a particular project or programme contributed to any change.

We supplemented this with an ‘outside-in’ approach, starting with the foundation’s sphere of concern (grantee stakeholders, such as communities, government officials or other organisations) and its sphere of influence (grantees) to ask what has changed — if anything — and then working backwards to determine what factors contributed to these changes, including what role TPA grants (and ‘beyond the dollar support’) played in creating the conditions for outcomes to emerge (or not).

Taking this dual approach enabled us to:

  • interview the Hewlett Foundation and grantees/stakeholders in parallel, to incorporate a diversity of perspectives;
  • capture not only intended outcomes but also those that were unintended and unwanted, as well as areas in which the Hewlett Foundation’s TPA team and strategy have intervened but where outcomes have not emerged or are yet to emerge; and
  • make a robust assessment of what contribution the foundation’s TPA strategy has made to creating the conditions for outcomes to emerge (or not) in the TPA field over the past five years.

Evaluation methodology

All evaluation questions were explored using a combination of a document review and 92 semi-structured interviews or group interviews with stakeholders from across the Hewlett Foundation’s main stakeholder groups between November and December 2020.

We interviewed:

  • Sphere of control: staff from the immediate Hewlett TPA team, the Gender Equity and Governance program and other Hewlett staff (13 in total);
  • Sphere of influence: grantees (39) and peer and co-funders (10); and
  • Sphere of concern: grantee stakeholders including policymakers, journalists, civil society practitioners, academics and business people identified by grantees and Hewlett Foundation program officers (30).

The document review was used to draw out and analyse existing content provided by the Hewlett Foundation and its grantees. The evaluation team coded documents against the evaluation questions using Nvivo, a qualitative data analysis tool, and used this review both as a data source for the overall evaluation and to inform and prepare the evaluation team ahead of grantee interviews.

We sought to validate all evaluation findings, cross-referencing against documentation and checking statements or claims with external stakeholders — including parliamentarians and other decision-makers. These triangulated and validated findings form the basis of our analysis and the discussion in this report. Where we were less certain about a statement or claim but felt it valuable to include, we have made this clear.

A draft report formed the basis of the sensemaking phase of the evaluation. During this phase the evaluation team engaged the TPA team and other Hewlett Foundation staff, as well as grantees, to help shape the final evaluation report, including filling critical information gaps.

Limitations

The evaluation team did not try to be comprehensive in assessing all possible grantee outcomes. Instead, we left it to grantees to identify their core outcomes and focused on verifying these with external stakeholders.

Although grantees were asked how outcomes differed for different groups (based on gender, class, age and ethnicity), the responses could only be analysed in terms of how grantees contributed to advancing gender equity and social inclusion in the governance space.

Disclaimer

Although some of the work described in this retrospective summary may reflect the passage of legislation, the Hewlett Foundation does not lobby or earmark its funds for prohibited lobbying activities, as defined in the federal tax laws. The foundation’s funding for policy work is limited to permissible forms of support only, such as general operating support grants that grantees can allocate at their discretion and project support grants for non-lobbying activities (e.g., public education and nonpartisan research).


OTT is a global consultancy and platform for change supporting better informed decision making.