Innovations in Granting: Open Peer Review and Participatory Judging

Changing How “Winners” Are Picked

This is the first of four posts in a series focusing on innovations during the grantmaking process. In addition to this series, others in this publication focus on innovations before and after the selection process. Head back to the table of contents for an at-a-glance look at the whole publication including our introduction on the importance of open and effective grantmaking innovations for improving the legitimacy and effectiveness of grant-based public investments.

Summary: In contrast to the traditional closed-door review process, many organizations have begun exploring ways to make the judging and awarding of grants more collaborative. These more open judging processes can invite public input at the outset, to narrow a broad field, or later on, to select final winners from a shortlist. That input might take the form of public comments or voting, judging by panels of outside peer reviewers, or a combination of the two.

During the selection phase, organizations are concerned with ensuring fairness, decreasing costs, and increasing the efficiency of grant administration. They want to recognize and select the most promising proposals. By bringing more people (and data) from more diverse backgrounds into the selection stage, open grantmaking techniques have the potential to make the grant award process both more informed and more legitimate.

The philanthropic sector has led the way in opening up the process of judging grant applications to participants from outside the awarding entity. Beginning in 2007, the Case Foundation involved the public in every aspect of decision-making in connection with its Make it Your Own Awards, from determining grant guidelines and judging criteria to voting on winners. Although judges worked behind closed doors to winnow the 4,600 applications down to a top 20, the public was then invited to vote for the final winners. More than 15,000 people participated.

Participatory innovations like this offer grantmakers the opportunity not only to make better decisions, by broadening the sources of knowledge and expertise brought to bear, but also to build relationships with the communities they serve by involving them directly in the process. It is worth noting, however, that safeguards should be put in place to prevent finalists from lobbying for votes, a concern many observers have raised about such systems.

The Wikimedia Foundation, for example, with a grantmaking budget of over $2 million, integrates community input throughout the lifecycle of proposals and awards. As they explain, “In the same way that Wikipedia articles are born and grown on a public platform through the collaboration of a global community, so too are our grant proposals workshopped and reviewed on public wikis, as well as improved by volunteer editors.” Wikimedia’s model offers a powerful example of an “open peer review” alternative to traditional closed models of judging. A report by philanthropic consultancy The Lafayette Practice (commissioned by Wikimedia itself to evaluate its grantmaking practices and compare them to those of its peers) documented the growth of “Participatory Grantmaking Funds” (PGFs) more broadly, including the Disability Rights Fund, the HIV Young Leaders Fund, and FRIDA — The Young Feminist Fund. The report found that “PGFs serve as a powerful intermediary between grassroots organizing and traditional and institutional donors, functioning as a learning hub for institutional donors and participants.”

In the United Kingdom, the cooperatively run Edge Fund has found success using participatory grantmaking specifically to bring marginalized communities directly into the grantmaking process. After receiving awards, grantees have the opportunity to become part of the co-op, helping to reach out to potential applicants and eventually participating in future funding decisions. The Fund also invites other community members (beyond the grantees) to apply for membership in the co-op. As co-founder Sophie Pritchard writes, a unique advantage of the Edge Fund’s collaborative approach is that “members scoring applications [that affect] their own community… [give] guidance to the rest of the members” who weigh in later.

These alternative models of participatory grant assessment fall on a spectrum between the traditional closed judging approach and the wide-open, wiki-based process. The White House Social Innovation Fund, another such example, outsources the awarding of grants for social innovators to a handful of organizations with successful track records in social innovation. By giving grants to the grantmakers, the Social Innovation Fund diversifies access to innovative proposals and applicants.

In an alternative version of this approach, The Other Foundation, a South Africa-based LGBT rights organization, used small teams of distributed peer reviewers — under the guidance of foundation board members — to vet applications and decide on awards in its inaugural year of grantmaking. The public nominated reviewers from across six countries to assess 114 pending funding applications. The organization then chose 12 peer reviewers, including academics, activists, health practitioners, and representatives from other nonprofits. As part of conducting their evaluations, these peer reviewers had the chance to meet each other in person, agree on funding priorities, and develop a relevant theory of change. As the examples of the Case Foundation and The Other Foundation demonstrate, it is possible to combine closed and open elements and to carefully curate the sources and channels of outside input.

Why Do It:

  • Smarter judging: Collaborative judging processes can bring to bear a wider range of knowledge and expertise, e.g. regarding what sorts of funded projects have or haven’t worked in the past.
  • Community: Open judging can also help funders build relationships with the communities they serve by involving them directly in the process.
  • Legitimacy: The transparency provided by a more open judging process can help build public confidence in the grantmaking body and assuage concerns about corruption, cronyism, or bias.
  • Skill-building: By giving the public the opportunity to weigh in on grant opportunities, funders help participants gain new knowledge not only about the issue addressed by the grant, but also about philanthropic decision-making processes.

Why Not Do It:

  • Confidentiality: Where the confidentiality of applicants or their submission materials is a concern, more traditional judging processes may be more appropriate. However, grantmaking bodies can also pursue a “middle ground” with safeguards tailored to these concerns, e.g. letting applicants know in advance the limited circle of peer reviewers who, exclusively, would have access to their materials.
  • Timing: If a very fast turnaround is a priority, a wider circle of judges (be they busy peer reviewers or members of the crowd) may slow down the process excessively.
  • Popularity: Participatory judging results could be skewed in cases where a popular organization with a high level of name recognition is competing against smaller entities.

Click here for the next post in our series on innovations in judging and awarding grants, or head back to the table of contents for an at-a-glance look at the whole publication.