Ripple Effects Mapping

A Collaborative Process for Evaluating Community-Engaged Learning Programs

By Heidi Mouillesseaux-Kunzman

[Image: A ripple effects map for the Seed to Supper program]

Have you ever “just known” that a project is positively impacting participants, but struggled to capture and document the depth, breadth, complexity, and richness of those impacts? If so, you are not alone. The desire for an evaluation method or tool that could capture the outcomes of our work with students and communities in more systematic and meaningful ways led faculty in Cornell’s Department of Global Development and Cornell Garden-based Learning to offer Cornell colleagues an opportunity to learn about an emerging (and exciting!) evaluation methodology.

Ripple Effects Mapping (REM) is an evaluation methodology created by Community Development Extension professionals looking for a rigorous way to better understand the impacts of community development programming and initiatives “within complex, real world settings.”[1] As many of us know, the most important outcomes of these types of projects can be challenging to assess because they often emerge after the project has ended or are indirect, resulting from intersecting short-term and intermediate outcomes.

Consider the challenge of capturing the outcomes of a course focused on community development principles and practice, which is taught through project-based learning — let’s say, for example, one in which students work with NGO staff, elected officials, agency personnel, and citizens to address food insecurity in a neighborhood. Over successive semesters, students may assist with community conversations and baseline data collection to understand the issues; identify, assess, and pilot proposed solutions; help address challenges in the implementation of specific strategies; and evaluate success. One can imagine such a scenario leading to a myriad of outcomes beyond the stated learning objectives and promised deliverables for students, community members, the faculty lead, and even the university or food security practitioners more generally.

While evaluation tools commonly focus on whether a project’s core, stated objectives were achieved, they don’t often address longer-term outcomes and unplanned impacts. Moreover, traditional evaluation tools (e.g., course evaluation surveys) often focus on an individual’s experience. They miss the opportunity to understand the collective or community experience that is better articulated through a process where members consider and reflect on outcomes in a shared and even collaborative space.

We knew we weren’t the only ones looking for better evaluation tools when colleagues from the Bronfenbrenner Center for Translational Research, Cornell Cooperative Extension, and the Office of Engagement Initiatives (OEI) eagerly agreed to help us organize a Cornell REM workshop and OEI awarded us an Engaged Opportunity Grant to support it.

How REM Works: A Participatory Approach

REM is ideally suited to identifying the impacts of community-engaged learning and research (CELR) efforts because it is, at heart, a participatory methodology. In alignment with CELR’s commitment to inclusion, diversity, equity, and transparency in university-community partnerships, REM engages a representative group of project participants in identifying outcomes, analyzing that data, and discerning key takeaways based on that analysis. REM engages participants through a systematic process that combines appreciative inquiry, paired interviews, group reflection, and radiant thinking to literally map the “chain of effects” resulting from a program.[2][3]

As such, the REM process stands in contrast to evaluation methodologies where project leads identify the impacts to be assessed, analyze the results of the assessments, and assert conclusions based on that analysis with little, if any, input or guidance from participants. Because REM is designed to draw out outcomes of value to participants, it highlights expected and unexpected positive results. Importantly, it can also highlight the absence of expected outcomes, signifying that a project may not be fulfilling intended goals, or that those goals are perhaps not as important to participants as expected. By including participants directly in the evaluative process, moreover, REM provides a forum for participants and project leads to collaboratively refine and strengthen programs both during and after the REM process.

To more fully appreciate how REM is particularly well-suited to understanding CELR outcomes, it is helpful to unpack two of its core components — appreciative inquiry and radiant thinking — and how they are utilized in the REM process.

Appreciative inquiry is an approach to system change (e.g., in communities, organizations) that is designed to help uncover the system’s strengths, positive attributes, and potential as a means of fully achieving that potential. Appreciative inquiry is based on the principle that projects that start from a position of strength yield better outcomes than those that start from a point of weakness. As such, appreciative inquiry stands in contrast to evaluation approaches that focus on weaknesses and ways to overcome them.

Radiant thinking “refers to the brain’s associative thought processes that derive from a central point and form links between integrated concepts” and is based on the premise that the best ideas and most creative solutions emerge when the free flow of thought and opportunities to connect ideas are encouraged.[4] REM utilizes radiant thinking to augment individual feedback with insights from others, who are invited to add to or elaborate on the outcomes shared by other participants in the process. In this way, REM has the potential to generate a richer, more nuanced understanding of impact than can be gleaned through evaluation efforts that simply solicit and aggregate individual perspectives.

In the REM process, facilitators use appreciative inquiry to guide project participants through a process of identifying positive outcomes, the activities that led to these outcomes, and what both might mean for the program going forward. Participants are given prompts designed to help them reflect on specific benefits resulting from the project/program being evaluated, and are then asked to share a story of those benefits with a partner through paired interviews. Participants then reconvene as a group, and each is invited to summarize their story, which is mapped to visually represent the positive ways the program activities have resulted in a chain of successive outcomes (the ripples).

As each story is captured on the map, other participants are invited to add to the stories, either through additional details that reflect more ripples resulting from a particular project activity or through related outcomes resulting from another activity. In this way, mapping is used to show links between program components that may not otherwise be readily discerned. Program participants are then involved in a facilitated process of analysis, through which they collaboratively consider the map and identify key takeaways, including program strengths, opportunities to further related goals, and a variety of other potential revelations. REM practitioners have found this process particularly helpful for understanding expected and unexpected program outcomes, as well as for inspiring and energizing program participants, motivating them to continue their work and generate new initiatives.
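For teams that want to keep a digital record alongside the flip-chart or whiteboard map, it can help to notice that the finished product is essentially a tree: the program sits at the center, activities branch out from it, and each outcome participants link to an activity forms another ripple. The short Python sketch below is a minimal illustration of that structure; the class, the tally function, and the example outcomes are hypothetical and are not part of any official REM toolkit.

```python
# Minimal sketch of a ripple effects map as a tree. The labels and example
# outcomes below are hypothetical, loosely inspired by a garden education
# program; they are not drawn from the Seed to Supper evaluation.
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str                                     # an activity or outcome named by participants
    children: list = field(default_factory=list)   # the subsequent "ripples" linked to it

    def add(self, label):
        """Attach a new ripple to this node and return it so chains can keep growing."""
        child = Node(label)
        self.children.append(child)
        return child

def count_ripples(node, depth=0, tally=None):
    """Tally how many outcomes sit at each ripple level (depth from the program node)."""
    tally = {} if tally is None else tally
    tally[depth] = tally.get(depth, 0) + 1
    for child in node.children:
        count_ripples(child, depth + 1, tally)
    return tally

# Build a small, hypothetical map.
program = Node("Community gardening course")
garden = program.add("Participants start a shared neighborhood garden")
garden.add("Surplus produce is donated to a local food pantry")
garden.add("Neighbors organize a weekly cooking night")
program.add("A participant becomes a volunteer course facilitator")

print(count_ripples(program))  # {0: 1, 1: 2, 2: 2}
```

A structure like this makes it easy to revisit the map after the session, count how far particular activities rippled, and spot which chains participants extended the most.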

Bringing REM to Cornell

If one were to map the ripple effect of the 2016 REM workshop itself, the map would be dense. More than 60 people, including Cornell faculty and staff, Extension professionals, and community partners, participated in the two-day workshop, and interest has only grown since then.

During the workshop, Lori Higgins (University of Idaho) and Debra Hansen (Washington State University Extension), part of the team of REM creators, introduced REM through a brief presentation and then guided participants through the process as they used it to map and understand outcomes of Cornell’s Seed to Supper and, later, the CYFAR (Children, Youth, and Families at Risk) programs. Each demonstration was followed by time for questions, during which members of the audience delved into the nuances of this participatory evaluation methodology.

Following the workshop, a small group of staff members from Student & Campus Life, the Cornell Public Service Center, Cornell Garden-based Learning, the Community and Regional Development Institute (CaRDI) (now part of the Department of Global Development), and the Office of Engagement Initiatives formed a REM leadership team. Members of the team are eager to hone their own REM skills; share this methodology on campus, within the Extension System, and in communities; build capacity for using it; and further its development through theory and practice.

In the year and a half since REM debuted at Cornell, members of the REM leadership team, working with a variety of partners, have offered more than 10 workshops for learning about and practicing REM and have led REM sessions with campus and community partners as part of their evaluation efforts. In addition to the programs noted above, the following units and programs are using REM, or plan to do so, to more fully capture and document the outcomes of their work with students, faculty, and community members: Arts & Sciences Academic Advising, EARS, Recreational Services, Inner Life of Teaching and Leadership, and Rust to Green.

Through our practice of REM at Cornell, we’ve come to appreciate the many benefits it has to offer and have learned that realizing these benefits just takes doing it!

Resources and Getting Started

Looking to learn more so you can get started? You can skim the surface with “Ripple Effect Mapping: A ‘Radiant’ Way to Capture Program Impacts” or go for a deep (but easy) dive into “A Field Guide to Ripple Effects Mapping.” If you prefer video to articles or handbooks, consider eXtension’s Enhancing Rural Community Capacity Community of Practice’s webinar on “Using Ripple Effects Mapping to Determine Your Program Outcomes.”

Here at Cornell, the REM leadership team welcomes you to join the REM listserv by emailing Leslie Meyerhoff (mls73@cornell.edu). The list is used to share opportunities to develop, practice, and refine REM skills, both through workshops and by helping others who want to REM their programs. Most importantly, get started! You’ll love it!

— Heidi Mouillesseaux-Kunzman is a senior extension associate and the education minor coordinator in the Department of Global Development of Cornell University’s College of Agriculture and Life Sciences.

She wishes to thank members of the Cornell REM Leadership Team — Marcia Eames-Sheavly, Leslie Meyerhoff, Anna Sims Bartel, and Amanda Wittman — who shared helpful feedback on early drafts of this piece and who, along with others, are appreciated companions in the REM journey.

[1] Hansen, Debra A. “Part I: The Origins of Ripple Effects Mapping.” Eds., Scott Chazdon, Mary Emery, Debra Hansen, Lori Higgins, and Rebecca Sero. A Field Guide to Ripple Effects Mapping. Minneapolis, MN: University of Minnesota Libraries Publishing. University Digital Conservancy, 2018. 1. Web. 23 July 2019.

[2] Bhattacharyya, Rani, Templin, Elizabeth, Messer, Cynthia, and Chazdon, Scott. “Participatory Evaluation and Learning: A Case Example Involving Ripple Effects Mapping of a Tourism Assessment Program.” Journal of Extension. Extension Journal, Inc. 55 (2), April 2017. Web. 24 July 2019.

[3] Bartel, Anna Sims, and Wittman, Amanda. “Finally! A Useful Methodology for Understanding Community Impact.” Conference Presentation. IARSLCE Annual Conference: Albuquerque, NM. 2019.

[4] Chazdon, Scott, and Langan, Samantha. “Part II: The Core Ingredients of Ripple Effects Mapping.” Eds., Scott Chazdon, Mary Emery, Debra Hansen, Lori Higgins, and Rebecca Sero. A Field Guide to Ripple Effects Mapping. Minneapolis, MN: University of Minnesota Libraries Publishing. University Digital Conservancy, 2018. 1. Web. 7 July 2019.

David M. Einhorn Center for Community Engagement
The Ripple Effect
