Is a Learning Log Right for You?

Published in Innovation Network · May 30, 2024

By Rebecca Perlmutter and Cory Georgopoulos, Innovation Network Senior Associates

As evaluators, we’re always seeking to incorporate methods that help us collect data in comprehensive and equitable ways. In recent years, we have found learning logs to be critical tools for enhancing our evaluation work. Based on Emergent Learning principles, learning logs capture insights and events in real time by recording an initiative’s challenges, experiments, and successes (including the factors behind those successes). Learning logs allow teams to identify themes across a particular initiative or scope of work and to better understand that project’s full trajectory. Teams can also revisit their learning log as they reflect on their work, grounding their conversations in the concrete stories and data they’ve gathered.

Sample learning log template
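
For readers who like to see structure made explicit, here is one way a learning log entry could be represented in code. This is a hypothetical sketch in Python, not Innovation Network’s actual template (which lives in a shared document, not a program); the field names and categories are assumptions drawn from the elements described above.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical categories, mirroring the elements described above.
CATEGORIES = {"success", "challenge", "experiment"}

@dataclass
class LogEntry:
    """One learning log entry: a concrete insight or event, captured in real time."""
    entry_date: date                 # when the insight or event was recorded
    category: str                    # "success", "challenge", or "experiment"
    description: str                 # the concrete story or observation
    contributor: str                 # who recorded it (e.g., a facilitator)
    themes: list[str] = field(default_factory=list)  # codes added during analysis

    def __post_init__(self):
        if self.category not in CATEGORIES:
            raise ValueError(f"unknown category: {self.category!r}")

# Example: an entry a facilitator might add before a monthly session.
entry = LogEntry(
    entry_date=date(2023, 10, 2),
    category="experiment",
    description="Opened the cohort call with small-group breakouts.",
    contributor="Facilitator A",
)
```

Whether the log lives in Mural, a spreadsheet, or code, the point is the same: each entry pairs a dated, concrete story with a category and, eventually, the themes it speaks to.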

In October 2023, we presented our experience using learning logs for long-term feedback and short-cycle reflection at the American Evaluation Association’s annual conference. Our presentation focused on our recent evaluation work with two of our clients, whom we’ll refer to here as Project A and Project B. In both projects, we employed learning logs as an experiment and as an opportunity to collect rich qualitative data. We’ve distilled some of our key takeaways into this blog to shed more light on learning logs as a useful evaluation tool, and to share important considerations for when and how to incorporate them into your own work.

Project A

Our Project A client is a community of practice of advocates who share policy goals, build capacity, and develop a mutual commitment to advancing racial equity within their field. Innovation Network partnered with Project A to help improve its cohort facilitation, identify how it could build shared power among its organizations, and capture the learning interests of its participants. To do so, we conducted five monthly after-action reflection meetings with Project A’s facilitation team and used a learning log to collect data from these meetings. Prior to each session, Project A’s three facilitators populated their own learning log (using the digital collaboration tool Mural) with the successes, challenges, and experiments they had experienced since the previous month. Our team referenced their Mural to calibrate our own facilitation and follow-up questions. We discussed the same set of questions with the facilitation team every month, took notes, and then pulled significant insights and events from those notes into our learning log.

We were also able to use the learning log for rapid reflection with Project A’s facilitators. After each session, we summarized and distilled our learning log entries and shared them with the facilitation team, who would revisit this summary before our next session.

Key takeaways

Our role as external evaluators added an extra layer to this project. Since we maintained the learning log and only met with Project A’s facilitators monthly, there was often a lag in the data we collected. Sometimes a lot happened between our calls, and sometimes very little did, which posed its own challenge. The process was also time-intensive for Project A’s small group of facilitators, which led us to conclude that this approach works best with smaller groups. The facilitation team didn’t always have time to pre-populate their learning log or review previous entries, and it sometimes took our own team a while to fill out the log. Additionally, because the data we collected was filtered through Project A’s facilitators for logistical ease, it remained one step removed from the participants in the community of practice.

Despite these challenges, Project A’s facilitators enjoyed having these conversations because they provided a dedicated opportunity to step back and reflect on their work. We weren’t just asking them to recount their recent actions; we also wanted to know their thought processes and the conditions that influenced their decisions. Over time, we got better at interpreting what they meant, and we captured different perspectives depending on how close each session fell to one of their events. Having the team’s thoughts ahead of time via their Mural board also made our sessions easier to facilitate and gave the facilitators time to process their reflections before we met. By conducting these regular sessions together, we built a strong relationship with Project A’s facilitators and ultimately collected comprehensive data.

Project B

Our Project B client is a national nonprofit that works with policy advocates at the state level. Project B invited Innovation Network to conduct an evaluation of their efforts to align their internal processes with their new theory of change (TOC), an effort to shift their role in the advocacy space from focusing on policy wins to creating environments that allow for larger, transformational change. We conducted our evaluation in two phases: the first focused on Project B’s experience operationalizing their new TOC, and the second (which is still ongoing) is evaluating the outcomes of this transition.

We met regularly with the Project B team in “reflection moment” conversations, which were designed to capture their insights and assumptions about aligning to the TOC. As in our work with Project A, we had Project B staff populate a Mural with responses to our reflection questions, which we used to inform each session. Within the sessions, Project B staff reflected on how they were adapting, what they hoped to achieve, and what was and wasn’t working as they experimented with changes to become more equitable and aligned to their TOC. We captured their reflections in a learning log after each session and shared summarized versions of these learnings back with their staff.

Key takeaways

The amount of time required to populate the learning log was a challenge in our work with Project B, just as it was with Project A. We gathered a lot of insights during our reflections, so we had to decide what was “important” to include. We also spent quite a bit of time synthesizing our findings from the learning log to create summaries for the Project B team, although they did not always have time to review these summarized versions prior to the next session.
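
To illustrate the synthesis step, here is a minimal sketch of how summarizing might look if the log were kept as structured data. It reuses the hypothetical LogEntry sketch from earlier; in our actual work, deciding what was “important” was a judgment call made by the team, not an automated step.

```python
from collections import defaultdict

def draft_summary(entries):
    """Group log entries by theme and render a plain-text digest to share back."""
    by_theme = defaultdict(list)
    for e in entries:
        # Entries with no theme yet still surface in the digest.
        for theme in (e.themes or ["(uncoded)"]):
            by_theme[theme].append(e)

    lines = []
    for theme, items in sorted(by_theme.items()):
        lines.append(f"{theme}: {len(items)} entries")
        for e in items:
            lines.append(f"  - [{e.category}] {e.description}")
    return "\n".join(lines)
```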

Ultimately, the Project B team said they learned a lot through our reflection conversations, and they were able to apply those learnings to their immediate work aligning to the TOC. We also used our analysis of the Phase 1 learning log to map the learning areas the Project B team wants to focus on during Phase 2 of our evaluation, and to document hypotheses about the outcomes they expect to see as a result of their TOC-related changes.

Is a learning log right for you?

We hope the above examples from our work have given you better insight into the practical application of learning logs in evaluations. If you’re considering employing this tool yourself, we have some recommendations for how to effectively incorporate a learning log into your evaluation work:

Establish a clear vision

Given the sheer amount of data you’ll collect over the course of your evaluation, it’s important to know what you want to get out of your learning log before you start making entries. Consider asking yourself: What is the question I’m trying to answer? Determining the purpose of your learning log will help keep you on track during your sessions and inform your facilitation.

Narrow your focus

Our work with Project A demonstrated that a learning log works well for gathering a lot of detailed information on one topic. If you’re covering numerous topics, as we did in our evaluation for Project B, we recommend keeping your analysis focused on a few specific areas of work. These areas will inform your reflection questions and help you facilitate conversations that speak to the question you’re trying to answer.

Allow yourself enough time

As evaluators, we recognize time is a precious resource. However, in order to elicit useful data from a learning log, it’s important to allot enough time — for both yourself and your participants. Avoid employing a learning log if you’re pressed for time within your own schedule, or if it’s difficult to establish regular meetings with your participants. In addition to contributing to your learning log, you’ll want ample time to synthesize emergent themes and share your findings with your participants along the way. Your participants will also benefit from having enough time to reflect on your questions in advance, contribute to their own learning log, and digest the learnings you share with them.

Be consistent

One of our key takeaways across both projects was understanding the importance of consistent facilitators. Assigning the same team members to maintain the learning logs and facilitate reflections meant that we were better able to keep track of salient insights, identify big-picture themes, and use our past knowledge to shape the direction of future sessions. While you may not be able to keep your reflection participants consistent, maintaining the consistency of your own evaluation team will ensure you understand the full breadth of your learning log findings.

In our work with Project B, our participants often changed from session to session. As a result, we’re curious how keeping our questions consistent across all of our reflection sessions, as we did with Project A, would have affected the kind of data we received. It’s worth considering whether you should strive for consistency in your facilitation questions if you know your participants will change over the course of your evaluation.

Be adaptable

While the data you collect in your learning log can yield significant insights, the process may also be overwhelming for your participants. Be prepared to adjust your approach to accommodate their needs and maintain their buy-in throughout the process. Over the course of your evaluation, you may also realize that the data emerging from your learning log differs from the assumptions you made at the beginning of the project. For example, we pre-coded the data in our Project B learning log based on our ideas about which topics would rise to the top of our conversations. Along the way, we realized these codes weren’t necessarily accurate or relevant, and we had to rework our coding system to ensure a more thorough analysis. While it’s important to have a system in mind for how you’ll distill and summarize your learning log data, be mindful that your specific approach may need to change based on what you learn.
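
To make that re-coding step concrete, here is a hypothetical sketch of what revising a coding scheme might look like on structured entries like the LogEntry sketch above. The code names are invented examples; our actual pre-codes and revisions aren’t reproduced here.

```python
# Invented example codes: map retired codes to revised ones.
CODE_MAP = {
    "quick_wins": "transformational_change",
    "internal_process": "toc_alignment",
}

def recode(entries, code_map):
    """Apply a revised coding scheme; flag remapped entries for a human re-read."""
    needs_review = []
    for e in entries:
        if any(t in code_map for t in e.themes):
            needs_review.append(e)  # verify the new code still fits the story
        e.themes = [code_map.get(t, t) for t in e.themes]
    return needs_review
```

The flag list matters more than the mapping: a changed code is a prompt to re-read the underlying story, not just a find-and-replace.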

Incorporate checks and balances

Using a learning log as an evaluation tool often means that, as the evaluator, you will be the one interpreting all the data you’ve collected. Throughout your evaluation, consider incorporating opportunities to gather your participants’ feedback on your findings. Do they agree with your interpretation? Why or why not? Building regular checks and balances into your interpretations will not only ensure your participants feel heard, it will also help make your findings as thorough and accurate as possible.

Final thoughts

Ultimately, using learning logs within our work at Innovation Network has enabled us to garner extensive data for our evaluations. By facilitating and capturing reflective discussions with our clients, we created touchpoints that allowed them to reflect on their work and stay involved in our evaluations. In this way, we were also able to establish stronger relationships with our clients, and as a result, to identify more useful themes and learnings they could then apply to their work. Learning logs may require a commitment to maintain, but the payoff is often worth it.

Have you used a learning log in your evaluations before? We’d love to learn about your experience in the comments below as we continue to incorporate this tool into our own evaluation practices. Interested in experimenting with a learning log for yourself? Download our sample template to get started.
