Monitoring, Evaluation, and Learning (MEL) Matters: Building our MEL Technology Suite at Mercy Corps

Amanda Borquaye
Mercy Corps Technology for Development
10 min read · Dec 26, 2022

Written by Farah As’ad Haddad, Program Specialist (Jordan)

When we learn new skills, we can act boldly in addressing familiar problems from new perspectives. At Mercy Corps, an interdisciplinary team drawn from the IT, Technology for Development (T4D), and Monitoring, Evaluation, and Learning (MEL) departments came together to analyze and understand the scale of the challenges facing program teams as they sought to use a digital ecosystem to advance their work. In 2021, these teams documented the learning acquired by Mercy Corps HQ and field offices in using MEL technology. This work concluded with a White Paper on the state of MEL Tech in Mercy Corps, a Program Technology Use Result, and a MEL Technology Suite that includes a complete range of options for MEL teams throughout the data lifecycle. Each tool is supported by Mercy Corps and compliant with the organization's policy requirements for data protection and security. With the generous support of the Cisco Technology for Impact partnership, Mercy Corps has developed and implemented a MEL Technology Training to build the capacity of selected learners.

Already, we have seen country offices celebrating the successes of these trainings, underscoring the importance of empowering team members to fully engage in the burgeoning digital ecosystem at the organization:

“This training boosted the overall performance of the MEL team in Haiti. Not only did I increase my skills in all the technologies, but it allowed me to hold discussions and brainstorming sessions with my team members and quickly identify solutions with several data collection tools. We also have become bolder with our data, piloting new questions, new indicators, new ideas.” Haiti MEL Manager

“The organized MEL Tech Training and its developed tool package is a complete set guaranteeing a coherent management of the data lifecycle by technological tools. This two-cohort training has revolutionized my ability to analyze and support the MEL teams of programs within my country by providing a full range of technological knowledge at each phase of the data cycle and a broad understanding of data protection and privacy principles.” DRC T4D coordinator

“By participating in this course, I gained knowledge and skills on various technologies that can be used to improve the program. I can’t wait to start sharing the new MEL Tech knowledge to MEL Team members in Indonesia, as well as other staff in MCI who are interested. I am thankful to all great trainers who have been patiently helping me and other participants to learn.” Indonesia MEL Manager

This article details the approach the MEL team took to developing the training, how it was delivered, and lessons learned.

MEL Tech Training structure

During 2022, Mercy Corps began to offer a systematic, competency-based MEL Technologies Training to a select group of staff who routinely use the MEL Tech Suite for collecting, analyzing, visualizing, reporting and promoting use of data in their work. In implementing the first ever global MEL Tech Training series, the team carefully considered who and how many students to include, and how to structure the trainings for maximum effect across time zones, while staying within the team’s capacity to deliver. In addition, careful attention was paid to how to evaluate the first training in order to make improvements in subsequent years.

The Students

The first MEL Tech Training cohort of 2022 started with 35 selected learners from 20 different Mercy Corps country offices who work at the program, country, and regional levels. These are the individuals who design, implement, and advise on MEL-related work using technology platforms in their respective country offices and programs, and are positioned to have significant impact on usage in their offices. Accepted students were required to have job responsibilities which included use of one or more of the covered technologies.

The 2022 Cohort included learners from these 20 countries.

Topics of The Training

The MEL Tech Training was developed to provide learning, build capacity on effective use of the MEL Tech Suite, challenge learners to work together on practical problems, and fund opportunities for interesting projects where new skills could be utilized. In addition to the courses on the MEL technology platforms, the training offered a course focused on data protection and privacy using Mercy Corps' Responsible Data fundamentals guides and toolkits.

The MEL Tech Suite

The Training covered Ona, CommCare, Power BI, TolaData (an internal Mercy Corps software), Stata, MaxQDA, and QGIS. It did not cover Google Drive or Excel, and provided only a very brief introduction to Atlas.ti and R, due to time constraints.

Training Course Sequence

The MEL Tech Training was provided through three phases:

1) Self-paced basic skills testing: The learners worked through a globally available set of self-paced e-learning courses that teach, for each of the 8 technologies in the MEL Tech Suite, the specific skills required to effectively perform regular MEL duties in programs.

2) Facilitated, scenario-based group exercises: Following the self-paced learning of the existing materials, the cohort received a set of practical tasks that allowed them to explore the capabilities of each technology in the suite, through facilitated sessions using datasets and examples from, or relevant to, Mercy Corps programs.

3) "MEL Tech Jumpstart" projects pitched by learners and funded with small awards.

The first phase of the training consisted of Third-Party Courses (TPC) and tests that were provided free of charge and could be completed in the learner's own time. These tests ensured that each student had the necessary familiarity with basic data collection (i.e., form design) and data management (i.e., cleaning, organizing, and validating data) to undertake intermediate and more advanced tasks. The second phase contained two 2-week-long, cohort-based sessions that followed a hybrid modality of facilitator-led instruction combined with individual and/or group work on practical exercises. Learners were able to gauge their ability to use each tool in Mercy Corps-relevant scenarios.

Finally, learners were given the opportunity to pitch small projects that would improve their program or office’s MEL Tech setup or integration. Selected projects were provided with small grants to fund software purchases, technical support, and cascade trainings to teach skills to other country office staff. These grants enabled MEL Tech Track graduates to immediately put their new skills and learning to use in country office MEL Tech systems, and transfer skills to other colleagues in country offices through direct peer mentoring. These grants also allow portfolios to adopt MEL Tech recommendations systemically, across all programs, as opposed to waiting for new programs and budgets to come online over time.

Measurement

To ensure the success of the training and learn how to improve it, Mercy Corps developed a theory of change (TOC) and a set of indicators that were closely monitored and evaluated before, during and after the completion of the training.

Our Theory of Change

If Mercy Corps develops curricula and e-learning material that is open to its staff, and if targeted training is provided to core members who can apply the new knowledge in STAs, then pilot programs will showcase improved quality and utilization of data, leading to higher MEL Tech uptake and an increased MEL tech skills base for the organization.

The TOC was based on a number of assumptions that were tested throughout the program implementation. Some assumptions were:

  1. The learning modules and exercises are in line with cohort expectations.
  2. The learning modules are delivered in line with expected quality.
  3. The jumpstart is provided to programs committed to MEL Tech improvement.
  4. The jumpstart deployment is long enough to trigger programmatic-level change.

A mixed-methods approach was used to evaluate the training, with activities including:

  • Baseline survey;
  • Pause-and-reflect sessions (during implementation);
  • Post-graduation survey;
  • Endline survey (3 months post-training) — results pending;
  • Post-Jumpstart evaluation (qualitative only) — results pending.

Key results

Baseline and post-graduation data were collected from participants before and after the training sessions to learn more about the training's impact on learners. The data showed that all participants agreed that they had learned new skills in the MEL Tech Training. In addition, around 90% reported that the overall quality of the training and resources was good and that the course resources were useful and relevant. Moreover, 77% said that the structure of the modules was logical and easy to follow.
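As an aside, agreement percentages like those above are typically computed by tallying Likert-scale survey responses. The sketch below is purely illustrative: the response data and the `percent_agreeing` helper are hypothetical, not the actual survey instrument or results.

```python
# Hypothetical sketch of summarizing Likert-scale survey responses into an
# agreement percentage. The sample data is illustrative only.
from collections import Counter

def percent_agreeing(responses, agree_levels=("Agree", "Strongly agree")):
    """Share of respondents who selected an agreement option, as a percentage."""
    counts = Counter(responses)
    agreeing = sum(counts[level] for level in agree_levels)
    return round(100 * agreeing / len(responses), 1)

# Illustrative answers from a 35-person cohort to one post-training question
sample = ["Agree"] * 20 + ["Strongly agree"] * 8 + ["Neutral"] * 5 + ["Disagree"] * 2
print(percent_agreeing(sample))  # 80.0
```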

Lessons Learned

What we Learned in Planning

  1. Understand your learners as best as you can.

The team conducted a detailed assessment before the training, to understand learners’ location, time zones, positions, language, level of capacity, access to technology, and preferences for training modalities. While this information was helpful, it will always be hard for potential learners to give answers about how they would prefer to experience a training which they have not yet seen. In future rounds of training, Mercy Corps will be able to integrate its findings about how learners actually experienced the training (grouped by their time zones, native languages, pre-existing capacities, etc.) in order to better anticipate learner needs.

2. Pilot and test both training content and method of delivery before the start of the training.

It can be difficult to follow this best practice given the limitations of time, personnel, and money that exist across the humanitarian and development sector. However, Mercy Corps’ experience demonstrated that this was critical: Multiple content errors were caught in a testing round and fixed, and individual technical facilitators who were not able to conduct a “dry run” had more difficulty introducing and explaining their scenarios and exercises in clear ways.

3. Allow time during training delivery to review, evaluate and revise training materials and methods for delivery.

Making time for review, evaluation, and revision during the training allowed the team to adjust the delivery timing and course sequencing in response to challenges expressed by learners. For example, after the first two weeks of training sessions, during the planned break and evaluation period, learners shared that they found it difficult to cover multiple technologies in a single week, and then revisit them in the week after. They preferred all content related to a single technology to be covered in 2–3 consecutive days, and this schedule was implemented for the subsequent wave of sessions.

Now that the first training has been completed, the team will conduct a more thorough round of revisions to the training materials and the delivery methods, based on the learnings.

4. Get buy-in from HQ senior management and country senior management, as the training will take up country staff working hours.

The team conducted multiple outreach sessions to supervisors of potential learners in order to clarify what the expectations would be for learners, and advocate for the supervisors to make the time for their direct reports to attend the sessions and complete the work. It was important to make the case that the training could lead to specific, measurable improvements in their MEL data collection and analysis, and results from this first training will be helpful to make an even stronger case in subsequent years.

What we Learned in Facilitation and Way of Delivery

1. Give learners time and resources to practice new skills during the session and afterward, to consolidate the knowledge and skills gained.

Videos of the sessions were specifically reported to be very helpful for learners to review afterward.

2. Include more interactive exercises and activities during the session, and try to keep training sessions to 10–15 learners so the facilitator can follow up with all of them.

Virtual training has well-known challenges with participation and distraction. Having learner cohorts small enough that facilitators could assign specific interactive exercises, check in with each individual learner on their progress throughout the day, and troubleshoot problems was important for ensuring that everyone stayed active.

3. Allow break days between each technology training so learners won't feel overwhelmed by the amount of new information.

Learners found it difficult to retain what they had learned in a packed schedule, likely in part because, despite time being blocked out for the training, there were often still demands on learners' time from their programs and offices. Allowing break days helped learners catch up on material they may have missed or not understood the first time, rather than falling increasingly behind.

4. Include more group work during the facilitated sessions so learners can learn from each other’s experiences.

One goal of the training was to encourage group discussion and support to complete the exercises, and in so doing help to establish a global peer group which could be sustained post-training. However, learners in the virtual environment struggled to have group interaction while completing individual exercises. More effort to assign group work, provide smaller virtual group spaces for collaboration, and better utilize technology that enables virtual collaboration will be important to correct this in future trainings.
