Creating a Gamified Learning Measurement Tool — the why, the what & the how

Humanitarian Education Accelerator
HEA Learning Series
8 min read · Jul 22, 2021

A collaboration between the HEA, War Child Holland, NYU Global TIES for Children and Porticus

COVID-19-related school closures have caused significant disruption to classroom-based learning across the globe — leaving some one billion learners out of school and compounding existing challenges, particularly within the Education in Emergencies (EiE) sector.

While many governments have turned to connected education and tech-driven programmes to support continued learning during school closures, there is little available evidence on how distance learning affects learning outcomes, particularly holistic outcomes, for children affected by crisis and conflict.

Understanding the impact of the growing number of digital interventions on learning outcomes for these children (both in and outside of the classroom) is crucial to ensuring that no child is left behind in the wake of COVID-19 disruptions. To understand how to better support children’s learning, we first need to understand what children know and are able to do. Armed with this information, teachers and educational programmes can then provide more tailored support. The assessment of learning outcomes is therefore key.

In response to a clear gap in this area, a new partnership, brokered by Porticus, brought together War Child Holland, NYU Global TIES for Children, and the HEA to interrogate the need for an effective tool to assess learning outcomes across digital learning interventions. Through a process of co-creation that draws upon the valuable skills and experience of each organisation — in gamified learning, learning measurement and innovation for refugee education — the team are working together to develop, test and pilot a gamified learning measurement tool that has the potential to measure both academic and social and emotional learning (SEL) skills.

About the tool

The aim of the project is to produce a prototype tool that can help teachers to assess children’s learning, either at home or through devices in the classroom, and provide necessary support to learners, based on the data generated.

The premise of the tool is that children can access it from anywhere, with gaming features embedded in the assessment to make the experience more motivating and rewarding for the child completing it. A teacher-facing dashboard is also integrated into the design, allowing educators to see children’s performance levels in real time and adjust their instruction accordingly.
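As a purely illustrative sketch (the types, field names and roll-up below are assumptions, not the tool’s actual schema), the data behind such a dashboard could be modelled as per-item results aggregated into a per-learner summary:

```typescript
// Hypothetical data model for a real-time teacher dashboard view.
// Names, fields and subjects are illustrative assumptions, not the project's actual schema.

type Subject = "maths" | "reading" | "SEL";

interface ItemResult {
  learnerId: string; // pseudonymous learner identifier, not a name
  subject: Subject;
  itemId: string;
  correct: boolean;
  answeredAt: Date;
}

interface LearnerSummary {
  learnerId: string;
  itemsAttempted: number;
  itemsCorrect: number;
  bySubject: Record<Subject, { attempted: number; correct: number }>;
}

// Roll raw per-item results up into the per-learner view a dashboard might display.
function summarise(results: ItemResult[]): Map<string, LearnerSummary> {
  const summaries = new Map<string, LearnerSummary>();
  for (const r of results) {
    let s = summaries.get(r.learnerId);
    if (!s) {
      s = {
        learnerId: r.learnerId,
        itemsAttempted: 0,
        itemsCorrect: 0,
        bySubject: {
          maths: { attempted: 0, correct: 0 },
          reading: { attempted: 0, correct: 0 },
          SEL: { attempted: 0, correct: 0 },
        },
      };
      summaries.set(r.learnerId, s);
    }
    s.itemsAttempted += 1;
    s.bySubject[r.subject].attempted += 1;
    if (r.correct) {
      s.itemsCorrect += 1;
      s.bySubject[r.subject].correct += 1;
    }
  }
  return summaries;
}
```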

Through an iterative process of designing, testing and piloting the tool, we aim to share a final version with the wider EiE community. Ultimately we hope this work will contribute to the improvement of holistic outcomes for children and adolescents and increase the accountability of the EiE sector.

Where are we at and what have we learned so far?

Whilst this project is still in the early phases of development, there is already a great deal that we have learned.

Inception phase: During the inception phase, the team grappled with identifying the parameters and purpose of the tool. To support this process, we brought together a range of EiE stakeholders to share their reflections on what an effective learning measurement tool for digital-first learning programmes might look like. As a result of this workshop we were able to collaboratively define the purpose of the assessment in response to needs in the field, as well as identify the key elements the tool must feature to meet that purpose. This included agreeing that the tool should:

  • Be user friendly — for both the children completing the assessment and the teachers accessing the data
  • Be motivational — to incentivise children to complete it
  • Be a formative assessment that prepares for remedial teaching
  • Offer children an opportunity to learn whilst using the tool
  • Provide an option for working offline
  • Include capabilities for educators to create their own assessments as per their requirements

The workshop also supported the process of defining the key competencies to be assessed within literacy, numeracy and SEL. Following the workshop, the assessment parameters were defined by the project team, ready for the next phase.

Design phase: Taking a phased approach has allowed us to ensure that the tool is built in response to needs outlined by the sector, is informed by expert inputs and is realistic in its scope. During the design phase our focus has been to:

  1. Develop the tool’s main platform navigation, admin platform and the teachers’ dashboard, and build out the gamified test;
  2. Design individual assessments for each subject (maths, reading and SEL);
  3. Design the reward system/structure (a possible shape is sketched after this list);
  4. Create a list of recommendations for the next phase.
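For point 3, as a purely illustrative sketch (the badge names, thresholds and point values are invented for the example and are not decisions the project has taken), a points-and-badges reward structure might be configured along these lines:

```typescript
// Illustrative sketch only: one possible shape for a points-and-badges reward
// structure. Badge names, thresholds and point values are invented for the example.

interface Badge {
  id: string;
  label: string;
  pointsRequired: number;
}

const badges: Badge[] = [
  { id: "starter", label: "Getting started", pointsRequired: 10 },
  { id: "explorer", label: "Explorer", pointsRequired: 50 },
  { id: "champion", label: "Champion", pointsRequired: 150 },
];

// Points are earned for completing items rather than for answering correctly,
// so the incentive is to finish the assessment.
function pointsFor(itemCompleted: boolean, streak: number): number {
  if (!itemCompleted) return 0;
  return 5 + Math.min(streak, 5); // small bonus for answering several items in a row
}

function earnedBadges(totalPoints: number): Badge[] {
  return badges.filter((b) => totalPoints >= b.pointsRequired);
}
```

Rewarding completion rather than correctness in this sketch reflects the workshop’s conclusion that the tool should incentivise children to complete the assessment.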

A key lesson learned during this phase was that a design specialist must be on board from the very beginning. Waiting until the project is defined and then adding the design fails to acknowledge that design and content are intertwined when it comes to creating a gamified test.

Testing phase: The testing phase is one of the longest phases of the project. It will allow data to be collected on the user experience in order to understand what does and doesn’t work. We are also working closely with War Child Jordan and specialists from the Jordanian Ministry of Education to organise the testing and check the cultural validity of the content within the tool.

In order to ensure a structured approach that allows the team to adapt and improve the tool in response to feedback from users, the testing is broken up into the following sub-phases:

  • Alpha, Visual & SEL testing (June & July 2021)

Alpha testing: with the technical team, to find and fix bugs and define a clear structure.

Visual testing: with children in Jordan, to ascertain their preferred themes, collect inputs on the creation of a narrative and test their visual knowledge of icons, etc.

SEL testing: with children, teachers and caregivers in Jordan, to check the wording and visuals of SEL items.

  • Beta testing (August & September 2021)

This will be the first test of the prototype with end users in Jordan.

  • Pilot testing (October & November 2021)

This will be the most extensive phase of testing with end users in Jordan.

Lessons Learned

Treading new ground is not without its challenges. So far, each phase of the project has been a learning process, with a core focus of the team’s work being identifying and co-creating solutions.

Below we share seven of the key lessons learned so far that might be of interest to others operating in this space:

1. Implementing valid assessment tools is challenging

Assessment is an important part of the learning process as it can provide useful information to the student and the teacher. However, most existing assessments require children to be present in school to take a “paper and pencil” test, which does not work in the context of distance learning and COVID-19 school closures. Most assessments are also not child-friendly and may induce test-taking anxiety in children, which we know can hinder their performance. Finally, most available assessment tools do not provide real-time data on student performance that can be used for formative educational purposes.

During the inception phase we recognised that using an existing assessment, whereby questions and answers could be directly transferred into a gamified digital format, would not meet our needs. We had to define all questions and answers from scratch. Since it is a formative assessment, it was also clear that both correct and incorrect answers must be meaningful in order to provide teachers with the required information about the learner to adapt their teaching. As a result, the available answers must be chosen with great care.

This is why our collaboration with Ministry of Education experts in Jordan is so crucial: they can verify the content and provide inputs.
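To illustrate what meaningful answer options could look like in practice, here is an invented example in which every option, correct or not, carries a note about what choosing it suggests. The item, options and diagnoses are purely illustrative; the real content is defined by the project team and verified with the Ministry of Education experts mentioned above.

```typescript
// Hypothetical formative assessment item: every option, including the wrong ones,
// is tagged with what choosing it suggests about the learner.

interface AnswerOption {
  text: string;
  correct: boolean;
  diagnosis: string; // what selecting this option tells the teacher
}

interface AssessmentItem {
  prompt: string;
  options: AnswerOption[];
}

const additionItem: AssessmentItem = {
  prompt: "What is 27 + 15?",
  options: [
    { text: "42", correct: true, diagnosis: "Adds two-digit numbers with carrying" },
    { text: "32", correct: false, diagnosis: "Drops the carried ten" },
    { text: "312", correct: false, diagnosis: "Writes the carry as an extra digit" },
    { text: "12", correct: false, diagnosis: "Adds only the units column" },
  ],
};
```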

2. Innovative, collaborative projects take time

The nature of this project — with multiple partners, an ambitious brief and an innovative design — demands an iterative approach, which involves exploring different pathways and options and sometimes stepping back to revisit things in response to feedback. This process is invaluable but it takes time. This must be reflected in timelines, resourcing and expectations of those involved.

3. Multiple users = multiple needs

Developing the tool in a way that simultaneously serves the diverse needs of students and teachers adds a further layer of complexity to the design process. We recognised early on that making the user experience efficient yet responsive to the needs of those with limited digital literacy, whilst also striking a balance between game design elements and assessment elements, was a key (albeit challenging) aspect of our work.

For teachers, there are also challenges in terms of how they will interpret the data they get back from the assessment. We recognise that it is essential to include training for teachers and facilitators on how to use the data they receive from the tool, as well as how to act on the findings through remedial instruction that is responsive to learners’ needs.

4. Creation & contextualisation of assessments within the tool by teachers and facilitators

When we looked at the option of allowing teachers and facilitators to create and contextualise new assessments using the tool, it was clear that this needed to be as user friendly as possible, since most would not have any coding skills and may have low levels of digital literacy, as outlined above. Our solution has therefore been to create a user-friendly Assessment Builder module that allows teachers and facilitators to drag and drop assessment components, making the process simple and easy.
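As a hypothetical sketch (the component types and fields are assumptions for illustration, not the module’s actual design), the kind of building blocks such a drag-and-drop builder might assemble could look like this:

```typescript
// Illustrative building blocks a drag-and-drop assessment builder might assemble.
// Component types and fields are invented for this sketch.

type BuilderComponent =
  | { kind: "multipleChoice"; prompt: string; options: string[]; correctIndex: number }
  | { kind: "imageChoice"; prompt: string; imageUrls: string[]; correctIndex: number }
  | { kind: "selScale"; statement: string; scale: 3 | 5 }; // e.g. a smiley-face scale

interface DraftAssessment {
  title: string;
  language: string;
  components: BuilderComponent[];
}

// Dropping a component onto the canvas simply appends it to the draft;
// no coding is required of the teacher.
function addComponent(draft: DraftAssessment, component: BuilderComponent): DraftAssessment {
  return { ...draft, components: [...draft.components, component] };
}

// Example: a facilitator starts an empty draft and drags in one numeracy question.
let draft: DraftAssessment = { title: "Grade 3 numeracy check", language: "ar", components: [] };
draft = addComponent(draft, {
  kind: "multipleChoice",
  prompt: "Which number is the largest?",
  options: ["19", "91", "9"],
  correctIndex: 1,
});
```

Representing each question type as a self-contained component is what allows a teacher to assemble an assessment by dragging blocks into place, with no coding required.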

5. Awareness of the value of holistic learning and SEL

Because our project steps beyond traditional learning outcomes to address holistic learning, it raises a number of questions that we have had to spend time thinking through and that will remain open throughout our design and testing phases. These include whether the teachers who will use the tool are aware of the value of a holistic approach. If so, will they use the SEL survey component of the tool, and how can its use be incentivised? And if they do use that component, how will they interpret the SEL data and what actions will they take in response to it?

Training for teachers, as outlined in point 3, is essential to ensure they can get the most out of the tool.

6. Involvement of parents/caregivers

For any remote assessment, where the child may be at home rather than in the more easily controlled classroom environment, we must consider how much help parents and caregivers might give their child in completing the assessment. Limiting this factor is challenging. The team believe that a key part of addressing this risk is making the tool itself as motivational as possible for the child, so that they develop a sense of ownership and are more likely to complete it themselves.

7. Data protection

Data protection is an important component of child protection and was at the forefront of our minds in developing the tool. Whilst challenging, it was crucial to find a solution that respects GDPR requirements. We have therefore created a system whereby each student is identified by a numeric code instead of their name and each teacher can only see their own students.
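As a rough illustration of those two safeguards (the record shape and identifiers below are assumptions, not the tool’s actual implementation), a pseudonymous learner record and a teacher-scoped query might look like this:

```typescript
// Sketch of the two safeguards described above: pseudonymous learner codes and
// teacher-scoped access. Record shapes and identifiers are illustrative assumptions.

interface LearnerRecord {
  learnerCode: number; // numeric code; no name or other direct identifiers stored here
  teacherId: string;   // the teacher responsible for this learner
  resultsSummary: { attempted: number; correct: number };
}

// A teacher's dashboard query only ever returns their own students.
function learnersVisibleTo(teacherId: string, all: LearnerRecord[]): LearnerRecord[] {
  return all.filter((l) => l.teacherId === teacherId);
}

const records: LearnerRecord[] = [
  { learnerCode: 10234, teacherId: "teacher-A", resultsSummary: { attempted: 12, correct: 9 } },
  { learnerCode: 10235, teacherId: "teacher-B", resultsSummary: { attempted: 8, correct: 8 } },
];

// "teacher-A" sees only learner 10234.
console.log(learnersVisibleTo("teacher-A", records));
```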

What next?

As this exciting project evolves, we anticipate many more learnings, on the process of co-creation as well as the technical requirements for developing an innovative tool of this kind.

Watch this space for further updates and learnings from the team in the coming months!

