Stop buying tablets for schools
We need to stop thinking of EdTech as tablets, laptops or computer labs in schools. We need to stop thinking that an eLearning platform in a school makes that school “digitally savvy.”
There are too many moving parts and too many untested solutions out there for us to make sweeping “best practice” statements about EdTech.
Consider the number of iterations and combinations we should be trying across the component layers of any intervention: infrastructure, hardware, software, content, teacher training, and monitoring and evaluation (M&E). (A back-of-the-envelope count of this option space is sketched below.)
A perfect combination of these components that works well in Manchester will not be the same combination that works well in Maralal. Best practices will differ geographically, from school to school, from teacher to teacher and from student to student.
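To make that concrete: the layers below mirror the budget categories from the case study later in this piece, while the number of options per layer is invented purely for illustration.

```python
from math import prod

# Hypothetical option counts per component layer (illustrative only;
# the layers come from the case study's budget list, the counts do not).
options_per_layer = {
    "infrastructure": 3,            # e.g. grid power, solar, generator
    "hardware": 4,                  # e.g. tablets, laptops, phones, projector
    "software": 5,
    "content": 4,
    "teacher training": 3,
    "monitoring & evaluation": 3,
}

# Every layer choice multiplies the others: 3 * 4 * 5 * 4 * 3 * 3
print(prod(options_per_layer.values()))  # 2160 distinct combinations
```

Even with these modest counts, you end up with thousands of candidate designs, which is exactly why no single “best practice” survives contact with a new context.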
Remember that an accessible, easy-to-use solution is always better than a more sophisticated one. This may sound counterintuitive, but I’ve seen countless eLearning platforms with advanced features that would be great if they didn’t intimidate teachers so much. An intimidated teacher locks EdTech away from her students. And if students aren’t using the solution, then all the advanced features in the world don’t mean squat.
EdTech, like learning itself, is beautifully complex and personal. There is no generic one-size-fits-all solution. Discover what works best for your context by thinking deeply about each of these component layers and testing different combinations.
Case Study
Very rarely does anyone try to tackle all the layers in this cake. In my career, I was fortunate to come close: delivering an EdTech literacy solution to children and youth in a refugee camp in northern Kenya. The Norwegian Refugee Council’s RFP called for a scalable, rapid-response EdTech solution for education in emergencies.
The entire process was a testament to what amazing things can happen when true collaboration takes place and all stakeholders are on board to iterate, adjust and pivot.
First, joint consultations were held on:
- Where (geographically) the intervention could be tested
- Existing infrastructure and digital literacy there
- Who would be the best candidates for intervention
- What their learning needs and aspirations were
- What scale they aspired to reach
- What budget there was for infrastructure, hardware, software development, content, teacher training, and M&E.
The first rapid iteration of the intervention to be tested was the literacy methodology alone (an offline version). Among the options were Reading to Learn, PRIMR, Tusome, and Literacy Boost. Reading to Learn showed the most promise for this context.
The second important iteration was the inclusion of localized stories (content): SIL, a language development and translation organization, conducted linguistic fieldwork to write stories (in English and Somali) that were culturally and contextually relevant for the readers.
Once the stories were illustrated (by artists from all over Africa), they were again reviewed for cultural and contextual appropriateness. Surprising learnings at this stage included how many learners and teachers insisted that the cows “looked wrong” and had issues with what color clothes the children were wearing.
The stories and methodology were then translated onto a digital platform built by eLimu, called Hadithi Hadithi. The metrics and feedback gathered during implementation meant the stories were adjusted every few weeks. A human-centered design (HCD) workshop was held with teachers to determine many of the user experience and interface aspects of the application.
BRCK loaded the platform onto its Kio Kits (a content server plus tablets). It was hard to iterate on the hardware itself, but one major software change made midway through the program was giving teachers the ability to “lock” all the tablet screens when they felt the students were not “on the same page”.
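I don’t have access to BRCK’s actual code, but a minimal sketch of how such a feature might work, assuming the content server holds a shared lock flag that each tablet checks before rendering, could look like this (class and method names are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class KitServer:
    """Stands in for the Kio Kit content server on the classroom network."""
    locked: bool = False

    def set_locked(self, value: bool) -> None:
        # Toggled from the teacher's control UI.
        self.locked = value

@dataclass
class Tablet:
    server: KitServer

    def render(self) -> str:
        # Each tablet consults the shared state before drawing its screen.
        if self.server.locked:
            return "Screen locked: eyes on the teacher"
        return "Hadithi Hadithi lesson content"

server = KitServer()
tablets = [Tablet(server) for _ in range(5)]
server.set_locked(True)  # the class is not "on the same page"
print([t.render() for t in tablets])
```

The design point, whatever the real implementation, is that control lives with the teacher, not the device: a small affordance that directly addresses the “intimidated teacher” problem above.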
The Norwegian Refugee Council installed a solar power solution at the centers to power the Kio Kits.
The Lutheran World Federation (LWF) has deep experience with the schools in the refugee camp, so they managed the teacher training, support, and mentorship. LWF was key not only in training the teachers on appropriate incorporation of technology into their lessons, but also in constantly consulting with the teachers on what changes needed to be made to the intervention’s design and delivery.
Results
There were large improvements in both English and Somali fluency, at both basic and higher levels. These included:
Basic-level English in YEP centers: English letter-sound fluency scores more than tripled, from 14.6 to 48.9 (a factor of about 3.3), against zero improvement in the control group.
Higher-level Somali in YEP centers: Reading fluency in Somali improved roughly four times faster in project groups (a gain of 20.2 points) than in control groups (4.9 points).
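For readers who want to see where those multiples come from, the arithmetic on the reported scores is straightforward:

```python
# English letter-sound fluency in YEP centers (scores from the evaluation above)
baseline, endline = 14.6, 48.9
print(f"English gain factor: {endline / baseline:.2f}")  # ~3.35, i.e. more than tripled

# Somali reading fluency gains, project vs. control
project_gain, control_gain = 20.2, 4.9
print(f"Somali relative improvement: {project_gain / control_gain:.2f}")  # ~4.12, i.e. ~4x faster
```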
If you’re not familiar with literacy interventions and testing, these are comparatively huge gains!
The entire project, from inception to final evaluation, shows what can be achieved through a truly collaborative design and delivery process.