Re-evaluating EGRA in Laos

Pencils of Promise · Transparency Talk · May 4, 2018

By: Jonathan Tan, Impact Manager, Data Analytics & Visualization

In program evaluation, using the right measurement tool is critical. For example, you wouldn’t measure the distance between cities with a tape measure. Nor would you measure your height using a GPS. There has to be an alignment between what you’re measuring and what your chosen tool was designed to measure.

So it is with measuring literacy. At Pencils of Promise (PoP), we’re big fans of the Early Grade Reading Assessment (EGRA), a USAID-developed tool that we use to assess the development of key literacy skills of our students. The assessment is divided into three distinct literacy skill levels: Phonemic Awareness, Passage Reading Fluency and Passage Comprehension. Thus far, we’ve used the EGRA in evaluating our Teacher Support programs in all three of our partner countries: Ghana, Guatemala and Laos. This might be perfectly adequate if the linguistic context and the range of skills we expect to see are similar across all three countries. But they aren’t.

In a blogpost from last December, Christopher Stanfill, our Director of Learning & Evaluation, wrote about the importance of internalizing the local context when designing a literacy program that is culturally relevant and beneficiary-centered. And in discussing our results from the 2016–2017 Teacher Support program in Laos, we noted that the linguistic context there is substantially different from that in our other countries. Specifically, while our literacy instruction in Ghana and Guatemala focuses on the students' national language (English and Spanish, respectively), English holds no comparable status in Laos.

EFL vs. ELF

As a result, our Laos program is best described as literacy instruction in an English as a Foreign Language (EFL) context, rather than English as a lingua franca (ELF). In fact, the Primary 3 students that we serve in Laos are often seeing English for the first time. That has significant implications for what literacy skills we expect to develop across one year of Teacher Support programming.

We’ve previously tried to account for this by using a significantly modified form of the EGRA in Laos. Rather than having equally sized sections assessing a student’s proficiency in navigating letters, words and sentences, we added subtasks to provide additional information on the specific skills our students would need at the very early stages of acquisition.

Slide from an internal presentation detailing the differences in assessment structure between Laos and Ghana/Guatemala

Even with those additions, we observed students frequently scoring zero points on many of the subtasks. On the more advanced subtasks (Reading and Writing Sentences), practically every student scored zero points at the end of the year. In response, we started looking at zero reduction: the change in the percentage of students who score zero points on any given subtask between the start and end of the year.
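To make that metric concrete, here is a minimal sketch of how it might be computed in Python. The data layout, column names and subtask name are hypothetical illustrations rather than our actual schema:

```python
import pandas as pd

def zero_reduction(df: pd.DataFrame, subtask: str) -> float:
    """Percentage-point drop in students scoring zero on `subtask`
    between the baseline and endline assessment rounds.

    Assumes (hypothetically) one row per student per round, with a
    'round' column ('baseline' or 'endline') and one column of raw
    scores per subtask.
    """
    pct_zero = df.groupby("round")[subtask].apply(
        lambda scores: (scores == 0).mean() * 100
    )
    # A positive result means fewer students scored zero at year end.
    return pct_zero["baseline"] - pct_zero["endline"]

# Toy example: 3 of 4 students score zero at baseline, 1 of 4 at
# endline, for a zero reduction of 50 percentage points.
scores = pd.DataFrame({
    "round": ["baseline"] * 4 + ["endline"] * 4,
    "reading_sentences": [0, 0, 0, 2, 0, 1, 3, 2],
})
print(zero_reduction(scores, "reading_sentences"))  # 50.0
```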

This year, we took a step back. We noticed a growing gulf between what the EGRA was designed to assess and what students in our Laos program could reliably achieve, not just over one year of programming but within what is, for most of them, their first year of English instruction. We concluded that we needed a way to measure what students were learning at the very first steps of language acquisition.

A new test

Over the last few months, we’ve been researching assessment options specifically tailored to an EFL context and designed to assess students picking up a new language at the primary school age range.

Specifically, we conducted a detailed survey of assessment tools against five criteria:

  • They are appropriate for the primary school age range (ages 6–12 at minimum).
  • They can be used to evaluate English literacy.
  • They can be adapted to align with the local curriculum.
  • They have well-defined, agreed-upon proficiency benchmarks.
  • They can be deployed on tablets or mobile devices.

At long last, we think we’ve found a promising option: the ASER assessment.

Meet ASER

First conducted in 2005 by Pratham, an Indian education nonprofit, the Annual Status of Education Report (ASER) is an annual survey designed to provide simple but rigorous estimates of basic literacy levels in children. Its reading assessment was built to be simple and accessible enough for parents to administer independently in households across India, reaching children both in and out of school.

A sample ASER assessment. (image via asercentre.org)

Children participating in the ASER assessment are assigned a fluency level based on the most advanced task they can reasonably complete: beginner level, letter level, word level, paragraph level or story level.
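As an illustration of that scoring rule, here’s a small sketch in Python. The level ordering follows the list above, but the pass/fail data structure is our own hypothetical, not part of the ASER toolkit:

```python
# ASER-style fluency levels, ordered from most to least advanced.
LEVELS = ["story", "paragraph", "word", "letter"]

def assign_level(passed: dict[str, bool]) -> str:
    """Return the most advanced level whose task the child completed.

    `passed` maps task names to completion, e.g.
    {"letter": True, "word": True, "paragraph": False, "story": False}.
    Children who complete no task are classified as 'beginner'.
    """
    for level in LEVELS:
        if passed.get(level, False):
            return level
    return "beginner"

print(assign_level({"letter": True, "word": True}))  # word
print(assign_level({}))                               # beginner
```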

One key deviation from the EGRA is that the ASER assessment is untimed. While research on timing in assessments is mixed, some educators believe the pressure of reading against the clock can discourage students, particularly slower readers who rely on contextual clues for comprehension. And even when the timing constraint is not explicitly stated to students, removing that limit potentially gives students with lower levels of fluency the same opportunity to answer questions in full. It’s an important difference we’re looking to explore further as we field-test the ASER assessment.

Other adaptations

Another reason we’re excited about ASER is the global ecosystem of organizations using and adapting it. Like us, many of these organizations have found the ASER assessment easily adaptable and scalable in country- and language-specific contexts:

  • ASER variations in India, Pakistan and Nepal
  • Uwezo in Kenya, Uganda and Tanzania
  • Beekunko in Mali
  • and many others listed across the PAL Network.

To us, this means a few things. It means there’s an existing body of research on ASER that we can rely on, including guidelines on how to design and administer the assessment, as well as how to analyze the results. It also means we’ll have partners in learning every step of the way with whom we can address common challenges and questions.

More to come

We’re excited about the opportunity to develop a new assessment in-house, and about the new perspective it will bring to evaluating our Teacher Support program in Laos. In the next few months, we’ll continue to develop an ASER-based assessment that is aligned with the local curriculum and grade-appropriate for our students in Laos. We plan to start field-testing in May, and ultimately have a working version ready for deployment in time for the 2018–2019 school year.

Stay tuned for updates on our experience creating a new assessment, our progress with field-testing and what we’ve learned along the way.

Correction: May 8, 2018
An earlier version of this blogpost described English in Ghana and Spanish in Guatemala as second languages. They are more accurately characterized as lingua francas — languages used to bridge groups of various vernaculars or native dialects.
