MEGRA: contextualizing our literacy assessment in Guatemala

Published in Transparency Talk by Pencils of Promise
Apr 17, 2019 · 9 min read

By: Julia Carvalho, Manager, Research & Development, and Vivi Perez, Manager, Learning & Evaluation in Guatemala

Pencils of Promise (PoP) has been working in Guatemala since 2011, and since then the organization has built over 200 schools, trained over 280 teachers and impacted almost 40,000 students. Ensuring the positive impact of this growing operation requires PoP’s work to be thoroughly monitored and evaluated. To do that, PoP’s Learning & Evaluation (L&E) team visits our schools on a weekly basis and uses data collection tools that inform our efforts and adaptations. One of these tools is a student assessment called the Early Grade Reading Assessment, or EGRA.

Photo credit: Nick Onken

PoP and the EGRA

The EGRA is a USAID-developed tool that we use to assess the development of key literacy skills of our students. The assessment is divided into subtasks that target distinct literacy skills: Phonemic Awareness, Passage Reading Fluency and Passage Comprehension, some of the key skills needed to become literate. PoP has used the EGRA to evaluate changes in student performance as a product of the Teacher Support (TS) program in all three countries where PoP operates: Ghana, Guatemala and Laos.

In Guatemala specifically, PoP started implementing the EGRA in 2014. Over the years, many adjustments were made to the tool so that it included more contextually appropriate content for Guatemalan communities; nevertheless, a version even more aligned with the Guatemalan context became necessary. As PoP’s TS program content was being revamped to better reflect the approaches used in the Guatemalan curriculum, the L&E team needed to make sure that the literacy assessment was accurately capturing student performance.

Desire for an Improved Assessment

PoP strives to ensure our programs and evaluations reflect the unique characteristics of Ghana, Guatemala and Laos; recently, for example, PoP Laos replaced the EGRA tool with another assessment considered much more aligned with the national curriculum. As Jonathan Tan, a member of PoP’s Graphite: Impact Council, highlighted in a Transparency Talk piece from May 2018, using the right measurement tool is critical in the field of program evaluation.

Since 2017, PoP Guatemala’s TS team has worked to ensure every component of the program is aligned with the National Base Curriculum (CNB). This included an audit of the teaching techniques shared with teachers and a check that the books provided to students were high quality and age appropriate. PoP’s extensive efforts in fine-tuning and improving the TS program meant that its evaluation tools also needed to be adjusted.

In order to properly evaluate the program, PoP’s literacy assessment tool for Guatemala needed to be more contextualized to the Spanish language and the CNB. So, we set out to modify it.

Photo credit: Nick Onken

The Modification

This entire process started with a simple question: are there alternative literacy assessments for primary-aged students that, compared to the standard EGRA, better match Guatemalan contexts and curriculum requirements? After thorough research and internal discussions, the final answer was: no.

However, during this research, the L&E team came across an exercise closely related to the new task at hand. In 2014, specialists from the Guatemalan Ministry of Education (MINEDUC), in partnership with USAID, developed two EGRA-based tests fully adapted to the country’s context: the ELGI (an EGRA adaptation to Spanish) and the EESNAJ (an EGRA adaptation to K’iche’, one of Guatemala’s many languages). PoP was not able to use either of these tests because the Guatemalan government holds the rights to their distribution and administration. However, counterparts within the MINEDUC were accommodating of PoP’s request to learn more about the creation of these tests and provided the L&E team with extensive documentation. The entire process that these specialists went through was thoroughly recorded in a technical report that served as the foundation for our own modification efforts.

Research & Development

Using the ELGI and its documentation as the starting point, PoP’s L&E team set out to create our own version of the EGRA. Over a four-month period, L&E teams in Guatemala and NYC conducted further research and consulted specialists from the MINEDUC and USAID with the goal of aligning PoP’s literacy evaluation tool with TS programming and the CNB. The team extensively reviewed PoP’s existing evaluations; curriculum standards for the abilities students should develop throughout primary education; literature on the Spanish language, including grammar and letter and phoneme frequencies; and the TS content reinforced in workshops and coaching sessions.

In this process, the key objective was to re-evaluate our tools and identify the literacy abilities that were not being measured by the existing assessment tool (see Table 1 for examples of literacy abilities students should develop to become literate across various grade groups). After completing this task, we conducted additional research on the best ways to measure these abilities and to develop a test applicable to schools receiving PoP TS.

Table 1: Key elements for the learning of reading and writing, from the USAID-MINEDUC guide.

Using CNB documentation, we created a diagnostic to determine which reading and writing content the current EGRA was evaluating in each primary grade, as defined by the Guatemalan curriculum. Drawing on Spanish-language literature, we used the Sandoval phoneme frequency table to determine which letters and sounds should appear more often in the test’s phoneme awareness sections. Furthermore, the curriculum-based evaluation (EBC) provided us with contextualized stories that could be used for our reading and oral comprehension sections. Analyzing each element enabled the L&E team to source information for the development of a finely tuned assessment, and these new methods and externally sourced details were continually mapped onto existing TS program approaches to ensure consistency.
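To make the frequency-weighting step concrete, here is a minimal sketch of how letter items could be allocated in proportion to phoneme frequency. The frequency values, function name and item total below are illustrative assumptions only; they are not the actual Sandoval table values or PoP’s item counts.

```python
# Illustrative sketch: allocating letter-identification items in proportion
# to phoneme frequency. The frequencies below are placeholder values,
# NOT the actual Sandoval table.
phoneme_freq = {"a": 12.5, "e": 13.7, "o": 8.7, "s": 7.9,
                "n": 6.7, "r": 6.6, "d": 5.9}

def allocate_items(freq: dict, total_items: int) -> dict:
    """Distribute letter slots roughly proportionally to frequency,
    guaranteeing each letter appears at least once. Because of rounding,
    the final total may differ slightly from total_items."""
    total_freq = sum(freq.values())
    return {letter: max(1, round(total_items * f / total_freq))
            for letter, f in freq.items()}

print(allocate_items(phoneme_freq, 100))
# Frequent letters like "e" and "a" receive the most slots on the grid.
```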

After this in-depth process, the results revealed which reading skills needed to be measured and which specific sections needed to be included in the new test.

The New Test

Since the original EGRA served as the foundation for the new test and modifications were made for improved alignment, it was determined that PoP’s redesigned assessment would be named the MEGRA (Modified EGRA). The following table is a comparison of the formerly used EGRA and PoP’s newly developed MEGRA.

Table 2: EGRA vs. MEGRA

Description of subtasks:

  • Letter Name Recognition: Student is asked to correctly identify the name of each letter of the alphabet.
  • Phoneme Awareness: Initial Sound Identification: Student is presented with a word orally and asked to isolate and pronounce only the first sound of the word.
  • Phoneme Awareness: First Syllable Recognition: Student is presented with a word orally and asked to pronounce its first component phonemes (or syllable).
  • Familiar Word Identification: Student is presented with a list of words they have likely encountered before and asked to read them.
  • Oral Reading Fluency: Student is asked to read a short passage and is assessed in terms of speed and accuracy. This subtask is timed; a scoring sketch follows this list.
  • Reading Comprehension: Measures a student’s comprehension based on the passage that was read aloud for the oral reading fluency subtask. After student reads the passage aloud, they are asked 10 comprehension questions — literal, inferential and critical — that can be answered only by having read the passage.
  • Writing: Students are asked to listen to a short single sentence and then write it down. The subtask measures students’ alphabet knowledge and ability to spell (encode) words correctly and to use correct sentence-writing conventions such as capital letters, punctuation, directionality, and spacing.
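The post does not spell out PoP’s exact scoring rule for the timed fluency subtask, but EGRA-style fluency is conventionally reported as correct words per minute. The sketch below shows that conventional calculation only; the function name and example numbers are assumptions for illustration.

```python
def words_correct_per_minute(words_attempted: int, errors: int,
                             seconds_elapsed: int) -> float:
    """Conventional EGRA-style oral reading fluency score.
    If the student finishes the passage early, seconds_elapsed is the
    actual time used rather than the full time limit."""
    correct = words_attempted - errors
    return correct * 60 / seconds_elapsed

# Example: 45 words attempted, 5 errors, passage finished in 50 seconds
print(words_correct_per_minute(45, 5, 50))  # 48.0 correct words per minute
```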

Added subtasks:

  • Unfamiliar Word Recognition: This nonword reading subtask presents students with a written list of pseudowords that follow the phonological and spelling rules of the language but are not actual words. This provides indirect insight into students’ ability to decode phonemes and read unfamiliar words; a generation sketch follows this list.
  • Oral comprehension: Assessors read to the student a short story on a familiar topic and then ask five comprehension questions about what they heard. The listening comprehension subtask is used primarily in juxtaposition with the reading comprehension subtask (see Comprehension section above) in order to test whether comprehension difficulties stem primarily from low reading skills or from low overall language comprehension.
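As one illustration of how such pseudowords can be produced, here is a minimal sketch that builds Spanish-like nonwords from legal consonant-vowel syllables and rejects real words. The syllable inventory and the tiny real-word filter are placeholder assumptions, not the actual materials behind the MEGRA.

```python
import itertools
import random

# Placeholder inventory of legal Spanish consonant-vowel (CV) syllables
consonants = ["b", "d", "f", "l", "m", "n", "p", "s", "t"]
vowels = ["a", "e", "i", "o", "u"]
syllables = [c + v for c, v in itertools.product(consonants, vowels)]

# Tiny stand-in for a real Spanish lexicon used to reject actual words
real_words = {"mesa", "dedo", "dado", "pato", "luna", "sopa", "nada"}

def make_pseudowords(n: int, n_syllables: int = 2, seed: int = 0) -> list:
    """Sample CV-syllable combinations, keeping only nonwords."""
    rng = random.Random(seed)
    words = set()
    while len(words) < n:
        candidate = "".join(rng.choice(syllables) for _ in range(n_syllables))
        if candidate not in real_words:
            words.add(candidate)
    return sorted(words)

print(make_pseudowords(10))
```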

Removed subtask:

  • Orientation to Print: This subtask was deemed too basic. According to the CNB, the abilities in this section are taught specifically in Pre-K, first and second grade. Since PoP works in first through sixth grade and wanted a single test for all grades, this section would often be too easy and uninformative to include in a time-sensitive assessment with primary students. Similar abilities are evaluated by other subtasks (e.g. directionality is also evaluated in the writing subtask).

Other Relevant Changes

In previous years, PoP chose not to allow students to refer back to the reading passage when answering the questions in the reading comprehension section. For the EGRA modification, PoP’s L&E team saw an opportunity to refine how reading comprehension would be defined. Does allowing students to refer back to the story measure comprehension more effectively than removing the passage, which risks conflating recall and comprehension? After an analysis, we concluded that it does. The focus needed to be on what the student comprehended from the story rather than on recalling specific details, so it was decided that students can refer back to the passage while the questions are asked. To reinforce the measurement of reading comprehension, we included questions at the three levels of reading comprehension: literal, inferential and critical.

Photo credit: Nick Onken

Piloting the New Test

Two pilot tests were implemented with students from first to sixth grade. During four days of pilot testing, 24 students were assessed, representing the following language groups present in the communities served by PoP: bilingual Mayan, bilingual Spanish and monolingual Spanish. Based on the piloting results, the following edits were made to the test:

  • Initially, the lowercase letter “l” and the uppercase letter “I” appeared consecutively, with the lowercase “l” first. We noticed that students mistook the lowercase “l” for an uppercase “I”, only later realizing it was an “l”. We therefore changed the order in which the letters appear and ensured the two are not placed next to one another.
  • During the pilot, doubts arose regarding the use of the word “dedo” (finger in Spanish). In Spanish, the name of the letter “d” (“de”) sounds the same as the first syllable of “dedo”, making it hard to tell whether students were isolating the initial sound of “d” or simply saying the letter name or first syllable. We decided to replace the word with “dado” (dice).
  • During the pilot, it became clear that students from one of the language groups had difficulty answering one of the inferential comprehension questions. To address this, we slightly modified the passage by changing one of the foods mentioned (worms to rice) to match a common understanding within that language group.

Lessons Learned & Next Steps

After the completion of a rigorous and intensive modification process, the MEGRA was ready to be implemented for baseline data collection at the beginning of the 2019 Guatemalan school year. PoP’s L&E team worked hard to modify the literacy assessment and learned valuable lessons along the way.

This process reinforced the importance of contextualizing evaluation tools, as well as of piloting tools before using them. The research produced, the conversations with specialists and the takeaways from piloting all highlighted the uniqueness of the communities where PoP works and the need to ensure that PoP’s programs and evaluations are tailored to their specific needs.

This process also reinforced the importance of teamwork and alliances. This adaptation was a cross-country, cross-departmental effort, and its success rests on the collaboration that went into its production. It was an incredible endeavor, and PoP’s teams believe the resulting programs and evaluations are better for it.

