This Is a Boring Post About Student Growth Scores on Standardized Tests.

I am only writing it to set the record straight.

Elizabeth Lyon-Ballay
Orchestrating Change
4 min read · Jun 27, 2019


If you want to read about corruption, violence, and dark money, skip this post. If you want plain-language, big-picture blogging about the state of Arkansas education policy, you’ll be disappointed here. However, if you want to know how Henderson Middle School (HMS) students in the Little Rock School District (LRSD) scored so high for “growth” on their NWEA Map Growth tests this year, you’ve come to the right place!

Randy & Diane Zook, with Gary Newton of Arkansas Learns

Arkansas Learns, the “Voice of Business” in Arkansas education lobbying, is run by Gary Newton — nephew of State Board of Education chairperson Diane Zook, and nephew-in-law of State Chamber of Commerce president Randy Zook. Gary Newton recently congratulated HMS principal Yaa Appiah-McNulty (who will not be returning to HMS next year) for her students’ success on the NWEA MAP Growth tests.

Several LRSD teachers responded to these statistics by pointing out that NWEA scores do not count toward a school’s “grade” under the state framework for measuring success. The only test Arkansas actually cares about — the ACT Aspire — has not been scored yet. It is the ACT Aspire scores that will tell us whether LRSD has earned its freedom from state control.

Faye Hanson

Faye Hanson, one of LRSD’s national board certified teachers, shared a particularly detailed analysis of NWEA tests and how their scoring is meant to be used.

Ms. Hanson, who spent five years administering NWEA tests in New Hampshire prior to her participation in NWEA testing in LRSD this year, explained:

LRSD instructional facilitators and administrators have been training on [NWEA score] interpretation and appropriate use all year. For instance, [NWEA scores] are meant to be used for individual growth and goal setting — not for comparison of students or aggregation for school comparisons.

In fact [gifted and talented] specialists were specifically told to not use NWEA percentile scores on identification profiles because it was not the purpose of the test.

While adjusted for individual students during testing, the standard error of measurement (SEM) for students performing at the high and low ends is quite large — around ±6 points. That means my gifted students might not reach or exceed their RIT goal, yet their scores still fall within the SEM range. This is also true of students performing well below grade level.

RIT score predictions for each student improve with each successive testing, with good reliability occurring after three administrations, once the algorithm has enough data to make a good prediction. LRSD student growth data will be much more useful after the fall 2019 administration, when fall-to-fall growth can be seen.
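Ms. Hanson’s point about the SEM is worth a quick back-of-the-envelope illustration. The scores and goal below are hypothetical — only the ±6-point SEM figure comes from her explanation:

```python
# Illustrative only: hypothetical RIT scores, using the roughly +/- 6-point
# SEM Ms. Hanson describes for students at the high and low ends.

SEM = 6  # standard error of measurement, in RIT points

def within_sem_of_goal(observed_score, goal_score, sem=SEM):
    """A student's observed score that lands within one SEM of the goal
    may still reflect goal-level performance, even if it falls short."""
    return observed_score >= goal_score - sem

# A gifted student with a RIT goal of 240 who scores 235 "misses" the
# goal, but 235 is inside the +/- 6-point band around 240.
print(within_sem_of_goal(235, 240))  # True: within the SEM range
print(within_sem_of_goal(233, 240))  # False: more than 6 points short
```

In other words, a handful of points up or down at the extremes can be measurement noise, not real growth — which is exactly why aggregating these scores for school comparisons is dubious.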

A few teachers (and I) also pointed out that Principal Yaa Appiah-McNulty hadn’t taught those kids herself. Rather, it was the hard-working HMS teachers and good curriculum guidance that made the difference.

Ruth Hooper

One of Henderson’s own teachers, Ruth Hooper, shared her strategy for improving literacy at HMS this year. She posted, “I put my curriculum aside for a week or two to focus on literacy. I hope it helped.”

I am sure it did help! But at a school-wide level, I think Jeff Grimmett’s explanation makes the most sense. Mr. Grimmett was Henderson’s literacy facilitator this past year, and helped coordinate NWEA testing for the entire school.

Jeff Grimmett explained the students’ “growth” quite simply. Apparently, when LRSD teachers were trained to administer NWEA tests, they were told students needed only 45 minutes to complete each section. In the fall of 2018, HMS teachers gave students a 45-minute window for each section of the test. Almost none of the students finished in that window; many had to return to testing over several days to complete it, and some never finished at all.

After that fiasco, HMS developed a special schedule that gave students longer testing times for the rest of the year. Not all schools did so. Therefore, HMS students had a test-taking advantage as the year went by — showing “growth” in their scores by the end of the year.

But what did we really measure? Not, as Gary Newton suggested in his post, the quality of education received by HMS students. Rather, we measured the willingness of HMS teachers to trust their own experience and make independent scheduling decisions, instead of blindly obeying the inadequate “training” they had initially received.

Listen to the teachers. Standardized tests measure test-taking ability — or growth in test-taking ability, if they’re really sophisticated. They don’t teach kids anything, but they do waste a lot of valuable instructional time.



Former professional violinist and public charter school teacher. Current stay-at-home mom and agitator for change.