We differentiate for students; why not for schools?
ANet recently completed a randomized controlled trial through the Department of Education’s i3 program. The trial helped us learn more about how the unique combination of tools and coaching we provide to our partner schools can help them build strong practices and support deeper student learning. We learned a great deal from this study, and it is shaping the way we work. Here, though, we’re going to focus on just one lesson.
Our big headline, if you will, is this: Schools deserve differentiation the same way students do. In our study, schools that partnered with ANet and had the right “readiness” conditions in place significantly outperformed their matched pairs in student achievement gains in both math and ELA. This outperformance translated to an additional four to seven months of learning over the two-year study period for students in those schools.
Students in schools with the right conditions in place achieved four to seven months of additional learning.
Before the i3 study began, we asked schools interested in participating to complete a questionnaire to gauge their readiness to partner with ANet. We were deliberate about this: we didn’t want simply to identify schools with especially skilled educators, or “higher capacity” schools as the literature often calls them. Identifying those schools would not serve our mission to help all children. Rather, we wanted to know what conditions needed to be in place for schools to benefit most from our support.
Specifically, we asked: 1) whether integrating standards and data more fully into teachers’ planning was part of a clear, manageable set of priorities in each school, 2) whether the school had set aside time for its teachers to make meaning of the data and take action together, 3) whether the school had established a leadership team to guide and lead the work, and 4) whether the district supported this priority and was aligned on the school’s need for the partnership with us.
Many of these conditions probably seem intuitive. Yet schools are often asked to take on the use of assessments when these conditions are not in place. Take that first condition as an example: how many times have interim assessments been added as number four or five on an already-too-lengthy list of priorities at a school? How many times have they been added to a school’s to-do list without asking teachers and school leaders whether they fit their priorities?
For us, the finding that schools with certain conditions benefit most from our support confirmed a truth we all know but don’t always live by: there are no silver bullets. Providing each school with the same thing won’t get the same results everywhere, and we shouldn’t expect or require the same changes in all schools. As we’ve worked alongside schools since our founding, we’ve seen that each school is different in terms of people, priorities, and place in time. And we’ve got to differentiate our support for schools, just the way educators do for their students.
There are no silver bullets. Every school is different in terms of people, priorities, and place in time. We’ve got to differentiate our support for schools — the way educators do for students.
These findings from our i3 trial have galvanized us as an organization to get clearer internally, and with our system partners, about WHEN a school is best positioned to focus on this work and HOW we all build the conditions to help schools improve. The first thing we are doing is taking more time to understand each school before we partner. This means observing classroom teaching. It means meeting with both leaders and teachers to understand their priorities, past successes, and areas for growth. It means ensuring the school understands what our partnership entails, from a time and people standpoint as well as culturally. We have started using a set series of questions to help us do an even better job of understanding each school, aligning with the school on what we see, and determining how we can best help the school given its context.
We are also more focused than ever on giving each school a partnership that aligns with its priorities. We hear all the time that effective interventions should be implemented in all schools at once: if it works, the logic goes, replicate and scale it. This impulse is often well-intentioned: “We believe in equality,” many of our partners say to us. “Every school should be working with ANet.” But we’ve learned that schools don’t need equality — they need equity. Equality means giving every school the same thing. Equity means giving each school what it needs.
For us, this has come to mean that the first year of our partnership might focus on planning from standards and putting the right PD in place for teachers to internalize the Common Core, rather than on the use of our interim assessments. In fact, at the beginning of our i3 trial, upwards of 90% of our coaching time with school leaders was focused on the use of interim assessment data. Today, we differentiate our coaching much more to meet each school’s needs: only about 40% of our time is focused on helping partners use data and assessments, while 60% is focused on helping them build expertise in the standards, set priorities, support teacher development, and address other needs specific to the school’s context.
At the beginning of the trial, 90% of our time was focused on helping schools use data. Today, we differentiate more based on schools’ needs, and only 40% of our time is focused on data, while 60% is focused on building expertise in the standards, setting priorities, supporting teacher development, and other school-specific needs.
Finally, we are working to build the conditions for success where they aren’t in place. These findings do not mean we will work only with schools that already have every condition in place. If we did that, it would feel like we were “creaming.” It would also undercut our conviction that helping many schools isn’t enough: we want to help all schools. So rather than partner only with schools that start with the right conditions, we are beginning to work with school leaders and system staff to put these conditions in place where they are missing. Sometimes this means figuring out how to set aside consistent collaborative time so educators can focus on this work together. Sometimes it means working with system and school-based staff to ensure alignment between curriculum, instruction, and assessment.
We still have a lot to learn here, and we will keep studying this closely alongside our school partners: will our efforts to help establish these conditions lead to better results when we deliver our full partnership? Or will we learn that these conditions are sometimes better developed by schools, systems, or other partners without ANet? And if our data show circumstances where others are better suited to put the conditions in place, can we know enough about what’s available to recommend the right strategies?
Too often, efforts to implement promising practices emphasize supervision of process over purposeful implementation. None of us wants to see schools miss out on promising ideas that we know can help students. But when we take a one-size-fits-all approach to school support, without regard to a school’s priorities or context, we end up with process over purpose. The findings from our i3 trial are helping us communicate better with our partners about the conditions under which purpose can win out over process.