As my colleague Heather Yuhaniak (@equitywarrior on Twitter) recently stated, in a discussion about how depressing it is to look at education policy from a systems level,
There is a “disconnect between our stated ideals and our practices.”
We certainly claim that improving the state of education in our country is a priority, but that claim does not play out in our policies. We implement new concepts (Curriculum 2.0, Ready for the Core) without even waiting to see the results of the first intervention. We ignore what the research tells us and continue doing more of the same (we know desegregation works, and yet schools remain extremely segregated). The trend is one of policy failing to support interventions, no matter how effective, so naturally, when I look at policy, I feel frustrated. My desire to stay optimistic demands that I look for exceptions in our country.
So… where are they?
- In a country in which our ideal is equality, why are schools still so segregated?
- Why are there not reforms to the academic calendar given what we know about the effects of non-school and summer factors?
- Why is multicultural education not a standard?
- Why aren’t students given equal access to high quality education?
- Why do the highest needs schools have the least access to the highest quality teachers (Glazerman et al., 2013)?
When I recently read Jon Fullerton’s (2015) essay on entrepreneurship and the evaluation of school interventions, I found myself feeling a little cynical about the plausibility of developing a full-scale cooperative evaluation infrastructure. Fullerton himself acknowledges that setting up such an infrastructure would not be easy, but I agree with him that the benefits would be great for all stakeholders: school leaders and teachers would gain easy access to valuable evaluation data; policy makers could demonstrate that there is, in fact, quality control over programs and interventions entering schools (and could be protected from making selections arbitrarily, or from being held accountable for them); and entrepreneurs could benefit from regular, direct evidence on what is or is not working with their products (Fullerton, 2015). Further, funds in education are limited, and as Andreas Schleicher (2012) points out in his TED talk, how those resources are spent matters. There is no denying that effectively captured and analyzed data is critical for understanding and dissecting student achievement (Achievement Network, 2015).
Schleicher’s (2012) TED talk, in which he touts the virtues of PISA, an internationally administered measure of knowledge and learning, serves as an example of a large-scale concept that was successfully implemented. In the talk, Schleicher asserts that by measuring students’ knowledge and skills directly with PISA, we can establish an international baseline for standards of education (Schleicher, 2012), which is promising. However, I worry that while many instances and stories suggest interventions can make a massive difference, such successes seem to be the exception and not the rule.
Moreover, my own experience trying to make things happen in schools tells me that the difficulty of developing such a network in the first place, and then of maintaining school or district commitment to it, would be a barrier. This view reflects an underlying assumption that even if we can build stakeholder buy-in at the school and district levels, bureaucracy, along with the difficulty of coordinating large-scale cooperation, would prevent actual change from occurring.
My cynicism about how realistic such an infrastructure is, however, is not unfounded; the massive resistance (and the slow, time-consuming paperwork and process) I encountered when first trying to conduct survey research for my own dissertation in my own school district tells me that bureaucratic resistance is a real and problematic obstacle. My understanding of the repetitive, cyclical, and at times seemingly hopeless nature of school reform efforts (Tyack & Cuban, 1995) tells me that my experience is not a one-off and is reflected on a larger, policy-level scale.
I acknowledge the danger that my underlying cynicism could seep into my own work in education by making me less likely to take risks; this is something I realize I have to work through. I think that focusing on those stories (albeit few and far between) that inspire hope might keep me optimistic.
Achievement Network. (2015, October 21). i3 Study Takeaway 2: Data and assessment are critical tools — but they can also be distracting [Web log post]. Retrieved from http://www.achievementnetwork.org/anetblog/2015/10/21/i3-study-takeaway2
Fullerton, J. (2015, June). But does it work? Evaluating the fruits of entrepreneurship. Paper session presented at the meeting of the American Enterprise Institute, Washington, DC.
Glazerman, S., Protik, A., Teh, B. R., Bruch, J., & Max, J. (2013). Transfer incentives for high-performing teachers: Final results from a multisite randomized experiment (Report No. 4269bc8810414c8a8f64d3c36fde8211). Mathematica Policy Research.
Schleicher, A. (2012, July). Use data to build better schools [Video]. TED.
Tyack, D. B., & Cuban, L. (1995). Tinkering toward utopia: A century of public school reform. Harvard University Press.