Preview of New Tool for Schools: Ed Tech Rapid Cycle Evaluation (RCE) Coach

Office of Ed Tech
Oct 26, 2016
Photo credit: CCSD59

Districts and states are spending millions of dollars buying educational apps (technology applications, tools, and platforms), many of which have minimal evidence supporting their effectiveness. Additionally, once a purchase is made, there is often no systematic process for reviewing the effectiveness of ed tech tools before renewing contracts, which collectively can run into the millions of dollars. Thanks to the many participants in the President’s ConnectED Initiative, 20 million more students have access to broadband at school than just three years ago, making it possible for more districts, particularly Future Ready districts, to shift their teaching and learning practices to take advantage of the transformative power of technology. It is fair to say that we have vastly increased our technical capacity to enable high-quality digital learning in our schools. However, our understanding of what works in which contexts has not kept pace.

Recognizing this need, a year ago we called for better methods for evaluating educational apps to give states, districts, and schools more transparency into which technologies are really making a difference for teaching and learning. Making good decisions based on evidence, rather than relying on marketing hype or buzz among a small group of peers, is critical.

This month we are previewing the beta version of our Ed Tech Rapid Cycle Evaluation (RCE) Coach, a free, openly licensed, web-based platform that helps schools and districts make more informed ed tech purchasing decisions. Created in partnership with Mathematica, the RCE Coach guides educators step by step through an ed tech purchasing or renewal process. It starts when a school or district selects an ed tech product or tool that it wants to evaluate. The RCE Coach then provides resources and scaffolding to guide and support practitioners in defining their desired outcomes, designing effective pilots, conducting their own evidence gathering, and analyzing results to make evidence-based purchase and renewal decisions. The goal is to fundamentally change the procurement and implementation process to include a continuous cycle of evidence-based decision making and to help states and districts spend millions of dollars more effectively.

The RCE Coach will help educators determine whether a technology they currently use, or plan to use in the future, achieves its goals: for example, whether it is moving the needle on student achievement or teacher satisfaction. On this basis, districts, schools, and practitioners can better determine whether to continue using a technology that they now have evidence for or to stop using a technology that is not effective.

Using the RCE Coach is not the same as having a researcher work with you, but it is intended to help educators approach evaluations in schools more like researchers do and to implement some of the methods and approaches researchers use.

What’s Available Now

During this preview stage, we are releasing a beta version of a matched comparison group research design tool that helps educators analyze existing administrative data.

What is a matched comparison design? Glad you asked! Many schools would like to engage in rigorous research but find that conducting a randomized trial is impractical. For example, most schools are not in a position to disrupt classroom and teacher assignments to create two or more randomly assigned groups of students, some of whom receive an ed tech intervention and some of whom do not. As a result, many schools give up on trying to do a rigorous study because they are uncertain whether their conclusions can be trusted without randomized treatment and control groups. While random assignment is preferred (and required for randomized controlled trials (RCTs)), there are other rigorous methods of creating comparison groups for research. One of these is called a “matched comparison design.” A matched comparison design attempts to create groups that are as similar as possible from existing groups: for example, students with similar demographics and experiences drawn from several similar schools, with two or three groups where the intervention is occurring and two or three where it is not. It does this by using data on the characteristics of students, teachers, or schools to build groups that are similar on key factors that may be related to outcomes. This allows for rigorous comparisons without reshuffling class rosters.
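
To make the idea concrete, here is a minimal sketch of one common matching approach, greedy nearest-neighbor matching on administrative data. This illustrates the general technique only; it is not the RCE Coach’s actual algorithm, and the column names and synthetic data are assumptions for the example.

```python
import numpy as np
import pandas as pd

# Hypothetical administrative data: one row per student.
# "used_app" marks students whose classrooms piloted the ed tech tool.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "student_id": range(n),
    "used_app": rng.integers(0, 2, n).astype(bool),
    "prior_score": rng.normal(70, 10, n),  # last year's test score
    "frl": rng.integers(0, 2, n),          # free/reduced lunch flag
    "post_score": rng.normal(72, 10, n),   # outcome of interest
})

treated = df[df["used_app"]]
pool = df[~df["used_app"]].copy()

# Greedy nearest-neighbor matching: for each treated student, pick the
# most similar untreated student (exact match on FRL status, closest
# prior score), removing each match from the pool so it is used once.
matches = []
for _, t in treated.iterrows():
    candidates = pool[pool["frl"] == t["frl"]]
    if candidates.empty:
        continue  # no comparable student left; drop this treated student
    j = (candidates["prior_score"] - t["prior_score"]).abs().idxmin()
    matches.append((t["student_id"], j))
    pool = pool.drop(j)

comparison = df.loc[[j for _, j in matches]]
matched_treated = df[df["student_id"].isin([i for i, _ in matches])]

# With matched groups in hand, a simple difference in mean outcomes is
# a first-pass effect estimate.
effect = matched_treated["post_score"].mean() - comparison["post_score"].mean()
print(f"Matched groups of {len(matches)} students each; "
      f"estimated difference in mean scores: {effect:.2f}")
```

A simple difference in means like this is only a starting point; a full evaluation would also check how well the groups balance on the matching variables and apply appropriate statistical tests, which is the kind of scaffolding the RCE Coach is designed to provide.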

The key audience for this first design release will likely be district-level administrators and some school-level administrators with access to this kind of data.

If you have access to this type of data at your district or school and are ready to evaluate a technology you are already using, the Ed Tech RCE Coach can help you now. Start here!

Coming in January

In January we are scheduled to add two additional research designs that will help districts and schools that are either starting pilots of ed tech solutions they have already identified or that have identified issues and are looking to select and pilot solutions. These updates will expand the platform’s capabilities to include forward-looking designs that guide district- and school-level educators through identifying apps, crafting a research question, conducting an effective pilot, and analyzing the results.

If you are piloting a new tool and would like the support of our new RCE Coach and research team, please take a minute to complete this short form.

Insights from the Field

Often achievement data isn’t available until well after typical renewal or purchasing deadlines.

The RCE Coach can provide data more frequently, on timelines better aligned to current budget cycles. “The budget cycle is not school-year based. We make budgetary decisions in winter and February, and if we wait until May to see efficacy data, it’s too late to make a decision based on that data,” said Jessica Peters, Associate Director of Personalized Learning at KIPP DC Headquarters. “A 3-month pilot period with RCE Coach will enable us to evaluate effectiveness before the purchasing cycle starts.”

“Educators need to make informed decisions, and a resource like the RCE Coach can be a really useful tool to inform big decisions,” shared Marsha Jones, former associate superintendent of Springdale Public Schools. “Many decisions we make are expensive in time and money, and we need to be good stewards of those investments. A tool like the RCE Coach can help validate and affirm that we’re making a wise investment, given the limited time and money districts have.”

Peters also recognized the benefit of RCEs for local “buy-in”: “When we share results from an evaluation with our own students, our staff is more invested because it came from within.”

The process also encourages schools to examine more precisely what they want to change and what that change might look like. Emily Tucker, Race to the Top Project Manager at IDEA Public Schools, used an early version of the RCE Coach and says the process helped her team make more disciplined and informed decisions about their ed tech choices. “My recommendation is to not skip or skim any of the up-front part of the process where you’re identifying key questions, because it was a big piece of learning for us. Internally we’re rapidly growing and changing all the time, and we don’t stop to dig in on what are we actually changing and why. The RCE process forced us to be really clear on vision and outcomes,” says Tucker. Using the tool also builds capacity and embeds best practices, such as developing a continuous improvement cycle of evidence. “The hope is that once we get through a few trials, we’ll focus on just constantly being in RCE cycles until we feel like we really have data on efficacy,” continued Peters.

Connections to What Works Clearinghouse and new ESSA Evidence Guidance

These rapid cycle evaluation efforts build on other work from the U.S. Department of Education. The recent Find What Works updates to the What Works Clearinghouse, for example, make it easier to identify which interventions, programs, and policies have improved student outcomes across the approximately 1,000 effectiveness studies registered. Unfortunately, there aren’t enough resources to run full-scale randomized controlled trials on every educational technology in use in schools. Existing approaches are too expensive and time-consuming to be viable for every application, platform, or other tool out there. Thus, there is a pressing need for low-cost, quick-turnaround evaluations to help educators make decisions more quickly and inexpensively.

Different approaches yield different types of effectiveness evidence. In recognition of this, the U.S. Department of Education also recently released guidance, Using Evidence to Strengthen Education Investments, to help states, districts, and schools successfully choose and implement evidence-based activities, strategies, and interventions that improve outcomes for students. This guidance recommends considerations, resources, and criteria for identifying “evidence-based” interventions at each of ESSA’s four evidence levels (strong, moderate, promising, and demonstrates a rationale) and highlights the importance of incorporating a continuous improvement cycle.

Looking to the Future

We will continue to add new evaluation instruments and research designs to the RCE Coach to expand the kinds of applications, tools, and platforms that educators can evaluate. For example, teachers will be able to run small cycles of evaluations in their own classrooms, schools can determine how effective a professional development platform is for their teachers, and districts can learn whether or not a productivity tool lives up to its promises and meets their specific needs.

As we continue to work with states, districts, and schools, we are also building a bank of evidence-based evaluations that will be available to other districts, schools, teachers, and researchers, disseminating knowledge about rapid cycle evaluation best practices across the field and sharing exemplars that can serve as models.

We will also continue to foster partnerships among the educator, ed tech developer, and research communities to better understand implementation contexts, encourage the development of better tools and help districts find better matches for their specific needs.

If you would like to receive an update when the new designs and tools are released, add your name here.

Learn more about our Educational Technology Rapid Cycle Evaluations work here: tech.ed.gov/rce.

The author is the Deputy Director of the Office of Educational Technology.
