How to Use Data to Drive Multi-Tiered Instructional Decisions

Part 3 of our blog series, Multi-Tiered System of Supports for Math Success

McGraw Hill
Inspired Ideas
7 min read · Jan 8, 2020

--

By Bradley Witzel, Ph.D., Winthrop University;

Ben Clarke, Ph.D., University of Oregon;

and Paul J. Riccomini, Ph.D., The Pennsylvania State University

For teachers who are using a Multi-Tiered System of Supports (MTSS) to assist students struggling in math, making progress may in fact mean taking a step back to the beginning. Where were the students at the beginning of the year, the month, the lesson? Where are they now? How far have they come in that time, and how far do they need to go?

In the first two blogs of our three-part series, Multi-Tiered System of Supports for Math Success, we explored MTSS methodology and how intensified adjustments to learning time, content emphasis, teacher-to-student ratio, and instructional approach can help get students on track.

In the final part of this series, we are going to discuss data, and how screening and progress monitoring can be used to make instructional decisions based on the individual needs of students. Read on to learn how data is a crucial component of an effective MTSS strategy.

Data Driven Decisions

Within a multi-tiered system of instructional delivery, there are numerous critical junctures at which data can be collected and used to inform instructional decision-making. While there are myriad ways that schools can use assessment data, we consider three to be foundational to effective MTSS implementation. These primary data uses include:

  • Screening to identify students in need of additional support (Albers & Kettler, 2014)
  • Progress monitoring to examine student growth and response to intervention (Hixson, Christ, & Bruni, 2014)
  • The use of screening data collected from all students at multiple points across the school year (benchmark data) to gauge the overall health of the instructional system (Clarke et al., 2014)

Screening All Students

Screening provides a quick, efficient check of a student’s performance in a content area. Screeners are designed to be efficient, so they are often of short duration, timed, and easy to administer and score (Gersten et al., 2009). In mathematics, screening tools focus on foundational number sense skills in the early elementary grades (Gersten et al., 2012) and more advanced computational procedures and concepts in the later grades (Foegen et al., 2007).

Screening data indicate those students who, in the absence of intervention, may be at risk of not meeting key criteria (e.g., passing a state test) at a later point in time. It is critical for the screening process to be linked to the provision of intervention services. Once screening has identified which students are at risk or off track, and those students receive intervention services, the second critical use of data comes into play: progress monitoring.
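To make the link between screening and intervention concrete, here is a minimal sketch of how a team might flag students whose fall screening scores fall at or below a benchmark cut score. The cut score, measure, and student scores are hypothetical placeholders; in practice they would come from the school's chosen screener and its published benchmarks.

    # Minimal sketch: flag students whose fall screening scores fall at or
    # below a benchmark cut score. The cut score and student data below are
    # hypothetical, not values from any published screener.

    FALL_CUT_SCORE = 20  # assumed benchmark for a timed number-sense screener

    fall_screening = {
        "Student A": 34,
        "Student B": 18,
        "Student C": 25,
        "Student D": 12,
    }

    # Students at or below the cut score are flagged for follow-up and
    # possible Tier 2 intervention services.
    flagged = [name for name, score in fall_screening.items()
               if score <= FALL_CUT_SCORE]

    print("Flagged for intervention follow-up:", flagged)

In a real system, the flagged list would feed directly into scheduling intervention services and beginning progress monitoring for those students.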

Progress Monitoring and Data-Identified Students

Progress monitoring provides a mechanism to gauge a student’s response to intervention. While the progress-monitoring process is complex, the basic procedures involve setting a long-term goal, determining the frequency (e.g., weekly or monthly) of data collection, and collecting and evaluating data using a set of decision rules to determine responsiveness. Measures used in progress monitoring are designed to be repeatable, so that the measure stays constant over time and any change in scores reflects a genuine increase or decrease in the student’s achievement.

In mathematics, progress monitoring can be augmented with program-embedded assessments. Such assessments provide information on whether students are mastering the skills taught in an intervention. That information can be used in conjunction with the broader view of growth provided by progress-monitoring data. For students who fail to make adequate progress, the intervention should be modified to increase the overall intensity of intervention services.
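To illustrate what a decision rule can look like, the sketch below applies a simplified version of one commonly used convention (often called the four-point rule) to weekly progress-monitoring scores: the four most recent data points are compared to the aim line connecting the student’s baseline to the long-term goal. The baseline, goal, and weekly scores are hypothetical, and the specific rule a school uses should come from the guidance that accompanies its progress-monitoring measure.

    # Minimal sketch of a progress-monitoring decision rule (a simplified
    # "four-point rule"). The baseline, goal, and weekly scores below are
    # hypothetical placeholders.

    baseline_score = 10   # score on the measure at week 0
    goal_score = 30       # long-term goal on the same measure
    goal_week = 20        # week by which the goal should be reached

    weekly_scores = [11, 13, 12, 14, 15, 15, 16, 18]  # weeks 1 through 8

    def aim_line(week):
        """Expected score on the line connecting the baseline to the goal."""
        return baseline_score + (goal_score - baseline_score) * week / goal_week

    # Compare the four most recent data points to the aim line.
    recent_weeks = range(len(weekly_scores) - 3, len(weekly_scores) + 1)
    recent_scores = weekly_scores[-4:]
    expected = [aim_line(week) for week in recent_weeks]

    if all(actual < goal for actual, goal in zip(recent_scores, expected)):
        print("Four points below the aim line: modify or intensify the intervention.")
    elif all(actual > goal for actual, goal in zip(recent_scores, expected)):
        print("Four points above the aim line: consider raising the goal.")
    else:
        print("Roughly on track: continue the intervention and keep monitoring.")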

Informed, Data-Based System Evaluation

Finally, data can be used in an MTSS to evaluate how the overall system is supporting the achievement of all students. Engaging in systematic program evaluation can help guide the allocation of finite resources.

Say, year after year, your screening data show that a significant proportion of students in a certain grade, say 70%, are identified as at risk. Your school may decide that allocating resources to Tier 2 services is not the best approach for addressing student needs at that scale. Instead, your school could consider putting in place the following approach:

  1. Addressing core instruction, with enhancements ranging from something as minor as adding more practice opportunities to core lessons to something as major as purchasing a new core program.
  2. Evaluating efficacy by examining changes in the performance of student groups before and after the modification. Your school would look to see whether the percentage of at-risk students decreased from 70% in year one to lower percentages in years two and beyond (a simple version of this check is sketched below).
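Here is a minimal sketch of that year-over-year check; the counts are hypothetical placeholders, and in practice they would come from the school’s benchmark screening data for the grade in question.

    # Minimal sketch: track the percentage of students flagged as at risk on
    # the fall benchmark screener before and after a change to core
    # instruction. The counts below are hypothetical.

    benchmark_results = {
        # year: (students flagged as at risk, students screened)
        "Year 1 (before change)": (84, 120),
        "Year 2 (after change)": (66, 118),
        "Year 3 (after change)": (47, 122),
    }

    for year, (at_risk, screened) in benchmark_results.items():
        pct = 100 * at_risk / screened
        print(f"{year}: {pct:.0f}% of students at risk")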

Engaging in this process at key intervals can help ensure a robust system for evaluating supports and making substantive changes as needed to meet the learning needs of all students.

Conclusion

An effective core mathematics program is necessary for improving student performance outcomes. However, more is needed. A comprehensive MTSS program is necessary to provide a highly connected system of intensified instruction for students who are struggling. Such a program is informed by screening and progress monitoring, and it includes intensive interventions that vary across academic learning time, teacher-to-student ratio, content emphasis, and instructional approach. With persistent and purposeful effort, all students can access the complex mathematics that affects them academically, socially, and professionally.

References

Albers, C. A., & Kettler, R. J. (2014). Best practices in universal screening. In P. L. Harrison & A. Thomas (Eds.), Best practices in school psychology: Data-based and collaborative decision making (6th ed., pp. 121–131). Bethesda, MD: National Association of School Psychologists.

Clarke, B., Doabler, C. T., & Nelson, N. J. (2014). Best practices in mathematics assessment and intervention with elementary students. In P. Harrison & A. Thomas (Eds.), Best practices in school psychology: Data-based and collaborative decision making (6th ed., Vol. 1, pp. 219–232). Bethesda, MD: National Association of School Psychologists.

Foegen, A., Jiban, C., & Deno, S. (2007). Progress monitoring measures in mathematics: A review of the literature. The Journal of Special Education, 41(2), 121–139.

Gersten, R., Beckmann, S., Clarke, B., Foegen, A., Marsh, L., Star, J. R., & Witzel, B. (2009). Assisting students struggling with mathematics: Response to intervention (RtI) for elementary and middle schools (NCEE 2009–4060). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education. Retrieved from http://ies.ed.gov/ncee/wwc/publications/practiceguides.

Gersten, R. M., Clarke, B., Jordan, N., Newman-Gonchar, R., Haymond, K., & Wilkins, C. (2012). Universal screening in mathematics for the primary grades: Beginnings of a research base. Exceptional Children, 78, 423–445. Retrieved from http://cec.metapress.com/content/B75U2072576416T7

About the authors:

Dr. Bradley Witzel

Dr. Witzel is a Professor and Program Director in the College of Education at Winthrop University. His main areas of research focus on empirically validated practices for students with disabilities and students at risk, particularly in the areas of mathematics and MTSS. A popular author and professional developer, he has written several books and delivered several hundred workshops, conference presentations, and videos on instructional interventions.

Dr. Ben Clarke

Dr. Clarke is an Associate Professor in the School Psychology Program at the University of Oregon. He currently serves or has served as a Principal Investigator on 15 federally funded research grants (approximately $50 million in funding) in mathematics instruction, focused on the development and efficacy testing of intervention programs spanning grades K–6 in both traditional and technology-based formats. Dr. Clarke was a practicing school psychologist for three years, during which time he led district efforts to implement multi-tiered instructional models in reading and mathematics.

Dr. Paul J. Riccomini

Dr. Riccomini is currently an Associate Professor of Special Education at the Pennsylvania State University. He began his career as a dual-certified general education mathematics teacher of students with learning disabilities, emotional and behavioral disabilities, and gifted and talented students in Grades 7–12 in his inclusive classrooms. His current research interests focus on the application of evidence-based instructional practices and interventions within an MTSS/RtI framework for students with disabilities and struggling students in mathematics. Additionally, Dr. Riccomini provides high-quality professional development focused on effective mathematics instruction to school districts across the United States.
