Journey to Clarity: The Treasure Trove of Student Data

Shweta Gandhi
7 min read · Sep 20, 2023


Note to reader: I am the founder and CEO of Strived.io, and this article details the personal learning journey that led us to where we are today.

Strived.io was founded on the belief that, while standards-based mapping has traditionally been a complex endeavor, advancements in AI now allow for precise mapping of each student’s data within this intricate framework. By providing satellite- and map-level insights on demand through natural language, we can offer a clearer understanding of every child’s learning journey. Yet we recognized that this data alone isn’t sufficient to truly personalize instruction. That’s why Strived aims to harness educator observations and insights via AI. By marrying this rich, on-the-ground perspective with satellite data, we aim to revolutionize differentiated instruction, ensuring that every student receives the targeted support they need.

For several years, I spearheaded the creation of a blended learning data tool for a renowned national charter. Our primary goal was to ensure students actively used a select set of finely tuned ELA and Math practice tools. And you know what? We nailed it. From an overwhelming 40 tools, we streamlined to just 5 for grades K-5, with not every grade using all tools at once. The belief was simple: use these tools right, and we’ll see a difference in learning outcomes. The snag was that all of this was happening in isolation, away from the main classroom content. Sure, there was a general feeling of “it’s doing something,” but the impact on student growth, as reflected in assessments, remained largely unmeasured. Still, given the extended-day model, this approach seemed miles ahead of the alternatives.

Many schools I visited, inside and outside of the network, seemed to treat blended learning as a mere time-filler. Even when teachers expressed interest in leveraging the data for classroom instruction (a very rare ask!), they often found themselves without access to anything tangible at the classroom level.

It dawned on me that, despite its widespread acceptance, edtech often struggled to earn the full trust of many top-tier educators. Sure, they felt it made a difference, but the challenge lay in translating these digital tools into actionable classroom lessons. The assumption? That these platforms naturally filled in students’ learning gaps through self-remediation. But if you’ve ever watched a young child navigate an edtech tool, you know that genuine progress often requires a more hands-on approach.

I dove into the tools, eager to pull data that could actually benefit a classroom. Here’s the reality I bumped into:

1. Granularity Matters — Most tools weren’t diving deep. They’d point out that a bunch of kids struggled with a specific standard. But the nitty-gritty of what stumped them within that standard? That part stayed hidden. And the top tools? They often just nudged teachers and students toward more of their product. Can’t blame them, really; they’re in the game to boost usage, not to cut screen time and let a teacher decide a student would benefit more from being offline and in front of a human.

2. Efficacy Woes — Measuring a tool’s effectiveness is tricky. To truly gauge it, you need to use the tool exactly as planned. Many providers have conditions: spend “X” hours with us, or finish “Y” percent of the curriculum, to see real benefits. Even at tech-forward schools, these conditions were seldom met. If I ran an edtech company, I’d think twice about publicizing an efficacy study when I couldn’t be sure the product was being used as intended.

3. Guarding the Data — I recall a chat with an exec from a leading edtech firm. He somewhat jokingly remarked it was his job (it really wasn’t) to play “frenemy” to anyone asking for his product’s data. When pressed, he explained: if you interpret and present my data in a light that doesn’t match my narrative, especially in a negative shade, why would I share it?


I walked away from these revelations with a mixed bag of understanding and frustration. But of one thing I was convinced: edtech data had the potential to be gold. Digging it out, though, would require a shift in the entire incentive structure of today’s edtech world.

I started hunting for sources of student data that could truly showcase a student’s learning journey. It was the book Street Data that made me see data through three distinct lenses. As described in the book:

  1. Satellite data includes things such as test scores, attendance, and graduation rates that tell an important story, illuminating big performance trends and pointing toward underserved student groups.
  2. Map data hovers closer to the ground, providing a GPS of learning trends and gaps in school communities. Map data could include Lexile levels gathered through running-records assessments, rubric scores on a common math assessment, or student perception data gleaned from a schoolwide survey.
  3. Street data takes us down to the ground to listen to the voices and experiences of our students, staff, and families. It provides real-time leading indicators on the messy work of school and instructional improvement while enabling rapid feedback loops for our decisions and practices.

Capturing street data, the granular day-to-day insights, is a beast. Marrying tech with human observation? That’s a puzzle begging for a solution. Still, before diving into street-level details, we needed to grasp the broader satellite and map perspectives.
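
To make the three tiers concrete for myself, I found it helpful to sketch how they might look as data. The class names and fields below are my own illustration (not a schema from the book, and not Strived’s actual data model):

```python
from dataclasses import dataclass

@dataclass
class SatelliteData:
    """High-altitude indicators: broad trends, not diagnoses."""
    student_id: str
    test_scores: dict           # e.g. {"SBAC ELA": 2450}
    attendance_rate: float      # 0.0 to 1.0
    on_track_to_graduate: bool

@dataclass
class MapData:
    """Closer to the ground: trends within a school community."""
    student_id: str
    lexile_level: int           # e.g. from running records
    rubric_scores: dict         # e.g. {"common math assessment": 3}
    survey_responses: dict      # schoolwide perception survey

@dataclass
class StreetData:
    """Ground level: real-time voices and observations."""
    student_id: str
    source: str                 # "exit ticket", "anecdotal note", ...
    note: str                   # kept in the observer's own words
    recorded_on: str            # ISO date, e.g. "2023-09-20"
```

Notice how the fields get less numeric and more human as you descend; that shift is exactly why street data is so hard to capture systematically.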

With the team, we first mapped out student skills and standards to clarify learning expectations. Thankfully, there are brains much brighter than mine in this field, providing clear maps of standards and skills, segmented by grade and subject.

Around this time, parent-teacher conferences for my kids — one in preschool, the other in elementary — threw me for a loop. For my youngest, feedback was tangible: “He needs to learn how to use a spoon with more control.” Got it. But for my elder one, I got: “She’s satisfactory in her phonemic awareness.” Uh, come again? I was lost on where to even begin assisting.

Digging deeper into these mapped skills and standards, especially as children advance through the grades, was mind-boggling. Yet a nagging thought persisted: there has to be a clearer way to grasp where each child truly stands in their educational journey, if not for parents, then at least for their teachers.

Our initial steps involved crafting an AI backend to interpret standards and skills. That part was straightforward. Our focus then shifted to gathering satellite- and map-level data, homing in on the profile of an 8th-grade ELA student.
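
I can’t lay out our actual backend here, but a minimal sketch conveys the shape of that first step. The standard code below is a real Common Core identifier; the `call_llm` helper is a hypothetical stand-in for whatever model client you use, and the prompt is illustrative:

```python
import json

def interpret_standard(standard_code: str, call_llm) -> dict:
    """Unpack a learning standard into plain language and skills.

    `call_llm` is a placeholder: any function that takes a prompt
    string and returns the model's text response.
    """
    prompt = (
        f"Explain the learning standard {standard_code} for a teacher. "
        "Respond with JSON containing 'plain_language' (one sentence a "
        "parent could understand) and 'component_skills' (a list of "
        "3-5 concrete, observable skills)."
    )
    # Production code would validate the response before trusting it.
    return json.loads(call_llm(prompt))

# Example, once a real model client is wired in:
# interpret_standard("CCSS.ELA-LITERACY.RL.8.2", call_llm=my_client)
```

This is also the answer to my parent-teacher conference confusion: “satisfactory in her phonemic awareness” becomes a short list of things a parent or teacher can actually watch for.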

In diving into assessments like NWEA and STAR (both formative diagnostic assessments) or SBAC (the California state assessment), we saw they provided a glimpse into broader subject areas of difficulty, but specifics on standards? Not so much. The reason? These tests, featuring 40–50 questions, were crafted to gauge mastery across a wide swath of standards; spread across dozens of standards, that works out to one or two questions per standard at best, far too few to diagnose a specific gap. They weren’t built to pinpoint knowledge gaps but to assess overall growth and compare student standings. To be blunt, they offered teachers a generalized number, with limited practical application beyond broad student groupings. I am simplifying, of course, and there are nuances to be pulled from assessment data, but for our purposes it was not granular enough.

Starting with this macro data, we presented teachers with these high-level groupings. But here’s the kicker: just two weeks into the school year, teachers’ intuitions often mirrored our data. Within one or two classroom observation days, we could even correlate our findings with real-time classroom engagement. The data wasn’t revealing new insights; it largely echoed what educators already sensed.

However, the data did provide a clearer snapshot of student standings than we had before, and teachers valued it. Our team then geared our standards/skills model to interpret this data, drawing basic inferences about student performance with the information we had. We might have lacked the minutiae at this stage, but we crafted a system map providing a personalized (if broad) insight into which skills required more attention.
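
To give a feel for what those basic inferences looked like, here is a toy version: turn coarse, domain-level percentiles from a diagnostic into per-skill attention flags. The domain names, skills, and threshold are all invented for the example, not our real mapping:

```python
# Hypothetical mapping from assessment domains to component skills.
DOMAIN_SKILLS = {
    "Literary Text": ["identify theme", "analyze character"],
    "Informational Text": ["determine main idea", "evaluate evidence"],
}

def flag_skills(domain_percentiles: dict, threshold: int = 40) -> list:
    """Flag skills whose parent domain scored below the threshold."""
    flagged = []
    for domain, percentile in domain_percentiles.items():
        if percentile < threshold:
            flagged.extend(DOMAIN_SKILLS.get(domain, []))
    return flagged

# A student strong in literary text but weak in informational text:
print(flag_skills({"Literary Text": 62, "Informational Text": 31}))
# -> ['determine main idea', 'evaluate evidence']
```

It’s blunt (every skill under a weak domain gets flagged), which is precisely the limitation of working from satellite and map data alone.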

But we needed more!

When we chatted with teachers about pinpointing students’ skill gaps, their insights were interesting. Their go-to methods included:

  • One-on-one assessments
  • Progress monitoring tools
  • Small group instruction time anecdotal data
  • Exit tickets
  • Homework, assignments, and in-class assessments
  • And, often overlooked but incredibly vital, their own intuition and observations from class participation

Sure, some of this data might be housed in platforms we can tap into. But when it isn’t, asking a teacher to manually feed every exit ticket or homework result into yet another system? That’s more burden than benefit. It asks them to duplicate work when they’re already aware of the challenges at hand.

Yet, imagine the wealth of insights we could glean if we consistently accumulated this type of daily data for each student. The sheer volume of information across diverse student needs can be overwhelming for any educator. How can one person mentally sift through and act upon so much information over an entire school year?

This is where Strived.io aims to bridge the gap. Our vision is for teachers to effortlessly share their insights with our platform (we are working through the how!). By integrating these notes with satellite and map data, we believe we’re paving the way to a comprehensive view of each student’s progress on crucial skills.
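
As a hedged sketch of where that integration could lead (assuming teacher notes have already been tagged to skills, by hand or by a model), imagine laying accumulated street-level observations alongside the map-level flags from the assessment data. None of this is our production pipeline; it just shows the merge:

```python
from collections import Counter

def student_skill_view(flagged_skills: list, tagged_notes: list) -> dict:
    """Merge map-level flags with street-level observation counts.

    `tagged_notes` is a list of (skill, note) pairs accumulated over
    the year; `flagged_skills` comes from coarse assessment data.
    """
    street_signal = Counter(skill for skill, _ in tagged_notes)
    skills = set(flagged_skills) | set(street_signal)
    return {
        skill: {
            "flagged_by_assessment": skill in flagged_skills,
            "observation_count": street_signal[skill],
        }
        for skill in skills
    }

notes = [
    ("determine main idea", "Summarized the plot instead of the idea"),
    ("determine main idea", "Needed prompting on today's exit ticket"),
    ("evaluate evidence", "Strong reasoning in small group"),
]
print(student_skill_view(["determine main idea"], notes))
```

Even this toy merge makes the point: two classroom observations plus an assessment flag tell a far richer story about “determine main idea” than either source could alone.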

What’s fascinating to me is that, as an industry, we might have been focusing on data and interoperability in places that don’t matter as much. The true treasure trove of data? It’s right there, in every classroom, just waiting to be uncovered.

I can’t say for sure we’ve nailed the solution just yet. I mean, some super-smart folks have taken a swing at this before, and with deeper pockets than ours. However, we’re confident that we’re on a promising path to discovery.
