Quantifying the Unquantifiable

How I Measured Success Without Data

I sat at my desk the final week of school attempting to fill out the data component of my SLO.* But I was struggling.

My SLO focused on measuring student growth in my Creative Writing classes according to data points on a rating scale. As I conferenced with students about where they rated on a scale of 1–4 for the selected writing standards, they often apologized for their ineptitude, to which I responded that the numbers didn’t dictate who they were, just where they were. I used the ratings at the end of each project to illustrate improvement, but again, students felt penalized by the numbers. So I stopped using the ratings, but this left me without traditional data.

Yet I had anecdotal data for my SLO. I had pages of notes from student conferences about writing, as well as notes on two quarterly grading conferences about their learning, and most importantly, their answers to the question: “What has been your biggest growth this semester?”

My intent with this question was to foster student awareness of improvement. When I reflected on their answers, however, I was surprised over and over. Rather than citing how they had learned the structure of a story, mastered dialogue, or understood the use of enjambment in a poem, many spoke of improved confidence, vulnerability, and empowerment:

“I do my best work when it’s collaborative. Listening to what other people say draws out my strength.”

“I learned to be comfortable with my voice. This is what my soul needs to write. Writing puts so much to peace for me.”

“I discovered a passion for poems. I dug deeper into my abilities as a poet.”

“I learned how to think. This — my writing — is me.”

Although I knew the value of each of these answers, none could be measured through a rating scale or a letter grade. This lack of measurement didn’t impact my or my students’ perception of their growth, but I knew that I needed to have stronger evidence for my data. So my conundrum remained: How could I quantify these unquantifiable gains?

Furthermore, I had gained valuable insights into my practice that were the driving force behind my focus on growth. Two years prior, I graded writing on rubrics created by a team of colleagues and me. We aligned them with the Common Core and taught each assignment to the rubric. Student writing was efficient, but it lacked innovation and voice.

In the past two years I committed to aligning my writing assessments with what I valued in student writing. Student growth and risk were, and continue to be, at the top of that list of writing values. For how can a student grow in writing without taking risks? And why would a student take risks if doing so might carry punitive ramifications for their grade?

When I created my list of writing values, I had no idea the impact it would have on my classroom: it led me to eliminate letter grades on writing, give more frequent feedback on each assignment, improve peer feedback, make writing conferences a primary means of instruction, and use student-led grading conferences to determine growth. In essence, it was a completely new classroom experience.

Maja Wilson, author of Rethinking Rubrics, states: “Unless we begin experimenting in our classes with assessment in ways that honor our values about the complexities of the writing and responding process, we will never be in a position to call for [a] paradigm shift.” It was this transition to honoring my values that shifted everything in my classroom. And it was this experimentation that positioned me to better understand student voice. It was even this philosophical change that guided my instruction and assessment to its own paradigm shift, a shift that honors my truest instincts.

Yet there I sat, reflecting on my students, my research, my classroom, and my conversations with colleagues, only to realize I had achieved precisely what I set out to do: assess growth and encourage students to take risks in their writing. And even that wasn’t quantifiable.

So, I filled out the SLO the best I could. I attached the pages of notes from student conferences, and I offered an explanation of why I didn’t have a rating scale analysis. I included my grading contracts for my students, and I referenced my blogs and reflections throughout the year. I know the rubric used to score the SLO won’t score my lack of data well, but I know that my learning in the past year was immeasurable.

I understand that my classroom will never fit well in a world of grades and data, but I also know that I cannot compromise my fundamental beliefs: all students have a unique voice, and it is my job to foster their growth toward finding that voice. And that is more important than any letter, number, or rating.

*A Student Learning Objective is a measurable, long-term academic goal, informed by available data, that is set as part of Educator Effectiveness.
