Writing Tools Have a Place — Beside a Teacher, Not in Place of One

Janet Neyer
Published in Ahead of the Code
Aug 19, 2020

Self-selected piece: Telling Our Stories: Creating Authentic Narratives of Home (an article published in the Language Arts Journal of Michigan, 2014)

Media: print (no hyperlinks)

Mode: professional journal article in response to a call for proposals

Audience: language arts educators and teacher educators

Purpose: to address the issue’s theme on place-based writing

Situation: I had done a demonstration of this lesson at the CRWP Summer Institute 2013. When the call for manuscripts was released, one of the institute co-directors contacted me to suggest I write a piece to go with the lesson. This piece underwent significant revision after feedback from a college professor and former editor for the LAJM and then from the two current editors of the publication.

AnalyzeMyWriting

I often suggest students enter their writing on analyzemywriting.com, a free tool that offers a range of statistics, graphics, and visuals about a written text. I find this raw data about writing fascinating, but I’m not always sure that my students know what to make of it. For instance, the readability score for my published article suggests that I am writing at a ninth-grade level.

My readability score according to analyzemywriting.com.

Readability algorithms measure sentence length and word length on the premise that less mature writers use shorter sentences and shorter words.
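
AnalyzeMyWriting doesn’t say exactly which formula produced my score, but the published Flesch-Kincaid Grade Level formula is typical of the family. Here is a minimal sketch in Python, with a crude syllable counter standing in for the pronunciation dictionaries real tools rely on:

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count runs of consecutive vowels.
    # Real tools use pronunciation dictionaries or better heuristics.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    # Naive splitting; production tools handle abbreviations, quotes, etc.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # The published Flesch-Kincaid Grade Level formula:
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)

sample = "I stopped. The porch light was still on. Nobody ever turned it off."
print(round(flesch_kincaid_grade(sample), 1))  # about 2.4
```

On that three-sentence sample, the short sentences alone drive the computed grade level down to about 2.4, no matter how deliberate the style.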

These algorithms, though, are flawed, especially when analyzing narrative text. Embedded within my published article is a narrative that I wrote for my students, one that includes dialogue and single-word sentences for effect. Readability algorithms cannot recognize stylistic choices like these, much less rhetorical choices made for particular audiences. In addition to data about readability, AnalyzeMyWriting also showed me my sentence lengths, my lexical density, and my most frequently used words.

Some raw data about the lengths of sentences in my article.
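
There is nothing mysterious about these statistics; a few lines of code can approximate them. A minimal sketch, assuming one common definition of lexical density (content words divided by total words) and a toy stopword list rather than whatever AnalyzeMyWriting actually uses:

```python
import re
from collections import Counter

# Toy stopword list; real tools use far larger ones.
STOPWORDS = {"the", "a", "an", "and", "or", "but", "of", "to", "in",
             "on", "is", "was", "it", "that", "i", "my", "for", "with"}

def text_stats(text: str) -> dict:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = [w.lower() for w in re.findall(r"[A-Za-z']+", text)]
    content = [w for w in words if w not in STOPWORDS]
    return {
        "sentence_lengths": [len(re.findall(r"[A-Za-z']+", s)) for s in sentences],
        # One common definition of lexical density: content words / total words.
        "lexical_density": round(len(content) / len(words), 2),
        "most_frequent": Counter(content).most_common(5),
    }

print(text_stats("I stopped. The porch light was still on. Nobody ever turned it off."))
```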

AnalyzeMyWriting gives writers raw data. As a writer who appreciates words, syntax, and rhetoric, I love that data. It gives me next steps: examining overused words or phrases, checking readability, and maintaining a consistent style. It might also help me increase the complexity of a piece or simplify it for particular audiences. I am not sure, though, that this raw data would have as much meaning for my high school students. They might see the readability score and revise with a thesaurus open in another tab, a strategy that rarely improves student writing! Still, the raw data gives us conversation points in the classroom, opportunities to ask questions about the writing and to identify priority moves for writers to make in revision. How to revise, though, is left largely to the student, since this tool makes no suggestions. I like that, as it leaves the door open for meaningful teaching through conferencing, mentor texts, feedback, and peer review.

Writable

Submitting my article to analyzemywriting.com left me wanting more. The data was intriguing, but I was curious to see what a tool with a built-in revision aid could do, so I built a mock assignment in Writable to try out its Revision Aid tool the way a student would encounter it. This tool was far more sophisticated than the assistance I received from AnalyzeMyWriting. To use a sports analogy, AnalyzeMyWriting is the coach who tells a player her stats during the season as a way to improve, while Writable’s Revision Aid is more like the coach who studies every element of play and tells a player what is most important to address right away.

The Revision Aid analyzed each paragraph of my article and assigned it a red, green, or yellow star. Clicking on the star gave me a list of what I did well in that paragraph and what I still needed to work on. I imagine that this type of feedback could be very helpful for writers, especially with writing that fits a clear genre and is measured against a clear rubric.
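
Writable does not publish how the Revision Aid scores paragraphs, so purely as an illustration of the traffic-light idea, here is a toy sketch that rates each paragraph against a couple of invented, checkable rubric features:

```python
# A toy illustration of traffic-light feedback. This is NOT Writable's
# actual method; every feature and threshold here is invented.

def star_for_paragraph(paragraph: str, rubric_terms: set) -> str:
    text = paragraph.lower()
    words = [w.strip(".,;:!?\"'") for w in text.split()]
    term_hits = sum(1 for w in words if w in rubric_terms)
    cites_evidence = any(cue in text for cue in ("for example", "according to"))
    if term_hits >= 2 and cites_evidence:
        return "green"   # meets the (toy) rubric
    if term_hits >= 1 or cites_evidence:
        return "yellow"  # partially meets it
    return "red"         # flagged for priority attention

rubric_terms = {"argument", "claim", "evidence", "because"}
paragraphs = [
    "My claim is supported, for example, by evidence from three classrooms.",
    "I stopped. The porch light was still on.",  # narrative paragraph
]
for p in paragraphs:
    print(star_for_paragraph(p, rubric_terms), "-", p)
```

Even this toy version shows why a paragraph written in a narrative register would land in the red.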

I appreciated that the Revision Aid did not overwrite my work. It did not push rewrites at me the way I have seen Grammarly or word-processor grammar tools do. In that way, the “how” of revision stays in the hands of the writer. And, as with AnalyzeMyWriting, I see a lot of room for a good writing teacher alongside this tool. Writers receive concrete, targeted areas for growth, which in turn give teachers opportunities to model, pull mentor texts, offer feedback, and engage students in peer review.

This paragraph earned a green star, but the Revision Aid still had suggestions for me.

Writable’s analysis of my article was complicated by the fact that the article does not neatly fit a single genre. Yes, it is an argument about the importance of engaging students in place-based writing; however, it makes that argument in part by offering a narrative in the middle. That confused the Revision Aid. The shifts in style and the narrative features were flagged with red stars as needing priority attention, just as AnalyzeMyWriting’s readability algorithm stumbled when it hit the narrative portions. Overall, though, the Revision Aid recognized the elements of argument within the paper and rewarded them with green stars.

My exploration of these tools has helped me understand their place in the classroom: beside a teacher, not in place of one. Each tool gives writers and teachers valuable data, but developing writers will need an interpreter to make meaning of that data, as well as a coach to identify moves for improvement, to model those moves, and to connect writers with peers who have mastered them. In thinking about my own role as a writing teacher, I also imagine needing to talk frankly with my students about how these algorithms work and what their potential and limits are. Had I used only a writing assistance tool while working on this manuscript, I would have received a lot of red flags on the narrative portion, which, honestly, is my favorite part of the article. I want to make sure my developing writers feel confident enough to defend the choices they make, even in the face of the algorithms.


Teacher at Cadillac High School in Michigan. Leadership team member and teacher-consultant of the Chippewa River Writing Project.