How I approached AI-literacy in the writing classroom

Laura Dumin
7 min read · May 29, 2023


As I write this piece, I can’t help but think about the various ways that people might approach teaching students about AI. I’m sure there are a few ways to go about it, but here I want to focus on what has worked for me in my writing classrooms. I teach first-year composition, general technical writing, and specialty technical writing classes. I see my roles in these classes as somewhat varied regarding AI, but when it comes to the technical writing majors, my biggest goal was to be sure that they were aware of the AI tools that might be most useful to them as they graduated and moved into the workplace.

What is AI literacy?

For starters, what do I mean by “AI literacy”? For me, this means teaching students both what AI can AND can’t do well. I see both sides of this as important because if we only focus on one half of the discussion, students may miss some very important points from the other side. I have also shown students how to use multiple AI programs together to obtain the results that they want, much like app-smashing on an iPad.

Let me pause for a moment to answer a question that I have been asked more than once: “Do you think that you’re just training better cheaters?”

Nope. Sure don’t. I mean, the argument could be made, and I’m sure that there are students out there who will use the programs for nefarious, or at least academically dishonest, purposes. But I also think that showing students the ways that the programs can be used for maximum benefit can help them determine where and when to use the AIs rather than turning to them for cheating purposes. I liken teaching about AI to teaching sex-ed. Students can easily get lots of information from the internet, but do we really want them to learn about AI programs from TikTok? Or would we rather teach them subject-specific tools that will be helpful to them in the long run? The results with students this last semester support my thoughts on this. I had no assignments that reeked of cheating or even appeared to be written with improper AI use. Talking about what the AIs could and couldn’t do did not, in fact, appear to increase cheating. Instead, students were more likely to put some thought into why they were using it.

So let’s dive a little deeper into what I did with my classes this last spring.

Teaching about AI

When spring semester 2023 started, I was as prepared as I could be for the coming predicted avalanche of AI assignments. I had revamped all my assignments to include AI in purposeful ways (more on that in another post) and I had used ChatGPT a few times. I had read the articles, talked to the people, gotten my IRB approval to study how my students used AI, and steeled myself for whatever might come.

That avalanche never came. But what did happen was purposeful discussions in class almost daily about the newest news on AI. We opened up ChatGPT and Elicit.org and asked each the same question about which animal was the fastest while hunting. The question was meant to be silly, to see what the programs could do with it. ChatGPT knew that unicorns were not real, as shown in Fig 1. So far so good.

Figure 1: ChatGPT response to animal hunting speeds

Then I asked it to provide sources, and this is where the problems began (Fig 2).

Figure 2: ChatGPT sources for animal hunting speeds

The National Geographic link is a real one, but the All About Birds page is fake. Both look real, though, showing that large language models (LLMs) are really good at writing convincing text because of their predictive modeling skills.

We then looked at this same question in Elicit.org and I was sad to see that the unicorn question wasn’t even addressed (Fig 3). The big difference between these two programs is that ChatGPT is an LLM and Elicit.org is a research bot meant to bring back valid results.

Figure 3: Elicit.org response to references for animal hunting speeds

The results here weren’t fully related to the topic but could be a good starting point for students who are stuck at the research and fact-finding stage.

By showing actual examples and then discussing the output, I hoped to train my students on what to look for in good output. This involved going back to the critical thinking skills that so many of us are working to teach our students. Using AI to find information still requires as much critical thinking as using Google, but the skills need to be tweaked so that students understand that they need to question the output rather than accept it at face value.

All semester, we had short demos like this, or quick discussions of what students had found to be helpful and not helpful. When SlidesGPT came out toward the end of the semester, we tried that one too and determined that the output might be great for K-7 or so, but for higher ed students, it was less helpful. I did, however, allow them to use it if they wanted, as long as they mentioned it in their reflection pieces. No one did, though.

Takeaways

I didn’t do anything majorly time-intensive during class. The biggest time investment on my end was trying to keep up with the general trends and changes within the AI programs. Much of the literacy work in class involved simply coming back to the topic and discussing the latest developments or how the tools might work. But, beyond that, I believe that we must 1) give students space to use AI and 2) give guidelines for acceptable use.

Talk about the AI in class

For anyone just getting started with integrating AI into their classrooms in some way, let me suggest starting here. Just talk about programs that students might be using. If ChatGPT is a program that might be useful for your students, try it out at home, and then bring it in for students to see where it works well and where it doesn’t. Then allow your students to also try it out and reflect on what worked or didn’t work.

Encourage students to use AI and reflect

Let me repeat that. Reflect. Reflection. Have students think about their task and how well the AI completed that task.

I don’t think that AI literacy can stand on its own as the sole method of introducing AI and helping students to find academically useful ways to use it. Students have to actually use the programs in guided ways and then reflect on how the programs worked. Through this step, they hopefully learn why just having the AI do their work for them isn’t their best choice.

Discuss why “knowledge” and writing are important

Students cheat for a variety of reasons, and there is plenty of research on what those reasons are. So it becomes important to show students why building knowledge, or completing assignments with their own work, matters in your class. What do we gain when we struggle through a math problem? How are we made stronger or better by understanding the reasons behind the Civil War? And so forth.

Along with this, discuss why writing can be a good way to ponder through a problem. Discuss how learning can be deepened by taking time to think through something, write down ideas, and come back to those ideas again to see if they still make sense.

We also need to realize that not all students need to write through an idea to process it. Sometimes students learn better in other ways. So while we can extol the value of writing, we should also acknowledge that sometimes there are other and better ways to demonstrate knowledge. It’s ok to discuss this with students too and give them space to reflect on and respond to ideas about the importance of writing and knowledge creation.

Purposeful integration of AI into the classroom

Just like we give students guidelines on what research or citations look like in our field, we need to give students guidelines on when it makes sense to use AI and when it makes sense to leave the AI on the side. In the same way that none of us are going to become literate in another language by just watching movies in that language, we don’t create AI-literate students without giving them opportunities to play around with the different programs. We have to guide them and help them see the value of the AI programs and the value of their own work. And we have to show how these things can work together to help students have a better learning experience.

Conclusion

At the end of the day, there are assignments where AI-use might make sense, like brainstorming, drafting, or helping to explain a confusing concept. There are other places where it makes little sense to use AI, such as with personal reflections about the course content. By taking time to show students where and how AI can be used, and where it makes little sense to use it, we can help students become more critical users of the tools. And this benefits all of us, because we will have fewer instances of academic misconduct to worry about.


Laura Dumin

Professor, English & Tech Writing. Giving AI a whirl to see where it takes me. Also writing about motherhood & academic life. <https://ldumin157.com/>