Navigating ChatGPT

Jason Hogan
UPEI TLC
7 min read · Aug 28, 2023

You’re out on a walk when you come to two doors, each guarded by a brother. The brothers inform you that one of them will always tell the truth and the other will always lie. Behind one door lies your path forward; the other leads to certain death (oooooo~).

A still from the movie Labyrinth that presents this puzzle

You know the solution and ask one guard whether his brother would say that the door the guard you’re asking stands before leads to the castle. He says yes, so you take the other door and carry on. Then you come to another set of doors, this time with only one guard. This guard will reply confidently, but even he won’t know whether he’s telling the truth or lying. This third guard is ChatGPT.

What’s the deal with ChatGPT?

Many words have been written about ChatGPT, and while I don’t think what I’ll add to the conversation is new or novel, I think it’s good practice to think through a new development and put those thoughts to paper or pixel.

For those who haven’t been keeping up with the topic, ChatGPT is a free OpenAI chatbot. The bot has a few explicit rules, such as not knowing anything post-2021 and being stringent about not opining on political topics (a wise choice given the history of past AI chatbots, like Microsoft’s unfortunate 2016 foray into the space with Tay).

With the release of ChatGPT, there’s been a flurry of discussion about the impact of AI on teaching and assessment: a new tool that can analyze questions (even breaking some of ChatGPT’s explicit parameters if the questions are carefully crafted) and respond with an answer.

At this point it’s worth playing with the tool. When you first use it, the presentation of answers forming before your eyes can be quite impressive. However, when you take a look at the product, especially for higher-order thinking questions such as compare and contrast, you get a better opportunity to examine its capabilities. If you’ve gone through this exercise yourself, feel free to skip ahead; if you’re new to the tool, it’s certainly worth engaging with the output itself to demystify it.

Let’s go through an example

I asked ChatGPT to compare and contrast the Halligan bar and the Kelly tool, these being two firefighting forcible entry tools. Here’s the output:

A Halligan bar and a Kelly bar are both hand tools used in emergency situations, primarily by firefighters.

A Halligan bar is a multi-functional tool that typically features a forked end, a flat end, and a pointed end. It can be used for prying, smashing, and piercing. The Halligan bar is a versatile tool that is often used in conjunction with a sledgehammer.

A Kelly bar, on the other hand, is a longer bar with a curved blade at one end and a pointed end at the other. It is used for forcing entry into buildings and for breaking through obstacles. The Kelly bar is a specialized tool that is designed for heavy-duty tasks, such as breaking through reinforced concrete walls.

In summary, the Halligan bar is a general-purpose tool used for a variety of tasks, while the Kelly bar is a specialized tool designed for heavy-duty tasks.

The tool certainly does recognize the Halligan bar as a forcible entry tool with a forked end, a spike, and its wedge-like adz.

A picture of a Halligan bar.
The Halligan bar.

Using its knowledge base, ChatGPT is even able to identify that the Halligan bar gets paired with a striking tool. Here’s where ChatGPT’s answer starts to stand out: it suggests a sledgehammer. That’s certainly an option, but the classic pairing with a Halligan bar is a flat head axe, a pairing so established the two have their own special title: the Irons. The sledgehammer isn’t a mistake, but the absence of the flat head axe stands out to anyone familiar with the tool. It’s the sort of answer you would expect from someone who understands the purpose of a Halligan bar but clearly skipped the lecture on the topic.

While ChatGPT picked up on the Halligan bar, it’s less familiar with the Kelly tool. In this prompt ChatGPT described it as a concrete-breaking tool, though in previous iterations of my asking these questions ChatGPT has inconsistently described it as a roof-cutting tool or a tool for pulling locks from doors. While the Kelly tool is used less frequently than the Halligan bar, comparing and contrasting the two should be fairly straightforward, because this is the Kelly tool:

The Kelly tool.

The Halligan bar is the Kelly tool with a spike added. Their design and use are otherwise the same, but the Halligan’s spike gives it a few advantages over its precursor. (Also, anyone going through reinforced concrete walls with a Kelly tool didn’t need it in the first place.)

While it is a bit of a “gotcha” question, I think it demonstrates well the difference between our intelligence and the current level of artificial intelligence. I think the question trips ChatGPT up because it’s being asked to compare and contrast two tools used for similar jobs. Rather than looking at the tools and seeing their similarities and differences, ChatGPT predicts a contrast between tools that handle the other jobs that come up in fire emergencies where the Halligan won’t work, like forcing reinforced concrete or opening a roof.

This post has had a lot of firefighting in it, which I hope has been useful: it’s an area most readers of this blog aren’t very familiar with, but one where I have enough of a grasp for ChatGPT’s mistakes to jump out at me. In a course you’re teaching, this role will be reversed. You might have students who see the ChatGPT output and are impressed, whereas you, as an expert, may find strange and bizarre mistakes jumping out at you. It’s not a guarantee that there will be mistakes; it’s just a risk that students who consider the tool will have to weigh.

ChatGPT is a bad cheating tool

The more I play around with ChatGPT, the more I’ve come to the conclusion that it is not a good tool for cheating. Especially right now, with so many people experimenting with ChatGPT, I had to try to log in several times just to access the chat, so even access was more of a barrier than someone looking for the easy way out might put up with. The unreliable output makes a copy-and-paste job a gamble, and who wants to fact-check the answer if the goal was to cheat? In any context where someone has the capacity to cheat, they’re going to have access to better ways to do it.

For a step-by-step paper where you’re worried a student might submit a ChatGPT draft and have you identify the errors for them, you can always turn the work back onto the students: point out that the paper contains errors and invite them to discuss whether those errors were inventions or misconceptions, or turn it into a discussion about credible sources.

ChatGPT in the classroom

I’m certain ChatGPT will find some legitimate uses in the classroom, perhaps primarily as a case topic in a marketing class or in an ethical discussion about what academic integrity means. I could see it as an exemplar builder for students suffering from initial writer’s block and unsure how to format, but because the tool produces false information it wouldn’t be a strong tool for outlining or first drafting.

As far as hyped edtech goes, I think ChatGPT will have a bit more footing in the classroom than blockchain technologies, which were hyped up a few years prior but are largely collapsing under the ever-growing pile of crypto failures. In discussions about AI in the classroom, I think there will be continued uptake of tools that leverage the strengths of AI, like the AI grammar checkers that seem to become ever more ubiquitous without drawing nearly as much concern.

Pedagogy and ChatGPT

What are some avenues to respond if ChatGPT is a stronger tool than I give it credit for, or if it gets smarter? I think it’s worth looking at ground that has already been covered. While ChatGPT is a tool that can respond to a written prompt with an answer, the field of mathematics has long had a tool that can reliably solve very difficult problems very quickly: the calculator.

ChatGPT pales in comparison to a calculator, which will far more reliably give you a correct answer. In math we’ve seen the importance of “showing your work,” not just as a way to make sure you weren’t relying solely on a calculator to do all the work for you, but to give the instructor an avenue for feedback on your process. For written work, students showing their work isn’t just first drafts; it can include discussions about what sources they consulted, what they discarded, and why. I believe that in the age of answers at our fingertips, and especially in an era of fake news and disinformation, we’ve moved from “what is the answer” to “what is the answer and how did you get it” far more than we give ourselves credit for.

ChatGPT isn’t a revolutionary tool, but it has been leveraged as another avenue for eroding teacher/student trust, a pain point that can be agitated to market solutions and tools. While we might want quick fixes that flag things for us, many of these issues come back to re-centring and advocating for pedagogical solutions, and advocating that the institutions where we work, teach, and learn foster the room we need to engage in those pedagogies.
