“Because my teacher said it was true.”

Elissa Levy · Published in Educate. · Jun 8, 2021

Structured skepticism in the everyday science classroom

My 3rd period class added food coloring to hot and cold water

I love the observational experiment where you put a drop of food coloring into hot and cold water. I first encountered this lab in 2017 at a STEMteachersNYC+AMTA workshop on computational modeling in physics with Joshua Rutberg and Emily Pontius. Why does the dye spread out so much more slowly in cold water than in hot water?

When I asked my students this past week, they worked through various possibilities. Interestingly, their first guess (across all my class periods) was that the hot water was less dense, and the fact that there was less “stuff” meant that the dye could flow more easily. I reminded the students that I’d filled both flasks to the same level with the same tap water, and we could weigh the flasks to confirm that they still had basically the same amount of water. So that theory was out.

Ultimately, a student in each class ventured that the molecules of water are moving faster in the hot water flask. I asked how they knew that water is made up of molecules. Their answer: their teacher told them last year.

So I asked how their teacher knew that water was made up of molecules. They thought about it. Most likely, they said, someone had told their teacher about molecules, and that’s how their teacher knew. So I asked my students, “Is that a good enough answer?”

My students were surprised to learn that humans weren’t always convinced that matter was made up of molecules. Before we developed sophisticated technology, we couldn’t “see” lengths that small: light microscopes cannot resolve features smaller than the wavelengths of visible light (400–750 nm), and those wavelengths are roughly a thousand times the size of an atom. But we were pretty convinced about molecular theory long before we could probe those length scales, and this food coloring experiment is an example of supporting evidence for it. The fact that hot water spreads a dye faster than cold water does can be explained if the hot water molecules are moving more rapidly than the cold water molecules, which increases the frequency of collisions, which in turn pushes the dye particles farther in a given amount of time. Indeed, as my students came to realize, temperature is essentially a measure of molecular motion.
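To make the collision argument concrete, here is a minimal sketch (my own toy model, not part of the original lesson) that treats dye particles as random walkers whose kick size scales as the square root of temperature, since mean molecular speed grows roughly as √T. All of the numbers are illustrative, not measured.

```python
import numpy as np

def dye_spread(temperature_k, n_particles=1000, n_steps=500, seed=0):
    """Toy random-walk model of dye in water: each molecular collision
    kicks a dye particle a small random step, and the step size scales
    as sqrt(T) because mean molecular speed grows with sqrt(temperature)."""
    rng = np.random.default_rng(seed)
    step = 0.01 * np.sqrt(temperature_k)  # illustrative scaling, arbitrary units
    kicks = rng.normal(0.0, step, size=(n_steps, n_particles, 2))
    final = kicks.sum(axis=0)             # each particle's endpoint
    return np.sqrt((final ** 2).sum(axis=1)).mean()  # mean distance from the drop

print(f"cold water (280 K): mean spread ~ {dye_spread(280):.2f}")
print(f"hot water  (350 K): mean spread ~ {dye_spread(350):.2f}")
```

With these made-up parameters, the dye in the hot flask ends up about 12% farther from the drop point, which is the same qualitative behavior my students observed.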

I have classroom conversations like this a lot. It’s important to ensure my students know how we know what we know, because only then can they question what they read, hear, and see, decide for themselves how the pieces fit together, and figure out what to do to fix the world.

For years I’ve struggled to structure these conversations and the student work they produce. I couldn’t find an existing tool that did this, so I collaborated with a group of colleagues to develop the Structured Skepticism Framework. In this article, I analyze the tools most commonly used today and then explain how the Structured Skepticism Framework adds critical nuance to student thinking.

CER checklist from modelteaching.com (https://modelteaching.com/wp-content/uploads/2019/04/CER-Checklist.pdf)

The most common protocol I’ve seen for student argumentation in science is the Claim-Evidence-Reasoning (CER) framework. With this framework, a teacher (or sometimes, a student) poses a research question. The student responds with a claim that answers the question; a set of evidence from direct observation, data, or text; and reasoning that interprets the evidence and shows how the conclusion can be drawn therefrom. What’s great about the CER tool is that it can be used for pretty much anything, from a lab, to text analysis, to a diagram interpretation, to an entire unit or even year of study. To me, the biggest downside of the CER is that it doesn’t interrogate the process: why are we asking this question in the first place? Who would care about it? Is the claim we’re making always true or sometimes true? What are the limitations of our evidence that might require us to narrow the scope of our claim? And how do we know our evidence is reliable?

Here’s an example: a student drops a golf ball from a height of 5m and uses video analysis to infer the acceleration while the ball is falling. The student calculates 9.6 m/s² as the acceleration due to gravity. The known value is 9.8 m/s², but the student got pretty darn close, and she can certainly talk about experimental error. But: how broad is the student’s claim? Can the student claim that she determined the acceleration due to gravity anywhere on earth? What about anywhere in the universe? And did she determine the acceleration due to gravity for just the day when she did the lab, or for all of time? Whenever we make a claim, we use common sense to decide how broadly it should apply. It’s both impossible and silly to verify the acceleration due to gravity in literally every place at literally every time. But what we can — and should — do is to keep in mind that the result of every experiment is a generalized claim, and we need to explain why we think the claim can be generalized.
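As a sketch of what that video analysis might look like, here is a quadratic fit to hypothetical frame-by-frame data; the time and position values below are invented for illustration and are not the student’s actual measurements.

```python
import numpy as np

# Hypothetical video-analysis data for a golf ball dropped from rest:
# time stamps (s) and fall distances (m). Invented for illustration.
t = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
y = np.array([0.0, 0.19, 0.77, 1.73, 3.07, 4.80])

# For free fall from rest, y = (1/2) g t^2, so the leading coefficient
# of a quadratic fit is g/2.
coeffs = np.polyfit(t, y, 2)
g_est = 2 * coeffs[0]
print(f"estimated g = {g_est:.2f} m/s^2")  # ~9.6 with these invented numbers
```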

CSQ template from Harvard’s Visible Thinking Project

Another approach is the Claim-Support-Question framework from Harvard’s Visible Thinking Project. In this framework, the E and the R from CER are combined to form the “Support” section. I personally prefer it this way, because it can be difficult to disentangle evidence from reasoning when the evidence comes from a text or other pre-analyzed source. And I really like that CSQ adds nuance by asking the student to provide a reasonable counterclaim and then argue against it. This framework begins to ask students to show that there are multiple approaches to a conversation, that no evidence is fully conclusive, and that they must use judgment and thoughtful skepticism. But it’s still missing the overarching skepticism and shades of gray that I really want my students to grapple with.

Back in 2018, my collaborator Tegan Morton and I expanded the CSQ into a CSQ+. Since then, I’ve helped co-develop a series of STEMteachersNYC workshops called Dimensions of Responsiveness in STEM. Every time we lead the workshop, our teacher-participants add new elements that they want to bring into their classrooms. This Structured Skepticism Framework takes the best of their input over the past couple of years:

Structured Skepticism Framework, June 2021, Elissa Levy (in consultation with many colleagues)

This is both a tool for students to take notes as they engage in an investigation and a proposed framework for their assignment submission. By answering each question, students are faced with the key elements of deep scientific thinking.

First, students lay out their line of inquiry. What is the question they’re trying to answer? Not only do they write down the question, but they also reflect on who decided it was worth their time to pursue and who might care about the answer. If the student doesn’t know why humanity might care about the answer, then there won’t be enough buy-in to result in meaningful learning.

Second, students describe their research path. Conducting an original experiment, analyzing someone else’s data, or reading the answer on Wikipedia can all be valid approaches. When we teachers tell students how to do their investigation, we must also be able to tell them why it’s a good approach. Often the reason will be “it’s good practice,” as when we redo well-known experiments whose answers can be looked up (like the acceleration due to gravity), or when we read and dissect published papers. But sometimes the reason for the investigation is that sources contradict each other or the answers aren’t yet clear. Citizen Science projects are great examples of classroom investigations where there isn’t yet a clear answer. For my overall approach to deciding what type of source to use, see this article.

Next, students state their answers. At first, they may or may not have a prediction ready. They may or may not change their answer over time. They may or may not be surprised by the answer. The key here is to remind students that it’s not about what the answer is — it’s about where our path is headed and what we can do with the information we have. This is why the “answer” section is in the middle of the worksheet and not at the end.

Then, students grapple with the key limitations of their answers. What are some caveats and nuances? No conclusion holds for all time and space, and so we ask our students to think deeply about where their conclusions may not apply. This is traditionally where we have the “lab error” conversation. In a gravitational acceleration lab, the student drops their golf ball a few times and comes up with an average acceleration of 9.6 m/s². Perhaps their answers varied on average by 0.3 m/s², so they decided that was their error. They look up the “right” answer and it’s 9.8 m/s², so they come up with some reason that their answer is lower, probably conveniently using air resistance as an explanation because air resistance would slow the ball down, and 9.6 is less than 9.8.
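Here is what that error estimate might look like in code; the per-trial values are invented to match the 0.3 m/s² spread described above.

```python
import numpy as np

# Hypothetical per-drop estimates of g (m/s^2) from repeated trials.
trials = np.array([9.2, 9.3, 9.6, 9.9, 10.0])

mean_g = trials.mean()
# Average absolute deviation from the mean: one simple way a student
# might quantify "our answers varied on average by about 0.3 m/s^2."
avg_dev = np.abs(trials - mean_g).mean()
print(f"g = {mean_g:.1f} +/- {avg_dev:.1f} m/s^2")  # 9.6 +/- 0.3
```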

But it’s about more than just lab error. In this section of the framework, students also explore the scope of their conclusions. Have they determined the acceleration due to gravity for all of planet earth? What would make us believe that the acceleration due to gravity is the same when you go to different places? What about outside our planet? Do we have evidence that objects accelerate at 9.8 m/s² due to gravity when they’re out in space? No claim can be made universally, and if we can get students thinking about the limits of their knowledge in experiments like this, then they will be more prepared to think about caveats and nuance when they read sensationalized claims in the news. Even “real” news tends to flatten the story for entertainment purposes, and it’s up to us to be skeptical.

In the last section, students reflect on where to go next. We need our students constantly primed to use their newfound knowledge to make the world better. We do science in order to improve the world, and if our students can’t see how a given investigation will achieve this, then we need to help them see it — or change what we’re teaching.

I also ask students to document the new questions they have now that they’ve gained this information. I don’t necessarily expect my students to carry out these additional investigations, but I do want them to wonder what would happen if…

It’s all about metacognition. Our students must think about what they’re learning in school, why someone is making them learn it, and how they’re going to apply the knowledge in the real world.

After we did the observational experiment with food coloring in water, my students started to ask: what would happen if we added another drop of dye? What would happen if we waited until both flasks were the same temperature and then added more dye? What would happen if we put food dye on an ice cube? And what about some applications of this experiment? When cooking, maybe we should add in the spices only after heating our food, so that it’ll mix faster and more completely. Maybe we should study whether polluted air spreads faster in hotter climates.

I know the lesson is really working when my students identify potential paradoxes: if “hot” really just means that the molecules are moving faster, then why do fans cool us down? Don’t fans just make the air molecules travel even faster than they were traveling already?

And my answer to the student who asks that question? You’re doing it right. Keep going.

A heartfelt thanks to my collaborators who’ve evolved this framework with me over the past 3 years: Tegan Morton, Allison Mayle, Leah Liberman, and Jenny Shen, and to MfA Teacher-as-Writer colleagues who made my writing better: Tom Anderson, Sharon Collins, and Martha St. Jean.

Photo by Jason Anderson on Unsplash

Elissa Levy

I teach physics in Virginia and facilitate workshops nationally. I aim to engage.