Skinner’s Revenge: The Absurd Future of Artificial Intelligence in Schools

Bloolight · Published in Age of Awareness · Jan 28, 2020 · 5 min read

Photo by Valentin Petkov on Unsplash

In some schools in China, students are sitting in classrooms with “brain wave trackers” strapped to their heads. The little devices are intended to monitor the learning taking place inside those heads by, I suppose, feeding that information into software that can analyze what all the bumps and squiggles mean. At the same time, other Chinese students are sitting in classes while their every expression and movement is monitored by cameras. The footage is then analyzed by software to determine levels of engagement in lessons, which, in theory, will help teachers adjust their lessons to fit the needs of their students. Every time I read stories about this sort of thing, I get a nasty crawling feeling on the back of my neck. I imagine that most of my fellow educators feel the same way.

As an American, it is easy to feel safe from this Orwellian style of education. The Chinese government has a long history of iron-fisted attempts to engineer and control the behavior of its citizens, which makes it easy to picture bureaucrats in Beijing poring over the brain waves of their country’s youth. Americans usually take a dim view of government officials monitoring their every move, but something strange happens when the person on the other end of the camera is a corporate entity with a decent marketing department. Slap a friendly, female voice onto a piece of technology, give it a name, and Americans will gladly allow that device to wrap its tentacles around all of their personal data.

Perhaps if the Ministry of Love had a better marketing department, they wouldn’t have had to resort to the face-eating rats.

In the ed-tech world, artificial intelligence is currently being touted as the next big disruption to the way our kids are educated in school. While some people speculate that such technology could replace our perpetually overworked teachers with perpetually overworked robots, smart money is betting on much more modest impacts. Even smarter money is betting that artificially intelligent software in schools is nothing more than snake-oil salesmanship of the highest order.

The simple truth is that truly “intelligent” software applications do not currently exist. To actually be considered intelligent, software has to be able to collect information from a poorly structured, chaotic, and messy world. It must then determine which pieces of information are important, accurate, and reliable before attempting to find some sort of pattern. Finally, it must respond to this data in a way that is effective for solving the problem at hand. The ultimate pipe dream in education, at least among technology gurus, is software that can collect information from students in real time, analyze it, and then engineer custom lesson plans that help each individual student learn more effectively.

Ed-tech companies would have you believe that such software exists, but the truth is that none of these programs are particularly intelligent or effective. At this moment, the only way for software to analyze student responses is for students to enter them through computerized assessments. Because software is unable to reliably measure the quality of open-ended responses, these assessments are almost always of the multiple-choice variety. Software can quickly and easily score an item when there is a clearly correct answer that will only take one form. When correct answers can take many different forms, such as in written text or essays, algorithms run into a brick wall. The nuances of natural language, coupled with the tendency of students to use incomplete sentences, broken grammar, and wildly inconsistent punctuation, make it almost impossible for software to do any sort of accurate scoring.
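To make that contrast concrete, here is a minimal sketch of my own (not any vendor’s actual code) showing why single-answer items are trivial to score automatically while open-ended responses are not. The keyword check stands in for the kind of crude heuristic such systems tend to fall back on:

```python
# A toy illustration of automated scoring. The items and the naive keyword
# check are hypothetical examples, not any real product's logic.

def score_multiple_choice(response: str, answer_key: str) -> int:
    # One correct form, so a string comparison is all the "intelligence" needed.
    return 1 if response.strip().upper() == answer_key else 0

def score_short_answer(response: str, required_keywords: list[str]) -> int:
    # Roughly the best a simple system can do: look for expected keywords.
    # "Gravity pulls stuff down" and "the force of gravity accelerates the
    # object toward Earth" express the same idea, but only one would match.
    text = response.lower()
    return 1 if all(keyword in text for keyword in required_keywords) else 0

print(score_multiple_choice(" b ", "B"))  # 1
print(score_short_answer("gravity pulls stuff down",
                         ["force", "gravity", "accelerates"]))  # 0, despite being a fair answer
```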

Even when a test can be quickly and accurately scored by computer software, there is no magical way to conjure truly personalized learning experiences for students based on their performance. Software may be able to feed a test score into an algorithm and deliver a pre-written lesson attached to that score, but this is far from the kind of adaptive experience that ed-tech sales teams are busy selling. Human beings have to create the differentiated lessons that AI will deliver to students, and human beings have to decide which combinations of correct and incorrect responses will trigger each lesson. By the time teachers are done writing all this material and tagging it with the proper algorithm-friendly labels, they have put in far more effort than it would have taken to simply hand alternative assignments to the students who are clearly struggling. That face-to-face type of differentiation, even in small doses, is far more effective and much less time-consuming than trying to build a software-driven system to do the same thing.
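Here is a rough sketch of what that “adaptive” machinery usually amounts to in practice. The score bands and lesson names are invented for illustration; the point is that a human had to write every lesson and choose every threshold before the software could “personalize” anything:

```python
# A toy "adaptive learning" rule table. Every lesson and every cutoff below
# had to be authored by a person first; the software just looks them up.

REMEDIATION_RULES = [
    # (minimum score, lesson a human already wrote for that score band)
    (90, "enrichment_activity_07"),
    (70, "standard_practice_set_03"),
    (50, "reteach_lesson_fractions_b"),
    (0,  "reteach_lesson_fractions_a"),
]

def pick_next_lesson(quiz_score: int) -> str:
    # "Personalization" here is just the first band the score falls into.
    for minimum, lesson in REMEDIATION_RULES:
        if quiz_score >= minimum:
            return lesson
    return REMEDIATION_RULES[-1][1]

print(pick_next_lesson(84))  # standard_practice_set_03
print(pick_next_lesson(42))  # reteach_lesson_fractions_a
```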

The other way that artificial intelligence is supposed to revolutionize our classrooms is more in line with the infamous experiments going on in China. The idea is to use biometric data, collected either from devices worn by students or from software that analyzes video of children in class. This type of scheme dangles the possibility of collecting mountains of data in real time, without the teacher or students having to interact with the software directly. That data could then be used to gauge how engaged students are during lessons by looking at things like blood pressure, pulse, eye movement, and other theoretical indicators of attention. A teacher could then be given a report, either in real time or after the lesson, showing where student attention may have waned and where students were keyed into what was happening.
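For what it is worth, here is a toy sketch of the pipeline being pitched. Every detail, especially the formula that turns pulse and gaze into an “engagement” number, is a placeholder I made up, because no validated version of that formula exists:

```python
# A toy version of the biometric "engagement dashboard" pitch. The sensor
# fields and the weighting are hypothetical placeholders; nobody has a
# validated formula that turns these signals into "attention."

from dataclasses import dataclass

@dataclass
class BiometricSample:
    minute: int
    pulse: int             # beats per minute
    gaze_on_screen: float  # fraction of the minute spent looking at the lesson

def engagement_score(sample: BiometricSample) -> float:
    # The crux of the problem: this weighting is pure guesswork.
    return 0.5 * min(sample.pulse / 100, 1.0) + 0.5 * sample.gaze_on_screen

def lesson_report(samples: list[BiometricSample]) -> list[tuple[int, float]]:
    # Producing a minute-by-minute report is trivial; knowing whether the
    # numbers mean anything about learning is not.
    return [(s.minute, round(engagement_score(s), 2)) for s in samples]

print(lesson_report([BiometricSample(1, 72, 0.9), BiometricSample(2, 95, 0.3)]))
```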

The problem with this kind of approach is that collecting lots of data quickly is the easy part. We have all kinds of wireless sensors that we can festoon our kids with, and we also have the ability to put cameras on every student in a classroom in order to watch them as they work through lessons. What is not so easy or straightforward is the analysis of all this data. It does not matter how quickly a computer can process information if, as humans, we don’t actually know what the information is supposed to be showing us. We cannot read the minds of students. A real-time measurement of brain waves or biometric data does not change this fundamental fact. For all of our wonderful technological toys, we humans have a remarkably crude and incomplete understanding of how learning actually occurs. Considering how far away we are from truly grasping how our own minds work, it should be no surprise that we are even further away from coding an artificial software version.

Make no mistake, the snake-oil flimflam of artificial intelligence is not going anywhere. Despite its inflated promises and poor returns, it serves two very important functions in the education business. First, it allows the tech industry to sell more hardware and software to schools. Second, and perhaps more importantly, it dangles the promise of decreasing the need for the most expensive part of any education system: teachers. The fact that it delivers a sad facsimile of teaching will not stop “intelligent” software from making further inroads into our schools and their budgets.

I am a National Board Certified physics teacher with 22 years of experience and lots of opinions about the world of education.