Big Teacher is Watching Your Kids
“Schools should be safe places for students to learn, not spaces where they are constantly surveilled.”
Across the country, students, parents, and children’s privacy advocates are raising questions about the use of facial recognition and surveillance technology in elementary, middle, and high schools, as well as on a growing number of college campuses.
The Issue
In response to mass shootings on campus, schools began building a massive digital student surveillance infrastructure, often with little regard for either its effectiveness or its impact on students’ civil liberties and privacy.
Deep Dive
“We’ve gotten no answers to all these questions: Under what conditions can a kid’s face be put into the system? Does the district need parental consent? Who can do a facial-recognition search?” [Washington Post]
Innocence Denied: Elementary Black Boys Under Surveillance and Inequitably Disciplined [Diverse Education]
“Schools are operating as testbeds for mass surveillance with no evidence, and no way to opt-out.” [EFF]
“This week my daughter’s school became the first in the nation to pilot facial-recognition software. The technology’s potential is chilling.” [NYT]
“Why do we need surveillance technology to promote play-based learning and wellness? …sensors and surveillance technology…can bring in a third-party commercial gaze to protected learning spaces.” [Boston Globe]
“Schools should be safe places for students to learn, not spaces where they are constantly surveilled. The faces of our children, parents, and teachers should not be continually scanned and uploaded to a database…” [NY ACLU]
“All of my kids were ‘B’ students, now they’re failing in most classes. That’s how bad it is,” said Ms. Perez, whose children are in 7th, 8th, and 10th grades. “What does that do to a kid’s mental health & their willingness to even try?” [WSJ]
“Phone sensors & campuswide WiFi networks are empowering colleges across the United States to track hundreds of thousands of students more precisely than ever before.” [Washington Post]
“Gen Z is the most intensively tracked generation at school. Millions now attend schools where online learning tools monitor their progress on basic math and reading skills alongside their daily social interactions.” [Technology Review]
“Microsoft is calling for increased government oversight on companies developing facial recognition technology, at a time when public concern about the relationship between tech and the public sector is growing.” [GeekWire]
“The real danger is that records are not being deleted. The concern is that it can become a permanent record and that it can be sold to third-party vendors.” [Campus Safety Magazine]
“Schools are supposed to be open environments, and they’re supposed to feel welcoming. Short of creating these mini prisons, that’s just not the right environment that people are trying to create in education.” [CNET]
“In schools, facial recognition technology will necessarily mean Black and brown students, who are already more likely to be punished for perceived misbehavior, are more commonly misidentified, reinforcing the criminalization of Black and brown people.” [New York Civil Liberties Union]
“…the concerns about facial recognition — namely privacy, accuracy, and racial bias — are even more worrisome when it comes to children.” [NYT]
“Based on instructor settings, the software may take screenshots of a student’s desktop, detect the number of computer monitors connected to a student’s computer, or record a student’s web traffic.” [Teen VOGUE]
“It embodies a very cynical view of education, that it’s something we need to enforce on students, almost against their will.” [Campus Reform]
“To the right of Charles’s picture was his behavior log. He had made faces during the Pledge of Allegiance at 8 a.m. (-1), but shared his sandwich with Kaylen at noon (+2). He hit Lucinda at 3 p.m. (-5), but showed empathy toward Pedro, who had stained his favorite dinosaur T-shirt during lunch (+4). Charles ended the day at zero.” [NYT]
“All this information, once compiled, could be exposed through data breaches, sent to child data brokers or misclassified, which could lead to outing students or wrongly identifying innocent students as threats.” [NYT]
“The embrace of such tools by parents and K-12 administrators alike has led to a fresh boom in the school safety technology market, with a handful of established companies and a growing crop of startups now competing to offer ever-more comprehensive surveillance capabilities.” [ED Week]
“Parents ought to read up on student monitoring in their districts, ask questions and, if necessary, speak out. When it comes to student privacy, parents may be complacent, but the tech companies sure aren’t.” [NYT]
“A lot of this software has more information on students than some banks do.” [WIRED]
“In fact, only about a quarter of incidents were reported to district officials on school days between 8 a.m. and 4 p.m., bringing into sharp relief how the service extends schools’ authority far beyond their traditional powers to regulate student speech and behavior, including at home.” [The 74]
Students also are balking at allowing third-party software access to their devices, with some services requiring that students give them permission to read their computer files, monitor their keystrokes, and analyze their biometrics. [REUTERS]
“Facial recognition is not a requirement for those devices or even involved in the process of taking temperatures. But the feature has emerged as a powerful way to market the devices.” [WIRED]
“According to many of those using them, exam surveillance tools are displaying bias against POC, transgender students, those with physical and neurological disabilities, and also those in low-income or rural households.” [thred.]
“Leaked software from Proctortrack raises questions about how carefully schools evaluate test-monitoring companies.” [Consumer Reports]
“The College Board is tracking students and sending information about their activity to advertising platforms at companies such as Facebook and Google.” [Consumer Reports]
“It makes no sense to bring this aggressive surveillance technology into our schools when no one has made a compelling case, either that it will meaningfully improve security or that it can be used without violating the privacy and civil rights of students, staff, and visitors.” [New York State]
“Under FERPA, educational institutions cannot legally share information that identifies a student — but both federal and state laws are vague about that definition. The proposed bill would limit the ways vendors could use, share, sell or rent student data.” [WGBH]
“NBC News reports its analysis of school files posted on hackers’ websites found files containing children’s personal information, including medical conditions and identifying data like Social Security numbers and birthdates.” [NBC]
Get Involved
Several organizations are leading the effort to demand answers from vendors and/or working to pass laws that would protect students on campus from biometric data collection, including the use of facial recognition and AI in schools.
Electronic Frontier Foundation (EFF): About Face
Fight For the Future: Stop Facial Recognition on Campus
Senator Gillibrand’s Data Protection Act would create a new independent agency called the Data Protection Agency (DPA), tasked with protecting consumer data at large.
Ask Questions
If your school is already using surveillance technology or is in the process of adopting it, you need to ask questions about student civil liberties and privacy. Here are some questions to get started.
Under what conditions can a child’s biometric information be put into the vendor’s database?
Can parents opt out of data collection?
Who owns the data collected?
How long is the data stored?
Can I ask to see what data has been collected?
Is this data being shared with data brokers? Police? Marketers?