AI and Surveillance in Higher Education

Lizzie Hughes
surveillance and society
Nov 30, 2023

In this post, Mark Swartz and Kelly McElroy reflect on their piece ‘The “Academicon”: AI and Surveillance in Higher Education’, which appeared in issue 21(3) of Surveillance & Society.


As members of the Library Freedom Project, Mark and I have both thought a lot about issues of privacy and data security in libraries and universities. When Surveillance & Society called for papers for a special issue focused on Artificial Intelligence, we used our article as an opportunity to dig into the ways AI is being used — intentionally or not — as a tool of surveillance on university and college campuses.

In particular, we wanted to examine the largely hidden effects that AI tools have on university students, especially through educational software and other technologies they are required to use as part of their studies. From test proctoring to predictive advising, many for-profit educational technologies incorporate machine learning, seeking to use large bodies of data to more efficiently monitor and manage university activities.

We chose narrative fiction, exploring the life of Maria, a fictional student navigating these tools, to highlight that whatever advances these technologies bring, they come paired with direct negative impacts on the people forced to use them. In the case of tools that collect personal data, the effects on privacy and autonomy can be grave.

One perhaps glaring omission from our piece is generative AI tools like ChatGPT. Since we wrote it, these tools have rapidly become the main topic of discourse around AI on university and college campuses. However, that discourse has been dominated by concern over how students may use these tools, perpetuating moral panics over plagiarism, rather than over how the tools themselves may appropriate student data or be put to other uses. In fact, the invasive plagiarism-detection tools we explored in our original piece have now been joined by faulty AI-detection tools.

Here we would simply reiterate the recommendations from our piece: universities must weigh benefits against harms. Before rashly banning these tools or leaping to adopt them, institutions must evaluate them through a process that includes opportunities for shared governance and personal autonomy.

Lizzie Hughes

Associate Member Representative, Surveillance Studies Network