In this blog post, Harris Kornstein reflects on the article “Under Her Eye: Digital Drag as Obfuscation and Countersurveillance,” which was published recently in Surveillance & Society.
In the fall of 2014, just as I was preparing my applications for PhD programs, I found myself under attack on Facebook. Like those of many other drag queens, my Facebook profile had been reported for allegedly using a fake name. And it turns out drag queens weren’t the only ones: what soon became the #MyNameIs campaign represented Native Americans and other communities of color, transgender people, sexual violence survivors,
healthcare providers, political dissidents, and others whose cultural naming conventions didn’t match Anglo-American standards, whose identities were not as immutable as Mark Zuckerberg believed, or who needed fine-tuned privacy options that the platform simply didn’t allow for. Our motley crew of Bay Area-based misfits protested at Facebook’s headquarters and eventually
received public apologies and some policy and procedural changes that afforded users slightly more autonomy in representing themselves. Still, as evidenced by the messages I receive on an almost-weekly basis, Facebook’s unfair policies continue to impact users to this day.
However, as a nascent media studies scholar, I always felt something was not quite right about working so hard to keep people on a platform that was problematic for so many reasons (reasons that have only become clearer since then). At the same time as I was working behind the scenes to restore people’s access to Facebook, I also wanted to think about how not always being visible to designers or developers — or perhaps even politicians or executives — might allow queer, trans, and other marginalized folks to avoid some of the larger surveillance threats of platforms like Facebook. Though I hadn’t yet read work by scholars like Kara Keeling, Simone Browne, Finn Brunton and Helen Nissenbaum, and many others that would deeply shape my research, I suppose I had a hunch that by embracing queer aesthetics like messiness, camp, and hypervisibility, we might actually find we could better protect ourselves from digital harms, all while expressing our authentic identities outside of pre-determined database categories.
As I describe in the introduction to my article “Under Her Eye: Digital Drag as Obfuscation and Countersurveillance” in Surveillance & Society, being enmeshed in the world of local drag performance, and connected to diverse performers around the world, I began to take note of the many ways drag queens inherently disrupted Facebook’s minimalist look and simplistic architecture. As I discuss in greater detail in the article, drag queens actually offer a set of tools for countering being tracked — or at least being held to account — that go beyond using non-legal names: in particular, making creative use of affordances like Facebook “likes,” relationships, status updates, and other personal information. (For a more colloquial how-to guide, I would recommend my alter ego’s essay “A Drag Queen’s Guide to Protecting Your Privacy on Facebook by Breaking the Rules,” published in Wired.) I am also quite interested in how drag makeup often confounds facial recognition algorithms, leading them to mistake drag queens for one another, often with comedic effects in drag communities — a topic I am exploring in greater depth through a forthcoming digital humanities photo project (see the above image for a teaser).
To sum up, “Under Her Eye” — part of a longer project on queer/trans techniques of countering surveillance capitalism — seeks to make a few interventions. First, I want to rethink assumptions about the veracity and authenticity of information, which theorists of privacy and obfuscation often take as a given, to acknowledge that there are often multiple truths, identities, or realities that are all valid at any given time — and that such multiplicities might be a strength rather than a vulnerability. Second, I want to expand understandings of drag beyond its role as a traditional performance and craft form, considering its twenty-first-century impacts beyond gender disruption and what it can tell us about identity and information in digital contexts. And finally, most broadly, I want to look to cultural practices that already exist to avoid, thwart, or otherwise mitigate the harms of surveillance — many of which are not intentionally oriented toward counter-surveillance or digital contexts, but provide helpful insights and tangible tactics nonetheless. That is, while much work by privacy activists, surveillance scholars, and technologists focuses on flashy tools or policy debates (which of course are important), by focusing on queer, trans, and other intersecting communities, I hope to take a page from Fred Moten’s poetry: “We’re already here, moving.”