FaceStealer

Matt Pearson
7 min read · Jun 12, 2023
Image by Jack Holmes @motion.picturesque / https://www.motionpicturesque.com/

So the client’s idea was to peel off users’ faces and reattach their digital skin to fashion models.

This act would be performed (consensually) in a booth with hidden cameras. We would photograph you from a few angles and construct a 3D model of your face. This could then be transposed onto the pre-posed, perfectly proportioned body of a model, gently cavorting on an animated billboard. You, the user, would be so delighted by this magic trick you’d spend a load of money in the client’s stores.

If we could also store the face in a database, linked to its owner’s contact details, we could email you further manipulations of your face at a later date. For continued delight. This database could become a literal face book, beyond the dreams of both Zuckerberg and Ed Gein.

I’m sure I don’t need to spell out the problems with this. Your face is perhaps your most personal possession. Not only can it unlock your iPhone, it is your inbuilt authentication system for all social interactions. It is the mouth which speaks your words and the eyes that betray your feelings. Your face is not to be given away cheaply.

Your face has value. Visa have been trialling face recognition for “signing” payments. Banks have been exploring something similar at cash points, seeing the face as a less stealable identifier than a PIN. We’re also well aware of how deep fakery can use your face to bolster the crowd at a Trump rally. There are many applications. The harder job, perhaps, is trying to think up any good uses for a database of faces owned by a retail client.

https://www.bbc.co.uk/news/technology-63106024

I confess I was drawn in by the technical challenge, though. I had an idea that, if we could instead build something that ran on users’ phones, they could at least “own” their face data. Then the permissions would have to be more explicit. So I prototyped a system that ran in a web browser. And it worked. It stole my face.

FaceStealer, working prototype

The Price Of Magic

In creative tech we often like to showcase our edgy possibilities with a little theatrical deception.

If we build an app that responds to someone waving at it, we will not draw attention to the camera required to detect that movement. For to do so would be to detract from the delight of the interaction. To ruin the magic. “Pay no attention to the man behind the curtain” is one of the implicit laws of interactive installations.

The “experiential” space often feels like it gets a free pass on these things. Interactive installations can feel very close to the realm of “The Arts”, which is an appropriate place to ask questions of tech and its human consequences. But when tech art experiments bleed over into commercial projects sometimes the line gets blurred.

Context is important. In most scenarios, hidden cameras, microphones, and other sensors are, rightly, seen as deeply creepy. But many experiential works live in public spaces, and we have become much more accepting of being tracked in public, whether by security cameras or by inadvertently appearing in the background of tourist photos.

But if a supermarket, also a public space, were to track us in the same way as a gallery installation (… or a train station, or a nightclub, or other places for which it might be easier to argue for beneficial applications of face or body tracking), it would likely have a damaging effect on those businesses. Customers would go elsewhere. If a business were to do it secretly, and be later exposed, it would be even more damaging.

Yet in the field of interactive tech, it is accepted. There is an unwritten complicity that being surveilled by cameras, microphones, sensors, is the price we pay for this cool interactivity/experience.

A question that needs to be asked is whether the reward outweighs the intrusion. But, equally often, this question is skipped, because most interactives have short life spans. They are usually built for events or campaigns, so ethical concerns get waved away on grounds of ephemerality. We don’t need to consider how the zeitgeist might be changing, or how the work might be seen 6, 12 or 18 months from now.

The irony here is that many of our future tech experiments might be unusually backward looking.

The Uncle Steve Test

I find a good rule of thumb, for anyone having difficulty weighing up the ethical acceptability of any new tech, is to personify it.

Take Amazon’s Alexa. Alexa is an obvious example of a commercial concern pushing the limits of what counts as an acceptable level of surveillance. And its ambition is to invade a much more private space: the home.

So, in evaluating Alexa’s ethical acceptability, let’s first try replacing the term “Alexa” with “a team of data scientists”. Rather than ask if it’s okay for Alexa to have an always-on mic in your kitchen, so it can respond to your vocalised desire for eggs, you could instead ask if it’s okay for a team of data scientists to listen in on this conversation.

A problem with this method is that you might actually imagine “a team of data scientists” would be rather more considerate of your data privacy than a retailing mega-corp. So perhaps we need to take it a step further and find a more representative model.

Think of Amazon as your wayward Uncle Steve. Steve is the uncle the family doesn’t really talk about. He’s spent a little time inside, for some vague indiscretions. But now he’s out, and is looking for a new job. One of the services he’s offering is to surreptitiously listen to your conversations. Or to watch you through a discreet camera. On behalf of the corporation. His service will allow you to say “Steve, we need eggs”, without having to get your phone out.

Amazon (and others) have a million great ideas like this. In retrospect, it’s hard not to laugh at the reviews of their Echo Look product from a few years back, essentially a bedroom camera. Uncle Steve was offering to watch you dress so he could give you fashion tips. The Echo Spot went a step further still. Here Steve, with mic, camera, and an internet connection, would sit by your bedside watching you sleep.

https://www.pcmag.com/reviews/amazon-echo-look

You are No Longer The Product

The good news is that Alexa is tanking. The robot butler who likes to watch you sleep racked up a $10 billion loss in 2022. And elsewhere, the “you are the product” business models that have defined the big tech landscape of the last decade have been encountering popular resistance.

Facebook’s Metaverse gamble has failed partly for this reason. The Quest 2, with its multiple cameras pointed at you and the room around you, sold poorly, possibly because people were resistant to buying a device covered in cameras from a company built on harvesting and selling personal data.

It’s been very reassuring to see a fightback against the surveillance capitalism problem. A few years ago I (drunkenly) argued, in John Higgs’s book The Future Starts Here, that we may have been radically overestimating the value of the data mountains everyone was so keen to accumulate. The networks were not capturing “real” human data; they were only recording the faces we, sometimes reluctantly, put on for the cameras.

And, of course, there is a fatal flaw with a business model that relies upon one’s users not really understanding that business model. What happens when they do? This is what we’re starting to see now, as the general public (especially the young) are much more savvy to ambient surveillance, and are starting to feel less happy about being anyone’s product.

We never got to have the privacy discussion with that retail client. My second piece of good news is that the “FaceStealer” project didn’t come together, so it never made it out of prototype. Which is why now I can look back on it and laugh. I’d like to think I’d have talked them round, and we’d have found some acceptable compromise. But I might be kidding myself.

Because this is all we can do. No one is regulating this type of commercial innovation, because this would mean impeding progress. And god forbid we impede progress. That’s the anti-growth coalition talking.

The only people with a good enough understanding of what we can do vs what we should do, are us — the designers, developers and managers — building this shit. We are the only barrier.

So this tale is for you, my skunkworks colleagues. I hope the story of The FaceStealer will raise a few hairs on the back of your neck, and give you the little shudder of a dodged bullet.
