Trusting “somewhat” is not enough: why we need to regulate face recognition

Say you’re in the market for a new doctor. You ask a friend if they trust their own doctor, to which your friend replies that they trust them “somewhat.” Would you go see that doctor?

Didn’t think so. And yet the Pew Research Center reports that a majority of Americans “trust” law enforcement to use face recognition technology responsibly. Look more closely at the study, however, and it’s clear that only 17% of respondents trust law enforcement “a great deal” with face recognition. The rest either don’t trust them at all, don’t trust them “too much,” or trust them only “somewhat.”

Source: Pew Research Center

You wouldn’t go to a doctor whom people only “somewhat” trusted. So when people say they trust law enforcement “somewhat” with face recognition, how much confidence does that really signal? As it turns out, the public is far more skeptical than Pew’s headline, and the media coverage that echoed it, would have you believe. In fact, as the public has learned more over time about how face recognition is used, opposition to it has grown.

This skepticism is even stronger among people on whom face recognition technology has been shown to perform worse: people of color, young people, and women. They tend to be less trusting of law enforcement’s use of face recognition than are white people, older people, and men. People of color are also more likely to come into contact with law enforcement, meaning the people on whom this technology is used the most are those on whom it performs the worst. Not surprisingly, they’re the ones who trust it the least.

Source: Pew Research Center

As it stands, law enforcement is not always deserving of the public’s trust. As the Center reported back in May, the New York Police Department uses celebrity look-alikes, heavily edited photos, and artist sketches of suspects in its face recognition system to find and arrest people. This essentially boils down to fabrication of evidence. And they’re able to get away with it because there are no rules around how agencies can use face recognition.

But whatever the public’s level of trust, rules are still necessary to ensure that law enforcement lives up to it. Trust and regulation are not incompatible: people generally trust doctors with their lives, yet the medical and healthcare fields are tightly regulated to ensure proper practice. Similarly, laws that govern how police use — or don’t use — face recognition are absolutely critical, no matter how much the public trusts them. Regulation ensures that the public’s trust is not just assumed, but earned.

Until such regulations have been thoroughly, democratically, and publicly discussed, law enforcement should not be using face recognition technology.

Jameson Spivack is a policy associate with the Center on Privacy & Technology and can be found on Twitter at @spivackjameson.
