
Smile, Your Face Is Now in a Database

How airports and DHS are using facial recognition on travelers

Photo by Joe Raedle/Getty Images

A crucial step has been added to international travel. If you aren’t a U.S. citizen, you might need more than just your documents at these airports: Hartsfield-Jackson in Atlanta, Logan in Boston, O’Hare in Chicago, Hobby and George Bush in Houston, McCarran in Las Vegas, Miami International, John F. Kennedy in New York, and Dulles International Airport in Washington, D.C. As part of Trump’s executive order banning certain Muslims from entering the United States, the Department of Homeland Security (DHS), specifically U.S. Customs and Border Protection (CBP), has ramped up and expanded its Biometric Exit program, which gathers biometric data from foreign nationals exiting the United States to document and verify that they’ve left. But instead of the fingerprints and document review DHS had been using as biometric identifiers since 2004, the focus is now on facial recognition.

John Wagner, deputy assistant commissioner at CBP, outlined the program’s vision in early May. The program was first trialed in June 2016 at Hartsfield-Jackson Atlanta International Airport. “We’re going to build this for [Biometric] Exit. We’re out of time; we have to,” Wagner told the audience at the Connect ID conference in Washington, D.C., referring to further implementation of facial recognition technology. “But why not make this available to everyone? Why not look to drive the innovation across the entire airport experience?”

Wagner is arguing here for the expansion of facial recognition technology not just to the flight check-in process, but also to airport waiting areas, external cameras, and other aspects of airport travel.

Wagner went on to say that this could mean using facial recognition to identify travelers arriving in the United States, regardless of their citizenship status, including passport-holding U.S. citizens. It could also be applied to TSA checkpoints or airport lounge access, according to The Verge. “As soon as you check in for arrivals or departure, we’re going to stage your photo in that database,” Wagner said. “We want to make it available for every transaction in the airport where you have to show an ID today.”

In practice, the process involves taking a picture of your face and cross-checking it against facial templates that DHS pulls from visa and passport images already on file. If you’re a U.S. citizen, DHS says it will delete your images within two weeks. If you’re not, DHS will keep them for 15 years.
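
To make the matching step concrete, here is a minimal, purely illustrative sketch in Python of how a one-to-many template comparison and the retention rules described above might work. The embedding gallery, similarity threshold, and every name in it are assumptions made for illustration; DHS has not published its implementation.

```python
# Illustrative sketch only: hypothetical names and data, not DHS's actual system.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional
import numpy as np

@dataclass
class GalleryRecord:
    traveler_id: str            # hypothetical passport or visa record identifier
    is_us_citizen: bool
    template: np.ndarray        # face "template": a numeric embedding of the photo
    captured_at: datetime

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Score how alike two templates are (closer to 1.0 means more similar)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_probe(probe: np.ndarray,
                gallery: List[GalleryRecord],
                threshold: float = 0.8) -> Optional[GalleryRecord]:
    """1:N search: return the best-scoring gallery record above the threshold, if any."""
    best: Optional[GalleryRecord] = None
    best_score = threshold
    for record in gallery:
        score = cosine_similarity(probe, record.template)
        if score > best_score:
            best, best_score = record, score
    return best

def purge_expired(gallery: List[GalleryRecord], now: datetime) -> List[GalleryRecord]:
    """Retention rules as the article describes them: citizens' images deleted
    within two weeks, non-citizens' images kept for 15 years."""
    def keep(record: GalleryRecord) -> bool:
        limit = timedelta(days=14) if record.is_us_citizen else timedelta(days=365 * 15)
        return now - record.captured_at < limit
    return [record for record in gallery if keep(record)]

# Toy usage: random vectors stand in for real face templates.
rng = np.random.default_rng(0)
gallery = [GalleryRecord("passport-123", True, rng.normal(size=128), datetime(2017, 11, 1))]
probe = gallery[0].template + rng.normal(scale=0.05, size=128)  # same face, new check-in photo
print(match_probe(probe, gallery) is not None)                  # True: scored above threshold
print(purge_expired(gallery, datetime(2017, 12, 1)))            # []: citizen photo past two weeks
```

The point of the sketch is simply that a “facial template” is a numeric representation of a photo, and that every check-in photo is scored against a stored gallery rather than reviewed by a person.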

The agency hopes to implement this program in the 20 highest-traffic airports by the end of next year. But while DHS and CBP see this as a way to help address national security and immigration, others are concerned about the civil rights and surveillance implications of the continued rollout of such a system. “They’re using facial recognition on American citizens without any explicit authorization,” said Harrison Rudolph, an associate at the Georgetown Center for Privacy and Technology who has been studying the program. “That’s a big deal.”

Congress has been clear that the Biometric Exit program is about collecting biometrics from foreign nationals; it has never given DHS permission to collect biometrics from U.S. citizens. Yet that is exactly what DHS is doing, according to Rudolph. The process began in 2016, and DHS has yet to put forward formal rules regulating how it’s done, meaning privacy and other concerns related to its use are governed by DHS decisions, many of which aren’t public.

According to Rudolph, many airlines participating in the program — such as JetBlue, American Airlines, United Airlines, and Emirates Airlines — have not yet updated their privacy policies to reflect these new developments. And while DHS claims that it deletes photos of U.S. citizens within two weeks of taking them, the agency is under no legal obligation to do so.

“My concern is that a facial recognition rejection can [create] bias,” said Rudolph. “So, if someone has a lot of faith in this technology and thinks that it’s foolproof, and someone is rejected by this system, that customs officer or gate agent may be predisposed to saying this person is traveling with fraudulent credentials. That’s a crime and a serious issue.”

Rudolph’s concerns are well founded. Past studies of top-of-the-line facial recognition systems have shown both racial and gender bias, meaning these systems are more likely to misidentify people of color and women. According to a 2012 IEEE study co-authored by an FBI technologist, the algorithms used in three advanced facial recognition systems all performed less accurately on people who weren’t white.

“Someone who bears a resemblance to a criminal or terrorist will find themselves harassed constantly,” said Jay Stanley, a senior policy analyst with the ACLU Speech, Privacy, and Technology Project. Further rollout of programs like these could “also create records of our movements by law enforcement and companies that can be used to monitor our behavior,” Stanley added.

Americans are technically allowed to opt out of the program, according to a DHS Privacy Impact Assessment, but according to Rudolph, who has attended meetings with customs officials, CBP only sometimes notifies travelers of that option.

The Supreme Court is currently hearing Carpenter v. United States, which in part examines whether it’s fair to assume Americans are fully aware that their cellphone data can be accessed for tracking purposes in criminal proceedings. Few people consider that fact when using their phones, particularly given how integral cellphones have become to day-to-day life. That awareness, along with a broader understanding of how our data is used, how it’s accessed, and who is able to access it, is one of the key questions at the heart of the case; the lawyers for Carpenter will argue that we do not have it when it comes to the data we produce. Similarly, when people step up to a camera and have their picture taken at check-in, it’s doubtful they fully understand the implications of having their images gathered at airports and what those images might be used for today or in the years to come. By engaging with the program without understanding its potential ramifications, travelers are helping to normalize an abnormal program.

A more traditional means of biometric exit verification is fingerprinting. While people may not always know exactly what their prints are being used for, they at least understand the idea behind fingerprinting and the extent of what it can be used for. Perhaps we have CSI to thank for that.

“That’s not at all the case with facial recognition,” said Stanley, because it can be done invisibly. “That’s one of the things that make it dangerous. In addition to the fact that we have tens of millions of facial recognition input devices, such as surveillance cameras, further rollout could be centralized, and people are being subjected to face recognition checks without their consent.”

In the decade after 9/11, the United States added approximately 30 million surveillance cameras to its streets, but few citizens notice them nowadays. Regular engagement with facial recognition, even if initially only while traveling, can potentially heighten our tolerance for a whole new type of surveillance, one that can affect us in unexpected and profound ways.

“People still don’t really understand the extent to which these systems are being interwoven into centralized comprehensive surveillance systems, and they don’t feel it yet,” said Stanley. “There is always a lag between when you start to lose your privacy and when you start to realize and feel the consequences. I think we are still in that lag period.”

Rudolph asks people to consider the effects such systems, if they become more widespread, might have in situations like the protests in which citizens descended on airports to oppose Trump’s executive order banning certain Muslims from entering the country. Would people have gone if they had known they might be subjected to facial recognition software and have their photos stored in a DHS database? Is that a risk people are willing to take when it comes to protesting? Such a scenario is hardly hypothetical: in Baltimore, during the Freddie Gray protests, police used social media monitoring and facial recognition technology to arrest protesters directly from the crowd.

“Now not only do border biometrics have the risk of killing that iteration of speech, but there is legislation in Congress, [like] the TSA Modernization Act, which would take that type of tech and propose to put it throughout the airport terminal environment,” said Rudolph. “So, yes, this unfortunately may be the vanguard of airport surveillance systems that would generate a very serious risk to free speech and free association at airports, and that could have serious consequences for Americans’ First Amendment rights.”

This isn’t the only instance in which DHS hopes to use facial recognition technology. In early November, DHS put out a call for technology able to “capture facial recognition quality photos from travelers in order to facilitate identity checks without requiring occupants to leave the vehicle.” Essentially, DHS wants tech that captures images and runs facial recognition on multiple people at once as they cross the border in a vehicle, even if they aren’t looking directly at the camera.

“Eventually people are going to start to wake up and hear about police harassment because of supposedly suspicious activity compiled by a video analytics algorithm combined with a facial rec tech, and they will feel watched, because they are being watched by algorithms everywhere they go,” said Stanley.