Unpacking what a win for the FBI would mean in the Apple case


How far must a technology company go to help law enforcement? This is now the big question, after a federal court recently ordered Apple to help the FBI break into a San Bernardino gunman’s iPhone. Apple — which just formally responded to the court’s order — has publicly opposed the FBI’s request, arguing that the order would set a “dangerous precedent” and “has implications far beyond the legal case at hand.” But what, exactly, would those implications be for social justice, and how would this potential precedent play out in the real world?

The debate is about much more than this one iPhone, in this one case.

If Apple can be forced to help the FBI break into this device, expect state and local law enforcement to ask Apple — and other tech companies — to do the same. Just this week, it was revealed that Apple has objected to or challenged at least 12 other government requests over the past few months to extract data from locked iPhones.

From Apple’s ongoing litigation in the Eastern District of New York. Source: https://www.documentcloud.org/documents/2718704-Zwillinger-Chart.html#document/p1

There’s no reason to think these requests are limited to terrorism cases.

They could be related to drug investigations or other law enforcement activities. We don’t currently know the nature of these other cases, but members of local law enforcement have already been vocal about using similar tools. According to the New York Times, Cyrus Vance Jr., the District Attorney for Manhattan, “reject[s] the notion that Apple should be forced to cooperate only in certain prominent crimes.” “There are tens of thousands of other cases around the country … where data is going to be on smartphones that prosecutors and police officers need to access,” Vance told Charlie Rose.

There is a clear historical pattern of counterterrorism tools and tactics trickling into the hands of local law enforcement. Fusion centers — which were initially created in response to 9/11 to combine federal, state, and local surveillance data — now largely work in support of local law enforcement efforts. Stingrays, devices that are used to track the location of cell phones, were originally developed for military and intelligence applications. But now — thanks in part to grants from the Department of Homeland Security — local law enforcement agencies often use the devices “to locate the perpetrators of routine street crimes.” A favorable precedent for the FBI in this case could become yet another example of a much larger national trend: the use of counterterrorism tools and tactics for local law enforcement purposes.

This is about the whole technology industry — not just Apple.

The outcome of this debate would affect many other companies and secure communications tools, like Signal — an encrypted messaging service that many in protest movements like Black Lives Matter rely on, knowing that there is a real risk of government surveillance otherwise. As Nate Cardozo, a staff attorney at the Electronic Frontier Foundation, argues, if the FBI’s argument succeeds in the San Bernardino case, then we may be heading down a slippery slope where services like Signal would be required to weaken their encryption protocols so law enforcement could surreptitiously surveil their users. As NPR’s Alina Selyukh writes, “the case is being watched by any company whose customers’ data privacy depends on their security software updates, which is a unique universe of encrypted messaging services like WhatsApp and Signal but also software-makers like Apple and Google.”

Apple is reportedly already working to engineer an even more secure iPhone — one so secure that Apple could not help the FBI extract data even if it wanted to. Once this happens, there will be an even bigger public debate about whether tech companies ought to be allowed to build technologies so secure that even they can’t get into them — a debate that law professor Orin Kerr recently previewed.