Law enforcement and the intelligence community need to get past their desperate obsession with “Going Dark” and innovate along with the rest of the world in order to cope with the new digital reality of the 21st century.
The congressional House Homeland Security Committee this week published a majority staff report called Going Dark, Going Forward: A Primer on the Encryption Debate that purports to be an evenhanded overview of the issue. (All quotes below are from the report.) Unfortunately, despite what must be the best of intentions at objectivity, the report remains mired in old thinking and demonstrates a profound lack of understanding of technology.
The Executive Summary
Right off the bat, the first sentence is simply not true.
Public engagement on encryption issues surged following the 2015 terrorist attacks in Paris and San Bernardino, particularly when it became clear that the attackers used encrypted communications to evade detection …
The FBI/Apple dispute over the San Bernardino iPhone was over defeating the encryption protecting stored data, not encrypted communications. Law enforcement access to communications falls under the Communications Assistance for Law Enforcement Act (CALEA), which does not apply to stored data: the distinction is a critical one for any serious treatment of this issue, and the report gets it clearly wrong.
Details of the Paris attackers’ methods are unclear, and there are many credible reports suggesting that government claims of encryption use are baseless. Since encrypted data is by definition indecipherable, it cannot be identified as relevant unless decrypted, so such claims are inherently speculative.
Since deep secrecy invariably envelops these investigations, it’s impossible to confirm any facts, which is very much part of the problem: these claims cannot be disputed. The report’s statement that the attackers’ use of encryption “became clear” is therefore deceptive if not false.
What is already clear is that the anonymous authors of this report have drunk the law enforcement Kool-Aid, whether they know it or not. This is no primer: it’s an attempt to frame the issue and limit debate to parameters acceptable to the law enforcement status quo.
Some have framed the debate surrounding encryption as a battle between privacy and security. Our extensive discussions with stakeholders, however, have led us to conclude that the issue is really about security versus security…
Rejecting the concerns of the privacy community out of hand, the report offers its conclusory alternative framing with a breathtaking assertion that dismisses personal privacy as being at issue at all. The authors choose instead to focus on the government’s need for strong encryption to protect its assets and infrastructure versus law enforcement’s desire for backdoor access. By this logic, since the private communications of citizens have no national security import, the prospect that our privacy might be violated by pervasive surveillance is nothing to be concerned about.
The rest of the executive summary assures us there are no “simple solutions” due to “troublesome trade-offs” and “unintended consequences”. I think it’s safe to say that we knew that. Perhaps most disappointingly, all the report suggests doing is forming a committee of “experts”.
Please, not another expert committee
As the report correctly points out,
legislative proposals seem to determine clear “winners” and “losers”
The same will apply to any conclusions of this new committee, which will be stocked with potential winners and losers. And what exactly could a congressional committee produce other than a legislative proposal?
Let’s look at the proposed constituency of the McCaul/Warner committee:
- Cryptology, global commerce and economics, and technology sectors
- Federal, state and local law enforcement
- Intelligence Community
- Privacy and civil liberties community
It doesn’t take a crystal ball to foresee that the likelihood of privacy and civil liberties advocates being very influential in this committee is nil. These are the same experts who have been batting this around to no avail for years, so why does anyone dream that this new group will do any better?
How will the meetings of this committee likely go? The report itself tells us there is a considerable level of rancor between camps around this issue.
What’s more, many stakeholders involved in the discussions surrounding this issue feel their motives, patriotism, and even their intelligence are called into question by those who oppose their point of view.
Nonetheless, the report insists that yet another committee will somehow overcome this rancor, engage in new and productive discussions, and find a mythical compromise solution.
Not a new problem
According to FBI Director James Comey, “Going Dark” refers to the phenomenon in which law enforcement personnel have the “legal authority to intercept and access communications and information pursuant to court order,” but “lack the technical ability to do so.”
First, it’s important to understand that “lacking the technical ability” is actually an old problem, not a new technical phenomenon. Legal searches have always been limited in what evidence they can find and interpret, be it whispered conversations or the use of an unknown language.
Scribbled notes passed hand to hand, “dead drops”, hushed conversations in public parks: all of these “went dark” ages ago. Used for espionage or terrorism, this sort of information is something law enforcement would certainly have the legal authority to access, but it has always lacked the technical means to do so unless a bug or camera happened to be at just the right time and place.
Modern digital technology simply makes concealing information easier and more pervasive, and it enables instant global communication. But that same technology also provides incredible opportunities for smarter investigations — that’s the direction the government needs to go.
The report lists seven general findings — actually there are five findings which are straightforward and uncontroversial, plus a couple of ringers.
1. Encryption is widely used with modern digital technology.
2. Law enforcement investigations regularly encounter encrypted data.
3. Technology, including encryption, is essential to protect infrastructure.
4. These issues are worldwide.
5. Legislative proposals have failed, being either too heavy-handed or ineffective.
From here they shift to baseless opinion, not findings at all.
6. The impacted parties themselves need to directly engage one another in an honest and in-depth conversation in order to develop the factual foundation needed to support sustainable solutions.
In my view the likelihood of (for example) the FBI sitting down with Apple engineers and finding a solution everyone likes is slim to none, and such meetings are hardly necessarily the answer. There is a fundamental conflict of goals that dialog simply will not resolve. Dialog may possibly help, but it isn’t a given that dialog is the answer.
7. The debate surrounding the abuse of widely available encryption technology is part of a larger question of ensuring that law enforcement and national security efforts keep pace with technological advancement without undermining American competitiveness and American values.”
This is actually law enforcement’s ultimatum — not a finding at all — and it’s the reason that no progress is being made. This would couch the issue as being about how to give law enforcement what it wants. However, when you recognize that they never did have the technical means to capture all bad guy communications and documents in the first place, it’s clear that there’s nothing to “keep pace” with. This is pure wishful thinking — the last thing we want from our law enforcement community.
The bad guys have always had ways of getting around law enforcement investigations, going back to ancient times. Technology advances, providing both better tools and more opportunities for abuse. This is not something we can “fix”. The only way to avoid it is to take away everyone’s rights, or to attempt to ban anything that might be dangerously abused.
Ancient laws and high tech don’t mix
The report notes, but fails to question, the wisdom of applying 18th century law to a 21st century technology problem.
The government has relied on the 1789 All Writs Act (“AWA”) to help law enforcement gain access to certain encrypted communications.
It goes without saying that the authors of the original legislation could not conceivably have imagined smartphones and the internet, much less the digital technology that makes them possible. That this law compelling “assistance” is even being applied can only be due to a poor understanding of the technology by the modern judges issuing these orders.
Strong security, properly implemented, cannot be subverted even by its makers. That’s a basic requirement for serious security; it’s negotiable only if you’re willing to compromise the level of security.
The notion of compelling assistance is also not a new one. Asking Apple to break the encryption of a customer device is like asking a stationery maker to interpret some unintelligible scribblings a customer has made, or asking the telephone company to assist when agents can’t make out what is being said on a wiretap.
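To make the “cannot be subverted even by its makers” point concrete, here is a minimal sketch of client-side encryption in which the key is derived on the user’s device and never reaches the software maker. All names and parameters are my own hypotheticals; real products use vetted AEAD ciphers, not this toy keystream.

```python
import hashlib
import hmac
import secrets

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # The key exists only on the user's device; the software maker
    # never sees the passphrase or the derived key.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # Toy stream cipher: an HMAC-based keystream XORed with the data.
    # XOR is its own inverse, so the same call encrypts and decrypts.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hmac.new(key, nonce + counter.to_bytes(4, "big"),
                           hashlib.sha256).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

salt, nonce = secrets.token_bytes(16), secrets.token_bytes(16)
key = derive_key("correct horse battery staple", salt)
ciphertext = keystream_xor(key, nonce, b"stored only as ciphertext")

# The maker, holding only salt, nonce, and ciphertext, has nothing to
# decrypt with; the user re-derives the key and inverts the XOR.
assert keystream_xor(key, nonce, ciphertext) == b"stored only as ciphertext"
```

A court order served on the maker of such a system cannot yield plaintext, because the maker holds no key; weakening exactly that property is what “backdoor access” demands.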
Privacy is not an option (the report says)
Again the report authors are selling the law enforcement position as impartial fact in summarizing the privacy aspect of this issue.
Yet others have suggested it is necessary to sacrifice some level of privacy to ensure that Americans are kept safe from harm.
That’s a terrible sentence: an anonymous suggestion, couched in indirect language, that nonetheless asserts a necessity (curtailing our civil rights, no less) without basis, as if it were a mathematical truth.
Moreover, courts generally agree that there is no absolute right to privacy in America…
The argument that “there is no absolute right to privacy in America” is a real red herring. If we even knew what constituted an “absolute right to privacy”, it would presumably limit the government to only such private information as citizens choose to disclose. Clearly this is a nonsensical strawman that nobody has seriously suggested. That our privacy is subject to some constraints by no means justifies further erosions at the convenience of the government.
Nonetheless, this report continues to stake out pro-law-enforcement positions as givens for the ensuing debate. Why exactly is sacrifice necessary? The CIA Director says, “I don’t know what the best way is [to solve the encryption question]”, and until we know the best answer we should not rule out possibilities and pontificate about what might be necessary.
The Director doth protest too much, methinks
“There is no doubt that the use of encryption is part of terrorist tradecraft now because they understand the problems we have getting court orders to be effective when they’re using these mobile messaging apps, especially that are end-to-end encrypted.”
— FBI Director James Comey (December 9, 2015)
First off, describing the use of encryption as “terrorist tradecraft” is extremely overblown rhetoric and a lame attempt to make it sound scary. Anyone using social media, or even just Google Search, is using encryption whether they think about it or not. The term “tradecraft” implies an arcane technology requiring special expertise and equipment beyond the ken of regular citizens. It’s disturbing to imagine that the FBI doesn’t thoroughly understand by now that digital communication is encrypted by default, not as a specialized technique, yet here we have it unambiguously presented straight from the top in black and white.
It also reflects a very twisted perspective for federal law enforcement to think that bad guys hold such nuanced views about the functioning of our courts. Bad guys use encryption to evade detection in the first place, not as a line of defense after the FBI is on to them and filing court orders. Bad guys will continue to use encryption no matter what expert committees do and whatever laws are passed, because it’s the smart thing to do for any private communication. And even if the FBI gets backdoors into all commercial software, the bad guys will find or make their own: it isn’t hard, and there are countless copies of the tools out there.
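The “it isn’t hard” claim deserves a concrete illustration. A one-time pad, sketched below in a few lines of Python (illustrative only; the hard part of a real one-time pad is key distribution, not the math), gives anyone an encryption layer that no backdoor in commercial software can penetrate:

```python
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    # XOR with a random key of equal length. Without the key, the
    # ciphertext reveals nothing; with it, the same XOR decrypts.
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at the park at noon"
key = secrets.token_bytes(len(message))  # shared out of band, used once
ciphertext = otp_xor(message, key)

assert otp_xor(ciphertext, key) == message
```

A message pre-encrypted this way before it ever touches a messaging app is opaque to any mandated backdoor in that app.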
It’s one thing for law enforcement to stake out a position and lobby for it, but it’s another for the people responsible for protecting us to entertain fantasies that are simply unrealistic and make no sense. It’s a huge waste of time as well as a diversion of effort and resources away from important investigation and enforcement work that might actually bear fruit. I would hope that internally more reasoned and pragmatic thinking prevails, and it’s understood that the public gesturing is just for political show. Even so, don’t we need to have honest and open discussions of realistic positions in order to make serious progress on this issue?
The dirty secret the report never mentions
Here’s one key part that the FBI never talks about. Let’s suppose the FBI gets their way and the entire US software industry converts to 100% certified backdoor encryption enabling law enforcement to instantly read every bit of data in all of our networks, computers, and devices.
Will the bad guys give up? Will the bad guys go ahead and use these systems knowing they are exposed, and hope for the best? Are the bad guys dumb enough to think they can plot their evil schemes with these tools anyway?
No, no, and no (with the possible exception of a few incompetents on that last one). Even if Americans agree to compromise their own privacy rights, most of the rest of the developed world does value personal security, and you can bet that demand will be met in the market with backdoor-free strong encryption. Then we will have sacrificed our high tech competitiveness, our personal privacy, and the freedom to innovate in software, and hardly slowed down the bad guys at all: that’s the FBI’s “best case proposal”. Seriously.
A former dirty secret, now common knowledge
Up until mid-2013 US law enforcement actually had its cake and ate it too. Until the Snowden leaks revealed the truth, the world scarcely imagined the scale of pervasive digital data surveillance by the US government.
Things were ideal for the intelligence community back then. Encryption algorithms that we now know to be flawed, with secret backdoors effectively built in (such as Dual_EC_DRBG), had been perpetrated as government standards, allowing behind-the-scenes decryption. The bad guys were indeed tricked into trusting this stuff, precisely because the public was also deceived, which gets to the core of the problem. Yet this was only possible by exposing everyone to the risk that, once knowledge of the stealth backdoor leaked, everything protected by it could be broken.
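To see how a standardized backdoor of this kind works in principle, here is a toy analog (entirely hypothetical names and construction; the real Dual_EC_DRBG flaw involves a relationship between elliptic-curve points, not an XOR mask). Anyone holding the designer’s secret can recover the generator’s internal state from a single output and predict everything that follows:

```python
import hashlib
import secrets

# The designer's secret: whoever holds it can unmask any output.
BACKDOOR_KEY = b"known-only-to-the-designer"

def mask(state: bytes) -> bytes:
    # Outputs are the internal state XORed with a fixed keyed mask,
    # so the state leaks to anyone who can compute the mask.
    m = hashlib.sha256(BACKDOOR_KEY).digest()[:len(state)]
    return bytes(a ^ b for a, b in zip(state, m))

class BackdooredPRNG:
    def __init__(self):
        self.state = secrets.token_bytes(16)

    def next_output(self) -> bytes:
        out = mask(self.state)                                 # leaks state
        self.state = hashlib.sha256(self.state).digest()[:16]  # advance
        return out

def attacker_predict(observed: bytes) -> bytes:
    # With the backdoor key, one observed output yields the internal
    # state (XOR masking is its own inverse) and thus the next output.
    state = mask(observed)
    next_state = hashlib.sha256(state).digest()[:16]
    return mask(next_state)

rng = BackdooredPRNG()
first = rng.next_output()
assert attacker_predict(first) == rng.next_output()
```

To everyone without the secret, the outputs look random; once the secret leaks, every system seeded by the generator is retroactively exposed, which is exactly the risk described above.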
Law enforcement wants the bad guys to think their systems are secure while it retains access. The only way to do that is for the public to be deceived as well. Now that this has been revealed (three years later, with new revelations continuing to come), there is no going back: the public is on to the game and not going to put up with it again. It was an audacious act by the intelligence community, and it took an audacious leak to expose it and begin repairing the damage done to public trust.
While some claim that sacrifice is necessary, you can bet they are referring only to public sacrifices, not to changing the ways of the federal intelligence community and law enforcement, who are all too eager for the public to give up more of its rights.
As the report correctly states,
we must first develop a … common understanding of what the problem actually is
How can the industry and the public possibly have a balanced exchange of views when the government so pervasively cloaks every aspect of their side of this in secrecy? Even after the Snowden disclosures the government only begrudgingly confirms scant details of its surveillance operations which are still overseen by the secret FISA court. Yet oversight simply isn’t working: we have learned that intel supposedly only for national security level investigations is covertly channeled to support drug cases.
We need the government to embrace a whole new level of transparency and pull back the black curtains surrounding how all of this works. Without compromising the actual data, there is no reason they cannot publicize full details of: the architecture and connectivity of their systems; internal policies controlling access and authorization; how accidental and intentional abuse is mitigated and audited; metrics on volumes of data and searches; statistics on investigations and prosecutions supported, including outcomes; schemas of data feeds and databases; data retention policies; and even how they use encryption (presumably without any backdoors) to safely and effectively protect what must be among our country’s most sensitive data.
I’m well aware this has not even been seriously considered and the intelligence establishment surely finds such a proposal ludicrous. Perhaps they have good reason to think so but without more public disclosures how would we even know that is so? The Snowden disclosures have proved conclusively that these agencies are not above lying to the Senate about these matters, so blind trust just isn’t good enough anymore.
Industry could be more transparent too, though it’s a relatively minor part of the whole issue. Much of the technical infrastructure that digital products and services are built on is already well documented and open sourced (e.g., OpenSSL, kernel.org), but big chunks remain proprietary. Digital Rights Management continues to frustrate efforts at openness, and restrictive laws such as the DMCA dissuade security researchers from doing good work and complicate the disclosure of their results.
Nonetheless, the requirements for customer-facing apps are clear, and major software makers today generally have competent staff ensuring the security and privacy of products and operations with integrity. The government would do well to match that bar; there are good efforts, such as the US Digital Service, working hard to move in that direction, but there is far to go.
The status quo is largely security through obscurity, rightfully derided by the information security community as wrongheaded. Yes, there are limits to openness, and choosing them requires careful consideration and analysis, but that is exactly the kind of thinking this system desperately needs. It’s too easy to just draw the curtain over everything, and that becomes a slippery slope reflexively applied over time. We can do better.
American strength and pride
American freedom and democracy are what make this country strong.
We value free speech enough to let extremist groups publicly parade or publish insidious and hateful diatribes, because as a society we are strong enough to tolerate it and we understand that suppression is not the answer.
Our legal system is full of checks and balances (the 5th Amendment, Miranda rights, and many more) that occasionally result in criminals going free, because we do not tolerate even a remote chance of convicting an innocent person. Law enforcement accepts these restrictions and is better for it. Though the system remains imperfect, it’s far better than an unchecked legal system vulnerable to going awry.
National security has become a pervasive “trump card” (especially since 2001 and the Patriot Act) and we have become too quick to agree to anything in order that everything possible can be done. The big world is indeed a scary place but total secrecy about every aspect of the battle to protect us leaves us to merely imagine the threats and countermeasures. We cannot sanely balance the trade-offs this report correctly represents as inherent while everything is kept in the dark from us, entrusted to shadowy organizations.
Report Appendix: legal access standards
The report contains a handy appendix summarizing the legal standards required for law enforcement to gain access to the private information of citizens. This is offered without explaining the basis of access, nor any justification of why we are wise to give up our privacy rights so readily. There are three categories of private information subject to access:
- Relevance: Name/Physical Address/#, Call Times/Duration, Billing Records (I believe # is phone number)
- Reasonable Suspicion: To/From IP Addresses, Transactional Records (apparently this includes the internet traffic your ISP sees but is not well defined)
- Probable Cause: Emails, Text Messages, Pictures (and contents of your PC or smartphone)
We desperately need good transparency here beginning with exactly how this law is interpreted and implemented. What checks and balances are in place to ensure that vague standards like “relevance” and “reasonable suspicion” are strictly complied with? How much are these new powers used and by whom for what kinds of investigations? How often is the private information of innocent citizens compromised as a result of these extraordinary accesses?
Probable cause is presumably checked by judges issuing warrants but how well informed is the judiciary about the technical aspects of this data collection? Do judges simply accept as fact what they are told or do they confirm anything independently? How can secret reviews by the FISA court prevent abuses when by design the judge is only hearing one side of the story?
Some companies have begun to publicly disclose the numbers of requests they receive under gag orders, but when will there be a national accounting of how often, and against how many people, domestic and foreign? Accepting that specifics cannot be published for obvious reasons, what about eventual disclosure within seven years (or longer as appropriate) when no action results from the intrusion?
Next step: transparency, not another committee
I believe the next step is the sunshine and fresh air of a serious effort at transparency by the government. That, not another committee of experts, would allow the rest of us to see what the government is doing, how, and why, and might lead toward a better common understanding and a way forward.