This article originally appeared in TidBITS on 2016-02-17 at 11:35 a.m.

Thoughts on Tim Cook’s Open Letter Criticizing Backdoors

by Adam C. Engst

Over the past few years, Apple CEO Tim Cook hasn’t been shy about speaking out against government requests for Apple and other technology companies to introduce backdoors into their products (see “Apple and Google Spark Civil Rights Debate [1],” 10 October 2014). His campaign seems prescient, given that the FBI has now asked Apple to create a new version of iOS that would let the agency discover the passcode by brute force, and to install that software on an iPhone recovered during the investigation of the San Bernardino terrorist attack [2]. Claiming that this is equivalent to the creation of a backdoor in the iPhone, Apple is fighting the FBI’s request. Apple has published Cook’s “A Message to Our Customers [3]” to share the company’s position.

I encourage everyone to read Tim Cook’s piece; it’s clear and convincing, and it lays out the tensions well. Apple has cooperated with the FBI up to this point in response to legal subpoenas and warrants, and Cook is careful to state both Apple’s respect for the FBI and belief that the FBI’s intentions are good. But he comes out strongly against the FBI’s desire to use the All Writs Act of 1789 to justify an expansion of its authority in a way that would enable it to compel Apple to create the necessary cracking tool (more on that in a bit).

This is completely in line with Apple’s stance on backdoors thus far, as Cook’s introduction to Apple’s privacy policy [4] makes clear:

Finally, I want to be absolutely clear that we have never worked with any government agency from any country to create a backdoor in any of our products or services. We have also never allowed access to our servers. And we never will.

Backdoors Want to Be Free -- Tim Cook’s argument in his open letter, which is supported by security experts and many members of the intelligence community alike, is simple. To paraphrase:

Any tool or backdoor that enables a government or law enforcement agency to legally bypass a product’s inherent security features can also be subverted by cybercriminals and hostile governments.

Since we’re talking about the digital world here, any such “master key” would itself be digital data, and would be only as secure as the other protections afforded it. Unlike physical objects, information can’t be restricted to a single location and barricaded inside vaults protected by armed guards.

The information that comprises such a master key would inhabit an uncomfortable space. On the one hand, information wants to be free, and on the other, such information would be of incalculable value.

Let’s unpack that sentence. “Information wants to be free [5]” was coined by Whole Earth Catalog founder Stewart Brand in 1984 with regard to the rapidly falling cost of publishing. In 1990, software freedom activist Richard Stallman restated the idea with a political slant, arguing that what information “wants” is freedom, in the sense that wide distribution of generally useful information makes humanity wealthier. A digital master key would certainly be generally useful, if not necessarily for positive purposes, so Stallman’s point implies that it would be extremely difficult or even impossible to prevent such information from escaping into the wild.

Preventing such an escape would be made all the harder by the fact that foreign intelligence agencies and criminal organizations alike would undoubtedly pay immense sums of money for access to such a master key. A security exploit broker called Zerodium already paid $1,000,000 for a browser-based security exploit in iOS 9 that it planned to resell to government and defense customers (see “The Million Dollar iOS Hack (Isn’t) [6],” 3 November 2015). And that’s for something that Apple probably found and blocked in one of the updates to iOS 9. Can you imagine how much an iOS master key would sell for? And how that kind of money could corrupt people within the chain of trust for such a key? Like Tolkien’s One Ring, the power of a master key for millions of phones would be nigh-impossible to resist (unfortunately, Apple’s latest diversity report didn’t break out the number of courageous hobbits working at the company).

Tim Cook is right: any sort of backdoor is a terrible idea that could result in immeasurable harm to the hundreds of millions of people who use Apple products, the overwhelming majority of whom have an entirely legitimate expectation that their private information will be protected from disclosure, regardless of who wants that data.

Worse, because information wants to be free, it’s now well within the capabilities of criminal and terrorist organizations to create and use their own unbreakable encryption software. Even if backdoors were required, the window in which they would be useful would likely be short, as everyone with something to hide switched to something that was guaranteed to be secure.
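That capability really is within anyone’s reach. Even the one-time pad, which is unbreakable in the information-theoretic sense when the key is truly random, as long as the message, and never reused, takes only a few lines of standard-library Python. This is a toy sketch to illustrate the point, not a practical protocol:

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """One-time pad: XOR the message with a random key of equal length.

    With a truly random, never-reused key, the ciphertext reveals
    nothing about the plaintext -- no backdoor or master key can help.
    """
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """Decryption is the same XOR applied again."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

message = b"meet at dawn"
ciphertext, key = otp_encrypt(message)
assert otp_decrypt(ciphertext, key) == message
```

The practical catch is key distribution, which is why real systems use vetted cryptographic libraries instead; but the demonstration stands: strong encryption is public knowledge, and no mandate on Apple can take it back.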

Why the All Writs Act of 1789 and Why Now? -- The one place where Tim Cook’s open letter stumbles slightly is in its description of the FBI’s use of the All Writs Act of 1789 [7] as “unprecedented.”

Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.

Although the All Writs Act does indeed date to 1789, and is intentionally vague (it enables U.S. federal courts to “issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law”), it has been used regularly in modern surveillance-related cases, some involving Apple. According to Stanford professor Jonathan Mayer in a lecture linked to by Wikipedia, the All Writs Act is used as a catch-all authority for the courts to supplement warrants, but only when:

1. No other statute or rule addresses the issue.
2. The party being compelled has some connection to the case.
3. The circumstances are extraordinary.
4. Compliance would not impose an unreasonable burden.

It seems relatively uncontroversial that the first three requirements apply to this case. If other, more specific, statutes or rules applied, the FBI would be using them. Apple is a third party with a connection to the case through the terrorists’ use of an iPhone. And a case involving domestic terrorism certainly qualifies as an extraordinary circumstance. Less clear is what constitutes an “unreasonable burden.”

Before iOS 8, extracting the contents of an iPhone without unlocking it was possible for Apple, and when the company was asked to do so by courts invoking the All Writs Act, it complied. However, in a New York case that’s still pending, the judge suggested that the All Writs Act may be imposing an unreasonable burden [8], and it’s our understanding that Apple is using that as an opportunity to resist the government order. (For more background, read the EFF’s coverage [9] of the situation.)

To you and me, the FBI forcing Apple to create a dumbed-down version of iOS and install it on an iPhone recovered in the San Bernardino case might seem like an unreasonable burden, if only in terms of the effort involved. But given Apple’s resources, the FBI could argue that what would be unreasonable for most companies would be well within the scope of Apple’s capabilities.
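It’s worth being clear about why the FBI needs Apple’s help at all: the burden isn’t computational. Apple’s iOS Security Guide says the passcode key-derivation function is tuned to take roughly 80 milliseconds per attempt on-device; what makes guessing impractical today is the optional erase-after-10-failures setting and the escalating delays between attempts, which the FBI’s requested iOS build would disable. With those gone, a back-of-envelope calculation (assuming that ~80 ms figure) shows how fast brute force becomes:

```python
# Worst-case brute-force times for iOS passcodes, assuming Apple's
# published figure of ~80 ms per key-derivation attempt on-device,
# with the 10-try erase option and escalating delays disabled --
# which is what the FBI's requested iOS build would accomplish.

ATTEMPT_SECONDS = 0.08  # ~80 ms per passcode attempt

def worst_case_seconds(combinations: int) -> float:
    """Seconds to try every possible passcode at 80 ms per attempt."""
    return combinations * ATTEMPT_SECONDS

for label, combos in [
    ("4-digit numeric passcode", 10**4),
    ("6-digit numeric passcode", 10**6),
    ("6-character lowercase alphanumeric", 36**6),
]:
    hours = worst_case_seconds(combos) / 3600
    print(f"{label}: {combos:,} combinations, {hours:,.1f} hours worst case")
```

A 4-digit passcode falls in under 15 minutes, and even a 6-digit one in under a day; only long alphanumeric passcodes push the time into years. The arithmetic underlines Apple’s point: the requested tool would gut passcode protection for the way most people actually use their phones.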

A better argument for Apple would be that the FBI is asking it to develop an actual attack against its own security software, rather than use existing capabilities, as the company had done prior to iOS 8. The “unreasonable burden” test would seem significantly harder to pass if Apple is being asked to do something so far outside its corporate mandate and in direct violation of the company’s own privacy policy. While a tool to brute force a passcode wouldn’t technically be a backdoor (not being installed on the device itself), its capability to compromise the security of any iOS device with a Secure Enclave coprocessor would have exactly the same effect, and would attract the same desire from bad actors. And quite apart from how terrorists or organized crime might lust after such a tool, once it existed, what would prevent law enforcement agencies from asking that it be employed regularly?

To my mind, then, that explains why Tim Cook chose this moment to write the open letter. Apple has never before been pressured by law enforcement in such a high-profile case, and it’s conceivable that a judge who was ignorant of the technical aspects of digital security could view the circumstances as sufficiently extraordinary and the burden as insufficiently unreasonable to compel Apple to comply. More so than at any time in the past, Cook and Apple need to make clear to everyone the danger of government intervention into legitimate encryption technologies. This case, or one like it, very well may end up in the U.S. Supreme Court, and the more precedent there is of judges being dubious of law enforcement requests, the less likely we are to end up in a situation where technology companies are forced to crack open smartphones for even trivial crimes.

Is there anything for those of us in the Apple community to do? In his open letter, Tim Cook isn’t obviously asking for anything other than understanding, but I think we should take the opportunity to make sure our elected representatives are as educated as possible about the danger and futility of intentionally compromised encryption software. That’s because Cook notes that the FBI is using the All Writs Act “rather than asking for legislative action through Congress.” Perhaps I’m reading more into those words than was intended, but if Cook views Congress as a future battleground, the more we communicate with our representatives at all levels of government, the better the eventual legislative tussle will go.