Thoughts on Tim Cook’s Open Letter Criticizing Backdoors
Over the past few years, Apple CEO Tim Cook hasn’t been shy about speaking out against government requests for Apple and other technology companies to introduce backdoors into their products (see “Apple and Google Spark Civil Rights Debate,” 10 October 2014). His campaign seems prescient, given that the FBI has now asked Apple to create a new version of iOS that would enable the agency to brute-force the passcode, and to install it on an iPhone recovered during the investigation of the San Bernardino terrorist attack. Claiming that this is equivalent to the creation of a backdoor in the iPhone, Apple is fighting the FBI’s request. Apple has published Cook’s “A Message to Our Customers” to share the company’s position.
I encourage everyone to read Tim Cook’s piece; it’s clear and convincing, and it lays out the tensions well. Apple has cooperated with the FBI up to this point in response to legal subpoenas and warrants, and Cook is careful to state both Apple’s respect for the FBI and his belief that the agency’s intentions are good. But he comes out strongly against the FBI’s desire to use the All Writs Act of 1789 to justify an expansion of its authority in a way that would enable it to compel Apple to create the necessary cracking tool (more on that in a bit).
This is completely in line with Apple’s stance on backdoors thus far, as Cook’s introduction to Apple’s privacy policy makes clear:
Finally, I want to be absolutely clear that we have never worked with any government agency from any country to create a backdoor in any of our products or services. We have also never allowed access to our servers. And we never will.
Backdoors Want to Be Free — Tim Cook’s argument in his open letter, which is supported by both security experts and many members of the intelligence community, is simple. To paraphrase:
Any tool or backdoor that enables a government or law enforcement agency to legally bypass a product’s inherent security features can also be exploited by cybercriminals and hostile governments.
Since we’re talking about the digital world here, any such “master key” would itself be digital data, and would be only as secure as the other protections afforded it. Unlike physical objects, information can’t be restricted to a single location and barricaded inside vaults protected by armed guards.
The information that comprises such a master key would inhabit an uncomfortable space. On the one hand, information wants to be free, and on the other, such information would be of incalculable value.
Let’s unpack that sentence. “Information wants to be free” was coined by Whole Earth Catalog founder Stewart Brand in 1984 with regard to the rapidly falling cost of publishing. In 1990, software freedom activist Richard Stallman restated the idea with a political slant: what information “wants” is freedom, in the sense that wide distribution of generally useful information makes humanity wealthier. A digital master key would certainly be generally useful, if not necessarily for positive purposes, so Stallman’s point implies that it would be extremely difficult or even impossible to prevent such information from escaping into the wild.
Preventing such an escape would be made all the harder by the fact that foreign intelligence agencies and criminal organizations alike would undoubtedly pay immense sums of money for access to such a master key. A security exploit broker called Zerodium already paid $1,000,000 for a browser-based security exploit in iOS 9 that it planned to resell to government and defense customers (see “The Million Dollar iOS Hack (Isn’t),” 3 November 2015). And that’s for something that Apple probably found and blocked in one of the updates to iOS 9. Can you imagine how much an iOS master key would sell for? And how that kind of money could corrupt people within the chain of trust for such a key?
Like Tolkien’s One Ring, the power of a master key for millions of phones would be nigh-impossible to resist (unfortunately, Apple’s latest diversity report didn’t break out the number of courageous hobbits working at the company).
Tim Cook is right: any sort of backdoor is a terrible idea that could result in immeasurable harm to the hundreds of millions of people who use Apple products, the overwhelming majority of whom have an entirely legitimate expectation that their private information will be protected from disclosure, regardless of who wants that data.
Worse, because information wants to be free, it’s now well within the capabilities of criminal and terrorist organizations to create and use their own unbreakable encryption software. Even if backdoors were required, the window in which they would be useful would likely be short, as everyone with something to hide switched to something that was guaranteed to be secure.
Why the All Writs Act of 1789 and Why Now? — The one place where Tim Cook’s open letter stumbles slightly is in its description of the FBI’s use of the All Writs Act of 1789 as “unprecedented.”
Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.
Although the All Writs Act does indeed date to 1789, and is intentionally vague (it enables U.S. federal courts to “issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law”), it has been used regularly in modern surveillance-related cases, some involving Apple. According to Stanford professor Jonathan Mayer in a lecture linked to by Wikipedia, the All Writs Act is used as a catch-all authority for the courts to supplement warrants, but only when:
- There is no other statute or rule that applies.
- It applies to a third party with some connection to the case.
- It is justified by extraordinary circumstances.
- Compliance isn’t an “unreasonable burden.”
It seems relatively uncontroversial that the first three requirements apply to this case. If other, more specific, statutes or rules applied, the FBI would be using them. Apple is a third party with a connection to the case through the terrorists’ use of an iPhone. And a case involving domestic terrorism certainly qualifies as an extraordinary circumstance. Less clear is what constitutes an “unreasonable burden.”
Before iOS 8, extracting the contents of an iPhone without unlocking it was possible for Apple, and when the company was asked to do so by courts invoking the All Writs Act, it complied. However, in a New York case that’s still pending, the judge suggested that an order under the All Writs Act might impose an unreasonable burden, and it’s our understanding that Apple is using that as an opportunity to resist the government order. (For more background, read the EFF’s coverage of the situation.)
To you and me, the FBI forcing Apple to create a dumbed-down version of iOS and install it on an iPhone recovered in the San Bernardino case might seem like an unreasonable burden, if only in terms of the effort involved. But given Apple’s resources, the FBI could argue that what would be unreasonable for most companies would be well within the scope of Apple’s capabilities.
A better argument for Apple would be that the FBI is asking it to develop an actual attack against its own security software, rather than to use existing capabilities, as the company had done prior to iOS 8. The “unreasonable burden” test would seem significantly harder to pass if Apple is being asked to do something so far outside its corporate mandate and in direct violation of the company’s own privacy policy. While a tool to brute-force a passcode wouldn’t technically be a backdoor (not being installed on the device itself), its capability to compromise the security of any iOS device with a Secure Enclave coprocessor would have exactly the same effect, and would be coveted by the same bad actors.
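To put rough numbers on that, here’s a back-of-the-envelope sketch in Python. The 80-millisecond figure is the per-attempt key-derivation cost Apple cites in its iOS security documentation; the passcode lengths are merely examples. Once the escalating delays and the optional erase-after-10-failures setting are out of the way, that key derivation is the only brake left.

```python
# How long a numeric passcode survives brute force once iOS's escalating
# delays and auto-erase are removed. The ~80 ms per attempt is the cost of
# the hardware-entangled key derivation Apple describes in its security
# documentation; it's the only rate limit left in this scenario.
SECONDS_PER_ATTEMPT = 0.08

for digits in (4, 6):
    attempts = 10 ** digits                      # all possible passcodes
    worst_case_minutes = attempts * SECONDS_PER_ATTEMPT / 60
    print(f"{digits}-digit passcode: {worst_case_minutes:,.0f} minutes worst case")

# Output:
# 4-digit passcode: 13 minutes worst case
# 6-digit passcode: 1,333 minutes worst case   (roughly 22 hours)
```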
And regardless of how terrorists or organized crime might lust after such a tool, once it existed, what would prevent law enforcement agencies from asking for it to be employed routinely?
To my mind, then, that explains why Tim Cook chose this moment to write the open letter. Apple has never before been pressured by law enforcement in such a high-profile case, and it’s conceivable that a judge who was ignorant of the technical aspects of digital security could view the circumstances as sufficiently extraordinary and the burden as insufficiently unreasonable to compel Apple to comply. More so than at any time in the past, Cook and Apple need to make clear to everyone the danger of government intervention into legitimate encryption technologies. This case, or one like it, very well may end up in the U.S. Supreme Court, and the more precedent there is of judges being dubious of law enforcement requests, the less likely we are to end up in a situation where technology companies are forced to crack open smartphones for even trivial crimes.
Is there anything for those of us in the Apple community to do? In his open letter, Tim Cook isn’t obviously asking for anything other than understanding, but I think we should take the opportunity to make sure our elected representatives are as educated as possible about the danger and futility of intentionally compromised encryption software. That’s because Cook notes that the FBI is using the All Writs Act “rather than asking for legislative action through Congress.” Perhaps I’m reading more into those words than was intended, but if Cook views Congress as a future battleground, the more we communicate with our representatives at all levels of government, the better the eventual legislative tussle will go.
Well said!
Perhaps it's time to require background checks prior to selling encrypted phones.
A pity privacy-minded people don't have the lobbying power the NRA has.
Guns don't kill people. No restrictions whatsoever deemed acceptable.
iPhones kill people. Apple needs to break their security asap.
Stunning this is happening in a country that once introduced civil liberties and personal freedom to humanity. Shame.
This FBI request is just a way to shift the focus away from the fact that the government refused to keep the jihadist out.
Syed Rizwan Farook was born in Chicago and was a U.S. citizen. His wife Tashfeen Malik was born in Pakistan. I don't think immigration policy comes into this debate in a big way.
She came here to do her dirty deed. She came here on a mission and the government did not want to spend the half day's work to check her out. So now they demand that Apple give them the key to my bank in the name of security.
I am listening to the news and the other talking heads in the TV world. The average Joe doesn't know much and just jumps to the conclusion that Apple can simply "hack and unlock" the info the FBI wants -- as usual, talking without being informed. This really disturbs me. I am with Apple.
Straightforward, simple law and logic are not enough to manage this case: the whole height, depth, and breadth of society functioning as a living system is required to comprehend and manage a workable outcome.
The more countries and commonwealths that join in this case study until resolution the better. The more diversity the better.
Good points, and I don't think I've seen anyone else cover the "high exploit value -> corruptibility" aspect of the leak being inevitable. I'd add that that same high value could lead to a bad entity motivating a leaker via fear of loss as well as hope of gain. If they want it that badly ("they" being powerful organizations known for unarguably evil tactics), they could apply extreme extortion, too.
Yeah, I can't help but think that Apple is in part trying to protect its employees here too. If it became known that Apple programmers could circumvent iOS security, they'd become hot commodities in a variety of bad ways.
Tim Cook says the FBI is acting in good faith. I'm not so sure.
I don't believe the FBI really thinks they'll get anything from this phone. The SB attacker and his wife seem very clearly to be a couple of whackadoos who acted in relative isolation.
Instead of looking for info for this case, I suspect the FBI is using the scary and charismatic nature of it (Homegrown Terrorist! Immigrant Terrorist!) to set a precedent that they can use against, well, pretty much anyone.
I'm not comfortable saying that for publication, since I know nothing more than what has been reported in the mainstream media, whereas I assume the FBI knows quite a lot more and may have told Apple at least some of that. But it really does seem as though the FBI is choosing this opportunity to make a strong stance for encryption appear to be "helping the terrorists."
Tim Cook was just being polite. I'm sure he sees, if not nefarious motives, at least self-serving ones in the FBI's demands. This case is just the tip of an iceberg that could sink personal privacy for good and all—and the rest of our freedoms right along with it. Think I'm being melodramatic? Remember, this was the agency built by J. Edgar Hoover, probably the most corrupt high law enforcement official in this country's history. He is the best example you could find of the axiom that absolute power corrupts absolutely. The FBI headquarters building is still named after Hoover, despite his degraded reputation. I think it's more than appropriate to entertain a healthy skepticism about the agency's intentions here. Imagine such unfettered power in the hands of a President Trump. If that doesn't give you nightmares, nothing will.
I think Adam is right about protecting Apple employees as well. In my opinion that is the very definition of an undue burden. The potential impact on Apple's sales is speculative and would be hard to prove in court, but the problems it would cause people working for Apple are foreseeable, serious—and avoidable.
The point tori Hernandez makes about how uninformed the talking heads in the media are is a good one too. They don't know squat about digital anything, but that doesn't stop them from pontificating on behalf of the government here. They swallow the official justification that security trumps any other consideration. The road to Hell is paved with such ignorant intentions.
How far the fourth estate has fallen. Time was they were considered a bulwark of our liberty. No more.
Great post, Steve. Couldn't have said it better myself.
The subtle problem that every news broadcast misses is that the FBI has the encrypted drive, so if they are given the decrypted drive, then with the help of any decent mathematician they have the encryption algorithm.
For me, the most important part of this is the part of Tim Cook's statement that reads "Criminals and bad actors will still encrypt, using tools that are readily available to them."
Even if tons of tools weren't already available, the mathematical/computer science theory and code to write strong encryption tools is widely known around the world and it's really pretty simple (taught in most introductory CS theory classes). In fact, ISIS has reportedly already done so. See http://www.kitguru.net/gaming/security-software/jon-martindale/daesh-has-its-own-encrypted-chat-app/
So forcing Apple and other manufacturers to artificially weaken it won't help us catch the real "bad guys," it'll just reduce the level of privacy for all of us ordinary folks.
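To make that concrete, here's about the simplest possible example, a sketch using the freely available Python cryptography package (any of dozens of similar free libraries would do just as well):

```python
# Strong, authenticated encryption in a few lines, using the free
# "cryptography" package (pip install cryptography). Fernet is
# AES-128-CBC plus HMAC-SHA256 under the hood.
from cryptography.fernet import Fernet

key = Fernet.generate_key()     # random key; share it out of band
cipher = Fernet(key)

token = cipher.encrypt(b"meet at the usual place")
print(cipher.decrypt(token))    # b'meet at the usual place'
```

No phone maker, and no law aimed at phone makers, enters into it at any point.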
Sounds remarkably similar to the gun argument. If you keep encryption from the good guys, only the bad guys will use it!
Without getting into the politics of the gun debate, I think that's a bit of a logical stretch:
1) Strong encryption programs can be obtained for free, instantly, anytime and anywhere.
2) Once someone develops a strong encryption program, making more copies of it costs literally nothing (zero incremental cost).
3) Developing a brand-new strong encryption program is quick and easy to do, and only takes a little knowledge of the basic theory and mathematics behind it.
None of those are true for guns and ammunition, not even with the rise of 3D printing.
Which means, if you pass a law to keep law-abiding citizens from having smart phones that can't be hacked, only the non-law-abiding (including terrorists) will have un-hackable smart phones.
Yup, and even if they make all of the makers of smartphone OSes and hardware include back doors, it'll be trivial for any bad actors to (write if necessary and) install an app that does strong encryption. So all a law like that would do is help catch the really stupid criminals.
And the math behind strong encryption is simple enough that trying to outlaw it completely is utterly futile--the cat's already been out of that bag for decades. Essentially, it's simple enough that many people could write good encryption programs purely from memory, with no reference materials needed.
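For instance, here's a one-time pad, the one cipher that's provably unbreakable when the key is truly random, as long as the message, and never reused. It's the sort of thing you really can write from memory (a sketch; in practice the hard part is distributing and protecting the key, not writing the code):

```python
# One-time pad: XOR each message byte with a random key byte used once.
# With a truly random, message-length, never-reused key, the ciphertext
# is information-theoretically secure -- no math or law can break it.
import os

def otp(data: bytes, key: bytes) -> bytes:
    assert len(key) >= len(data), "key must be at least as long as the message"
    return bytes(b ^ k for b, k in zip(data, key))

message = b"attack at dawn"
key = os.urandom(len(message))       # one random key byte per message byte

ciphertext = otp(message, key)
print(otp(ciphertext, key))          # b'attack at dawn'; XOR is its own inverse
```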
It's all just so ignorant that it drives me nuts.
http://www.xkcd.com/386/
I thought I heard on one news story that the hack the FBI is requesting would not work on the newest iPhones that have the "Secure Enclave", but someone else claims Apple can just update the firmware in the Secure Enclave. If so, then what's the big advantage of the "Secure Enclave"?
There's an interesting article that touches on some of these issues at http://www.zdziarski.com/blog/?p=5645
Zdziarski's article is interesting, but does not seem to address my basic question which is: Could Apple make a hack like this unfeasible, and if so, have they already done so on their newer devices?
I hesitate to speak authoritatively on this, but my understanding is that what the FBI is asking for would apply to all iPhones. That may not actually involve the secure enclave, since what they're asking for is something that would enable brute forcing of the passcode.
But again, I'm not the best source for this information.
The advantage is that Secure Enclave runs isolated from the rest of the phone. Even if you compromise iOS, by jailbreaking the phone for instance, you can't bypass Secure Enclave.
The fact that Apple says all iPhone models are vulnerable suggests Secure Enclave's firmware is updatable. I imagine Apple decided to go this way so that any security holes found could be closed (or bugs fixed) without replacing the entire phone. Obviously, this opens a window through which you can compromise Secure Enclave, but as long as the updating procedure is secure, it seems like a reasonable tradeoff.
Well, at least it did until this latest demand from the government…
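For what it's worth, the policy being enforced is easy to sketch. The delay values below follow Apple's published schedule as I understand it, so treat them as illustrative rather than definitive; the point is that this table (plus the optional erase-after-10-failures setting) is exactly what the FBI wants switched off:

```python
# Sketch of the escalating passcode-retry delays (values per Apple's
# published iOS security documentation, as I understand it -- treat
# them as illustrative). On Secure Enclave devices the enclave itself
# enforces these; on older devices, iOS does.
import time

DELAY_AFTER_FAILURE = {5: 60, 6: 5 * 60, 7: 15 * 60, 8: 15 * 60, 9: 60 * 60}

def try_passcode(failures_so_far: int, entered: str, actual: str) -> bool:
    time.sleep(DELAY_AFTER_FAILURE.get(failures_so_far, 0))  # 0 for attempts 1-4
    return entered == actual
```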
I think most everyone is missing what this is really about. I think everyone is falling for the framing the government or Apple is putting on this.
It is not truly about privacy. Apple will turn over your info if served with a warrant.
It is not truly about data encryption, either. Encryption is legal and this is not about building a backdoor into any shipping iOS or MacOS.
This is about the government compelling cooperation from a private party -- in this case, a corporation. What are the limits of the cooperation that government can demand? The government can demand "technical assistance," but is there a point where the demand is for something more than technical assistance?
In this case, I think there is. In this case, I think it's compelled speech. Heck, part of what the government wants is for Apple to "sign" the update. Literally, they want Apple to endorse this new iOS build, so that it will be understood as coming from Apple.
Compelled speech? That's just not allowed.
"Encryption is legal and this is not about building a backdoor into any shipping iOS or MacOS."
I think this is precisely one of Apple's key points. They don't have to build it into a shipping iOS for it to break their customers' privacy. As soon as they build this for the FBI once - even if it's done in a sealed Apple lab by a few people (the fewer the worse probably) - it's a matter of time until it's available in the wild. And once that happens, iOS users' encryption will be irreversibly broken.
Shipped or not, if this thing has been done once, it will spread. There's nothing Apple or the FBI can do to prevent that other than not hacking it in the first place.
I see Cook's points and generally agree, in the context of present iOS system features and limitations. Going forward, I wonder if there might be a place for customers to have an option to declare that they are "open" to a court-mandated Order if the court itself provides some form of Notice to the customer that it would like to access its contents, and show in iPhone communications that the user has opted for this seemingly mild ("Patriotic"?) option. I, for one, would rather have communications with others so inclined and treat those not so marked with greater care or just ignore them. I have not thought through all of this, and am in a conflict/quandary as to Orders that might be issued by any "secret court" (Federal or otherwise).
Actually two comments: 1) I don't understand how the FBI expects to actually get such a custom iOS loaded on to a passcode-protected iPhone in the first place. Remember that at the factory when iOS is first loaded, there *is* no passcode protection, but after a passcode has been set, you can't load a new iOS without it. 2) Bob Cringely has a particularly interesting take on all this. In his Feb 19 column, he speculates that the game here is actually to force this to the supreme court quickly, before a (probably conservative) successor to Scalia can be appointed and confirmed. That there is a good chance the present court would find in Apple's favor. (www dot cringely dot com)
The Wall Street Journal reported today, "The Justice Department is pursuing court orders to force Apple Inc. to help investigators extract data from iPhones in about a dozen undisclosed cases around the country." These other cases do not involve terrorism.
The claims that have been made by various parties about the San Bernardino case being singular are misguided.
Even if Apple were to do what the FBI is (apparently) requesting, it would not stop a determined individual from protecting their iPhone against a brute-force search for its PIN.
Most people probably don't realize that an iPhone PIN can be very long. Suppose you create a 10-digit custom numeric PIN. Even if the FBI could test PINs at a rate of one per millisecond, trying all 10^10 possibilities would take nearly four months. Each additional digit multiplies that time by ten. And you don't have to stop at numerals: iPhones accept alphanumeric passcodes. A 6-character passcode drawn from upper- and lowercase letters and numerals has 62^6 (roughly 5.7 x 10^10) possibilities, or about 1.8 years at the same rate, and an 8-character one would take thousands of years.
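If you want to play with the numbers yourself, here's a quick sketch (the one-attempt-per-millisecond rate is just the assumption above; a real iPhone is far slower, with roughly 80 ms of key derivation per attempt plus escalating delays):

```python
# Worst-case time to exhaust a passcode space at a fixed attempt rate.
SECONDS_PER_ATTEMPT = 0.001     # assumed rate: one attempt per millisecond

def exhaust_days(alphabet_size: int, length: int) -> float:
    return alphabet_size ** length * SECONDS_PER_ATTEMPT / 86400

print(f"10-digit numeric:          {exhaust_days(10, 10):,.0f} days")        # ~116 days
print(f"6-char mixed alphanumeric: {exhaust_days(62, 6):,.0f} days")         # ~657 days (~1.8 years)
print(f"8-char mixed alphanumeric: {exhaust_days(62, 8) / 365:,.0f} years")  # ~6,900 years
```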