More Thoughts on Apple’s Stance on Backdoors
Apple’s resistance to the FBI’s request for an iPhone hacking tool can be hard to evaluate because the dispute is taking place on three levels: technical, legal, and policy. None of these is necessarily more important than any other, and focusing on just one or two presents a skewed view of what’s going on.
On the technical level, we now know that Apple could do what the FBI wants, but the company is tremendously worried that any such technology would escape into the wild. It’s one thing to create a digital tool, but an entirely different thing to prevent it from being stolen, subverted, or simply copied. Since this tool could subvert the otherwise unbreakable (at least as far as is known) encryption embedded in the iPhone’s Secure Enclave coprocessor, its creation would undermine the privacy of millions of iPhone users.
Because there’s no actual law requiring Apple to comply or, conversely, protecting Apple from such requests, the FBI is relying on the All Writs Act of 1789. That Act has been used in similar surveillance-related cases in the past, but it’s inherently vague, which means that lawyers and judges could wrangle about the specifics for a long time.
When it comes to policy, it seems clear that the FBI is using this high-profile case of domestic terrorism as an opportunity to further its real goal, which is to be able to gain lawful access to evidence from any smartphone or digital device. And while Apple denies that it’s resisting for marketing reasons, given that Apple is a for-profit business, it’s impossible to know how much of its stance stems from a conviction that it’s the right thing to do versus a desire for a competitive advantage or marketing point.
I’m coming down on Apple’s side because I don’t believe the FBI understands the technology well enough to appreciate Apple’s argument about the impossibility of keeping a digital genie bottled up. I also don’t see weakening iPhone security as a long-term win, given that it’s far too easy for any interested organization to create, distribute, and use unbreakable encryption software. While I can’t claim sufficient experience to comment on the legal issues, I dislike the way the FBI is using this case to push its agenda, and while Apple sometimes employs its commitment to privacy in ways that aren’t entirely fair to competitors, that’s a far lesser sin.
With all that said, here are the ExtraBITS we’ve published since my original coverage. They paint a fuller picture of what’s happening and are well worth reading if you’re interested in this case.
Google and Microsoft CEOs Back Apple — Twitter doesn’t lend itself to subtlety or nuance, but Google CEO Sundar Pichai used a 5-part tweet to weigh in on Apple CEO Tim Cook’s open letter to customers about the FBI’s request that Apple create a hacking tool to brute force an iPhone passcode in the San Bernardino terrorism case. Pichai essentially signed on to Apple’s position, saying that Google builds secure products and complies with legal orders to hand over data when possible, but simultaneously expressing concern that requiring companies to enable hacking of customer devices and
data could be a troubling precedent. Microsoft CEO Satya Nadella also commented, though less directly, by retweeting a post from Microsoft President and Chief Legal Officer Brad Smith that summarized the firm’s anti-backdoor position via a linked statement from the Reform Government Surveillance group, of which Microsoft is a member.
Tim Cook’s Open Letter Prompted by the FBI Going Public — Apple’s spat with the FBI over building a cracking tool for an iPhone linked to the San Bernardino terrorism case has taken an interesting turn. The New York Times reports that while Apple had asked the FBI to file its request under seal, the government chose instead to make it public. That supports the theory that the FBI is using this high-profile case of domestic terrorism to pressure Apple into compromising the security of its products. Faced with this PR onslaught, Apple saw no choice but
to take its case for supporting encryption to the public in Tim Cook’s open letter. Sadly, this fight between the FBI and Apple could have been avoided had the assailant’s employer used standard mobile device management tools to maintain passcode control over the work iPhone in question.
A Forensics Expert’s View into the FBI’s Request — The more we learn about the Apple/FBI dustup, the clearer it becomes that this is actually a subtle and dangerous game of chess. The latest insight comes from Jonathan Zdziarski, considered to be among the world’s leading experts in iOS-related forensics. In a blog post, Zdziarski explains the difference between “lab services” and developing an “instrument.” Apple has provided one-off lab services in the past to help law enforcement recover data when required by law. But developing an instrument is a tremendously involved, verified, documented,
tested, and validated process. It would require significant resources and would result in the hacking tool being made public and usable by any law enforcement or intelligence agency — along with foreign governments and criminal organizations. That’s why Apple is resisting.
Details Emerge in Dispute between Apple and FBI — In a call with reporters, as covered by TechCrunch, Apple executives clarified a couple of points in the ongoing dispute between the company and the FBI. First, any tool that met the FBI’s desire for software that would enable the brute force cracking of an iPhone’s passcode would also work on newer iPhones with iOS 8 or iOS 9 and a Secure Enclave coprocessor. That’s because the data in the Secure Enclave is encrypted by the passcode, which provides access to everything on an iPhone. Second,
the FBI apparently reset the Apple ID password for the account associated with the iPhone right after taking the iPhone into custody. That prevented any further automatic iCloud backups, which Apple could have turned over to the government.
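To put the brute-force threat in perspective, here is a back-of-the-envelope sketch in Python. The ~80 ms per attempt is a commonly cited figure for the hardware-bound key derivation time on these devices; treat it as an illustrative assumption, not a measured value. The point is that once iOS's escalating retry delays and erase-after-ten-failures protections are removed, as the FBI's requested tool would do, a numeric passcode falls quickly.

```python
# Illustrative sketch (not Apple's actual implementation): why removing
# iOS's escalating retry delays matters for brute forcing a passcode.
# Assumes a hypothetical ~80 ms per hardware-bound attempt, a figure
# widely cited in coverage of the case; the keyspace math is exact.

SECONDS_PER_ATTEMPT = 0.08  # assumed hardware-bound attempt time

def worst_case_hours(keyspace: int,
                     per_attempt: float = SECONDS_PER_ATTEMPT) -> float:
    """Worst-case time in hours to try every passcode in the keyspace."""
    return keyspace * per_attempt / 3600

# A 4-digit numeric passcode (10,000 possibilities) falls in well
# under an hour without the delays...
print(f"4 digits: {worst_case_hours(10 ** 4):.2f} hours")
# ...while even 6 digits takes roughly a day.
print(f"6 digits: {worst_case_hours(10 ** 6):.1f} hours")
```

The numbers also show why a long alphanumeric passphrase remains resistant even to such a tool: the keyspace, not the per-attempt delay, does the real work.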
Apple Answers More Questions about FBI Court Order — As speculation swirls about the implications of Apple resisting the FBI’s request for help accessing an iPhone used by one of the assailants in the San Bernardino terrorism case, the company has posted another public statement answering some of the questions that have arisen. In particular, Apple explains why it objects to the government’s court order, whether or not the FBI’s request is technically feasible, why such a technical solution could not be contained, and more. One particular note — despite what has appeared in the media, Apple says it has
never previously unlocked iPhones. Prior to iOS 8, Apple had extracted unencrypted data from locked iPhones for law enforcement, using a technique that didn’t require them to be unlocked. The encryption in iOS 8 and iOS 9 makes that impossible now.
One thing I haven't heard anybody say is that, if Apple does do this and the government allows Apple to keep the software (i.e. not release it), it still sets a dangerous precedent, because the government can come back time and time again to have Apple "backdoor" yet another iPhone. "In for an inch, in for a mile."
According to an experienced computer forensics expert, the danger lies in the fact that Apple will be creating a forensics tool. If the evidence gathered by this tool is to hold up in court to prosecute other terrorists it will need to be verified by many other parties. These include independent computer forensics firms and attorneys representing the accused.
Under these circumstances, how long do you think it will be before someone sells an illicit copy to the highest bidder?
Tim and Apple's legal team know this will not end well for anybody, not just the victims in San Berdoo.
As soon as there is a "back door" the criminals will be exploiting it before the NSA or CIA even get started.
A perfect example of tomfoolery and deception by the FBI; San Bernardino is an excuse to promote a goal that our state security apparatus has long had: access to everything they think they might need, regardless of what you might believe is protected content. If Apple yields, it will be the camel's nose under the tent. Once his nose is in, the rest of the camel will follow.
If you'd like to support Apple's stance on privacy -- contact your local government reps and add your name to the White House petition at https://petitions.whitehouse.gov/petition/apple-privacy-petition
I think another issue that Apple would face is the involved employees becoming high-profile targets for all kinds of nefarious activity.
Imagine Apple is compelled to hack this iPhone. They use very few staff members in order to keep the number of people with the knowledge limited. Even if the FBI is present, even if they keep everything confidential, even if they'd never come back with more iPhones (which we know they will), what kind of price tag does that put on these few Apple employees who were a part of the operation and now know how to hack into a locked iPhone?
What's to prevent Putin's FSB from poaching one of these guys, or the North Koreans from kidnapping another (unlikely as that may be)? How does Apple make sure the involved staff stays safe and stays away from the 'black economy', organized crime, or foreign intelligence services?
As stated in my reply to Michael above, the legal nature of this forensics tool would necessitate many third parties having access to it. The evidence gathered would otherwise be inadmissible in court.
That vastly increases the risk of this being sold into the wrong hands. One security company has already paid a million dollars for one iOS zero-day exploit. I can only imagine how much a backdoor hack would fetch.
If this were Feb 2014, with only iOS 7 on the iPhone, would this dispute still have arisen?