One small feature in iOS 8, barely mentioned during the keynote at Apple’s Worldwide Developer Conference, has engendered a major civil rights controversy. Apple said that, starting with iOS 8, all Apple-provided apps would be encrypted with Data Protection, which entangles your passcode with a unique ID embedded in your device’s hardware, an ID that not even Apple can recover.
The implications were clear for months, but sprang into the public consciousness only after Tim Cook released an open letter on privacy, highlighting that the technology would nearly eliminate the possibility of Apple accessing data on your device, even if compelled to try by law enforcement or the government (see TidBITS coverage, 24 September 2014). This was quickly followed by a statement from FBI Director James Comey: “What concerns me about this is companies marketing something expressly to allow people to place themselves beyond the law.”
Other law enforcement officials followed with their own condemnations, including one chief of detectives who predicted, “Apple will become the phone of choice for the pedophile.” Ouch.
Before I go on, there are three things you need to know about me:
My day job is advising companies and governments, big and small, about how to improve their security. One of my specialties is mobile devices and another is cloud computing.
As an emergency responder (formerly a full-time paramedic, also with firefighting and mountain rescue experience), I have more than once had to tell parents that their child had died, often for easily avoidable reasons.
I have three small children of my own.
Perspective is important. As writers, we are always biased by our experiences, especially with emotional issues that pit our civil rights against fundamental fears for ourselves and others. We shouldn’t vilify law enforcement officials who see access to our phones as important to their mission to protect us, yet we also can’t allow low-frequency statistical events, however great their emotional impact, to justify policies that create even greater, more generalized risks.
iPhones Have Long Frustrated Law Enforcement -- When the iPhone first came out, I was one of a group of analysts who advised enterprises to avoid it, largely due to security concerns. The original iPhone didn’t really have any security, but it also lacked apps. Since then, Apple has dramatically improved the platform’s resistance to attack, even if someone has physical possession of the device.
This level of security comes thanks to how Apple implemented Data Protection in iOS. Your iPhone’s storage is always encrypted, but on its own that base encryption is easy to circumvent. Data Protection enhances the basic encryption by entangling your passcode (if you set one) with a unique identification number burned into the hardware of the device. There is no way to pull that code off the device, and entangling it dramatically increases the effective length of the “total” passcode that protects your encryption key.
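Conceptually, the entanglement works like a key derivation function that mixes the passcode with the hardware key. Here is a simplified Python sketch of the idea, not Apple’s actual implementation (Apple uses a dedicated hardware AES engine and its own tangling function; the function names, iteration count, and use of PBKDF2 here are purely illustrative):

```python
import hashlib
import os

# Hypothetical device-unique key burned in at manufacture. On a real
# iPhone this UID never leaves the hardware crypto engine, so the
# derivation below can only ever run on the device itself.
DEVICE_UID = os.urandom(32)

def derive_data_key(passcode: str, iterations: int = 100_000) -> bytes:
    """Entangle the user's passcode with the device UID.

    Because DEVICE_UID cannot be extracted, an attacker cannot
    recompute this key on an external computer; every brute-force
    guess must go through the device's own (slow) hardware.
    """
    # The hardware key acts as a secret salt for the derivation.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                               DEVICE_UID, iterations)

key_a = derive_data_key("1234")
key_b = derive_data_key("1235")
assert key_a != key_b   # different passcodes yield different keys
assert len(key_a) == 32  # 256-bit derived key
```

The design point is the secret, non-extractable salt: even a four-digit passcode becomes useless to attack off-device, because the derived key depends on material that exists only inside the phone’s silicon.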
One of the more effective techniques to crack phone encryption is to pull the data off the source device, and then brute-force attack it on larger, more powerful computers. By adding that device code into the mix, Apple makes it effectively impossible to crack iOS-encrypted data on even the biggest computers, even if you can get the image off the device. Thus most forensics tools used by law enforcement (and criminals) have to crack the encryption on the iPhone (or iPad) itself, using the embedded processor that mixes the device code back in. However, this hardware is rate-limited, so once a passcode hits about six characters, it could take many years (even centuries) to break via brute force.
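A rough back-of-the-envelope calculation shows why the on-device rate limit matters. Assuming roughly 80 milliseconds per attempt (the approximate on-device key-derivation cost Apple has cited; the exact figure varies by hardware) and a 36-character alphabet of lowercase letters plus digits:

```python
# Illustrative worst-case brute-force times for passcodes of
# increasing length, attacked on-device at ~80 ms per guess.
# The 80 ms figure and 36-character alphabet are assumptions.

SECONDS_PER_ATTEMPT = 0.08
ALPHABET = 36  # lowercase letters + digits
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def worst_case_years(length: int) -> float:
    """Years to try every passcode of the given length on-device."""
    attempts = ALPHABET ** length
    return attempts * SECONDS_PER_ATTEMPT / SECONDS_PER_YEAR

for n in (4, 6, 8):
    print(f"{n}-character passcode: ~{worst_case_years(n):,.1f} years")
```

Under these assumptions, a 4-character passcode falls in about a day and a half, a 6-character passcode takes over five years, and an 8-character passcode takes thousands of years, which is why forcing the attack onto the device’s own hardware is such an effective defense.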
Data Protection has been around for at least three revisions of iOS devices, but for most of that time it protected very little: email (which the police can get from servers) and any apps that turned on the Data Protection API. Apple has never been able to access this data… for anyone. However, plenty of other data was exposed, including text messages, photos, app data that didn’t turn on Data Protection, contacts, and much more.
These gaps mattered, because they were major security headaches for my enterprise clients. If you really wanted to secure data on an iOS device, you needed to be extremely careful and use all sorts of additional security controls. That’s why I traveled to China using only “clean” devices that didn’t have access to my normal accounts or data.
iOS 7 expanded Data Protection to all third-party apps (even those that don’t use the Data Protection API), and iOS 8 expanded it to all Apple-provided apps, including Messages, Camera, etc.
Apple didn’t deliberately lock out law enforcement to protect pedophiles; it added a much-needed security control to protect all of us from common criminals.
The Bane of Backdoors -- It shouldn’t be a surprise that some law enforcement officials consider it their right to access our phones. Since the dawn of civilian law enforcement, we have granted police investigative powers: the ability to access anything outside our heads, with a proper court order.
With probable cause, the police can read our mail, enter our homes, listen to our phone calls, and more. That’s how many of them view the world — not with totalitarian malice, but as a legal privilege entrusted to them to protect society. Get the warrant, get the order, get the information. It’s all part of the legal process.
But technology advances have created a modern society that is no longer so clean. There is a term in the defense world called “dual use.” It describes technologies that can be used for good or evil. As a ski patroller, I used explosives to trigger avalanches. The same explosives are used in construction and in some military munitions.
Law enforcement, especially federal law enforcement, has a history of desiring and imposing backdoors into technology. The Communications Assistance for Law Enforcement Act (CALEA) of 1994 requires all telecommunications equipment manufacturers to build remote wiretapping capabilities for law enforcement into their hardware. But CALEA backdoors have also been exploited by criminals and foreign intelligence services.
Last year, the FBI pushed for an expansion of CALEA that would require any Internet communications service to include a backdoor for direct law enforcement wiretapping. All chat programs, social networks, and even webcasting tools would have had to provide a secret entrance for law enforcement.
Even dismissing concerns over NSA monitoring, it is technologically impossible to build such backdoors without the possibility of abuse by attackers. All wiretapping interfaces are dual use, and subject to abuse. No one has ever created a monitoring technology immune from attack or misuse.
In a short-sighted attempt to maintain existing investigative capabilities, the FBI is pushing to reduce communications security for the entire Internet, which would also destroy the competitiveness of U.S. companies globally. Then again, Microsoft is currently being held in contempt of court for not providing U.S. law enforcement with a customer’s private email messages stored in a data center in Ireland, even though doing so would violate Irish law. It’s a court case that could decimate the global market for all U.S.-based cloud computing providers.
A Rusty Key -- On 3 October 2014, the Washington Post Editorial Board proposed that Apple and Google create a “secure golden key” to provide law enforcement, and only law enforcement with a court order, access to phones. That isn’t something I’m opposed to in concept, except that it is technologically impossible.
I don’t know a single security expert or cryptographer who believes a secure golden key is viable, especially not one that would be accessible to a range of law enforcement officials, over time, for various cases. It simply can’t be done. The closest equivalent is the set of master keys that secure the Internet’s DNS root zone, held by a small group of trusted individuals around the world. That system works only because of the key holders’ geographic distribution and infrequent meeting requirements. A law enforcement equivalent could never handle ongoing, day-to-day investigative needs.
And that ignores the international implications of creating a backdoor to all phones, one that could be abused by any government.
Apple clearly sees its security as a competitive advantage, and is marketing it accordingly. Especially since Google, by the nature of its business model, can never maintain privacy as well as Apple (recent failures notwithstanding), no matter how strong its security capabilities. (Remember that privacy and security are not the same thing!) But all that marketing doesn’t eliminate the fact that Apple closed a long-known security flaw first, and leveraged the marketing later.
Attorney General Eric Holder claimed that default device encryption will hamper investigations of crimes against children. I have witnessed such abuse. Perhaps not on the scale of an FBI agent, but I know what horrors lurk in the world, and I have directly seen the consequences. These are things I don’t talk about outside the circle of former coworkers who have been there themselves. No matter how statistically rare it is (and most abuse is by a family member), like all parents I fear for my own children. Being on what felt like 24/7 suicide watch for the first years of their lives — fretting about SIDS, the consumption of inedible objects, power outlets, and other infant dangers — is hard to forget. If something happened to them, I wouldn’t care about what laws or social norms impeded my ability to protect them, and I know the feeling of helplessness when you lack the resources or ability to save someone else’s child. These are powerful, emotional forces.
I fully understand the drive and motivations the law enforcement community has to maintain access to our devices. That access speeds up or breaks open cases. It enables police, at times, to better protect us from the worst the world has to offer. I understand how they can perceive Apple and Google as interfering with their ability to collect data they are legally entitled to access in the course of their duties. Data that could, at times, save lives.
But law enforcement needs to understand that technology companies aren’t trying to protect the bad guys, but stop them. That until iOS 8, I had to walk my clients through the iOS security loopholes that made it difficult to protect corporate and personal data. That such backdoors are already used to suppress free speech throughout the world, sometimes with fatal consequences. That without this encryption, we are all less secure.
Society is in the midst of a major upheaval powered by technology. One where the lines of privacy, civil rights, and the role of government are shifting as we fundamentally alter our social and communications structures. We need to decide if, as we make this transition, we will provide our governments pervasive access to all of our information, which simultaneously reduces our collective ability to defend ourselves from criminals. And we need to realize that if we err on the side of stronger inherent security, some criminals, even the worst of them, will occasionally get away.