Thoughtful, detailed coverage of the Mac, iPhone, and iPad, plus the TidBITS Content Network for Apple consultants.

Apple and Google Spark Civil Rights Debate

One small feature in iOS 8, barely mentioned during the keynote at Apple’s Worldwide Developer Conference, has engendered a major civil rights controversy. Apple said that, starting with iOS 8, all Apple-provided apps would protect their data with Data Protection, which entangles your passcode with a unique ID embedded in your device’s hardware, an ID that not even Apple can recover.

The implications were clear for months, but sprang into the public consciousness only after Tim Cook released an open letter on Apple’s privacy stance, highlighting that the technology would nearly eliminate the possibility of Apple accessing data on your device, even if compelled to try by law enforcement or the government (see “Apple Goes Public on Privacy,” 24 September 2014). This was quickly followed by FBI Director James Comey stating, “What concerns me about this is companies marketing something expressly to allow people to place themselves beyond the law.”

Other law enforcement officials followed with their own condemnations. The chief of detectives for the city of Chicago even said, “Apple will become the phone of choice for the pedophile.” Ouch.

Before I go on, there are three things you need to know about me:

  • My day job is advising companies and governments, big and small, about how to improve their security. One of my specialties is mobile devices and another is cloud computing.

  • As an emergency responder (formerly a full-time paramedic, also with firefighting and mountain rescue experience), I have more than once had to tell parents that their child had died, often for easily avoidable reasons.

  • I have three small children of my own.

Perspective is important. As writers, we are always biased by our experiences, especially with emotional issues that pit our civil rights against fundamental fears for ourselves and others. We shouldn’t vilify law enforcement officials who see access to our phones as important to their mission to protect us, yet we also can’t allow low-frequency statistical events, however great the emotional impact, to create even greater, more generalized risks.

iPhones Have Long Frustrated Law Enforcement -- When the iPhone first came out, I was one of a group of analysts who advised enterprises to avoid it, largely due to security concerns. The original iPhone didn’t really have any security, but it also lacked apps. Since then, Apple has dramatically improved the platform’s resilience against attacks, even when an attacker has physical possession of the device.

This level of security comes thanks to how Apple implemented Data Protection in iOS. Your iPhone’s storage is always encrypted, but that baseline encryption alone is easy to circumvent. Data Protection enhances the basic encryption by entangling your passcode (if you set one) with a unique identification number burned into the hardware of the device. There is no way to pull that number off the device, which dramatically increases the effective strength of the “total” passcode that protects your encryption key.
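To illustrate the concept (a minimal Python sketch, not Apple’s actual implementation; the real entanglement happens inside the device’s hardware AES engine, and the UID can never be read out by software), a key derivation that mixes a passcode with a device-unique ID might look like this:

```python
import hashlib

def derive_key(passcode: str, device_uid: bytes, iterations: int = 100_000) -> bytes:
    """Derive an encryption key by entangling a passcode with a device UID.

    Illustrative only: PBKDF2 stands in for Apple's hardware-based key
    derivation. The point is that the same passcode produces a different
    key on every device, and the key cannot be recomputed without the UID.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, iterations)

# Hypothetical UIDs; a real device never exposes this value.
uid_a = bytes.fromhex("00112233445566778899aabbccddeeff")
uid_b = bytes.fromhex("ffeeddccbbaa99887766554433221100")

# The same passcode yields unrelated keys on different devices.
assert derive_key("123456", uid_a) != derive_key("123456", uid_b)
```

This is why a stolen disk image is useless without the hardware it came from: the derived key depends on a value that exists only inside the device.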

One of the more effective techniques for cracking phone encryption is to pull the data off the source device and then brute-force it on larger, more powerful computers. By mixing that device code into the key, Apple makes it effectively impossible to crack iOS-encrypted data on even the biggest computers, even if you can get an image off the device. Thus most forensics tools used by law enforcement (and criminals) have to attack the encryption on the iPhone (or iPad) itself, using the embedded processor that mixes the device code back in. However, this hardware is rate-limited, so once a passcode hits about six characters, it could take many years (even centuries) to break via brute force.
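A bit of back-of-the-envelope arithmetic shows why that rate limit matters. Assuming roughly 80 milliseconds per guess for the hardware key derivation (treat the exact figure as an assumption) and a six-character passcode of lowercase letters and digits:

```python
SECONDS_PER_ATTEMPT = 0.08   # assumed hardware-enforced delay per guess
ALPHABET_SIZE = 26 + 10      # lowercase letters plus digits
PASSCODE_LENGTH = 6

combinations = ALPHABET_SIZE ** PASSCODE_LENGTH        # 36^6 = 2,176,782,336
worst_case_seconds = combinations * SECONDS_PER_ATTEMPT
worst_case_years = worst_case_seconds / (365 * 24 * 3600)

print(f"{combinations:,} combinations, about {worst_case_years:.1f} years to try them all")
# → 2,176,782,336 combinations, about 5.5 years to try them all
```

Every additional character multiplies that figure by 36, which is how longer alphanumeric passcodes quickly move from years into centuries.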

Data Protection has been around for at least three revisions of iOS devices, but for most of that time it protected very little: email (which the police can get from servers) and any apps that turned on the Data Protection API. Apple has never been able to access this data… for anyone. However, plenty of other data was exposed, including text messages, photos, app data that didn’t turn on Data Protection, contacts, and much more.

I detailed all of these limits in my iOS data security papers, because they were major security headaches for my enterprise clients. If you really wanted to secure data on an iOS device, you needed to be extremely careful and use all sorts of additional security controls. That’s why I traveled to China using only “clean” devices that didn’t have access to my normal accounts or data.

iOS 7 expanded Data Protection to all third-party apps (even those that don’t use the Data Protection API), and iOS 8 expanded it to all Apple-provided apps, including Messages, Camera, etc.

Apple didn’t deliberately lock out law enforcement to protect pedophiles; it added a much-needed security control to protect all of us from common criminals.

The Bane of Backdoors -- It shouldn’t be a surprise that some law enforcement officials consider it their right to access our phones. Since the dawn of civilian law enforcement, we have granted police investigative powers: the ability to access anything outside our heads, with a proper court order.

With probable cause, the police can read our mail, enter our homes, listen to our phone calls, and more. That’s how many of them view the world — not with totalitarian malice, but as a legal privilege entrusted to them to protect society. Get the warrant, get the order, get the information. It’s all part of the legal process.

But technology advances have created a modern society that is no longer so clean. There is a term in the defense world called “dual use.” It describes technologies that can be used for good or evil. As a ski patroller, I used explosives to trigger avalanches. The same explosives are used in construction and in some military munitions.

Law enforcement, especially federal law enforcement, has a history of desiring and imposing backdoors into technology. The Communications Assistance for Law Enforcement Act (CALEA) of 1994 requires telecommunications equipment manufacturers to build remote wiretapping capabilities for law enforcement into their hardware. But CALEA backdoors have also been abused by criminals and intelligence agencies.

Last year, the New York Times revealed that the Obama administration was on the verge of backing CALEA-II, which would require any Internet communications service to include a backdoor for direct law enforcement wiretapping. All chat programs, social networks, and even webcasting tools would have had to provide a secret entrance for law enforcement.

Even dismissing concerns over NSA monitoring, it is technologically impossible to build such backdoors without the possibility of abuse by attackers. All wiretapping interfaces are dual use, and subject to abuse. No one has ever created a monitoring technology immune from attack or misuse.

In a short-sighted attempt to maintain existing investigative capabilities, the FBI is pushing to reduce communications security for the entire Internet, which would also destroy the competitiveness of U.S. companies globally. Then again, Microsoft is currently being held in contempt of court for not providing U.S. law enforcement private email messages from a customer in Ireland, from a data center in Ireland, even though doing so would violate Irish law. It’s a court case that could decimate the global market for all U.S.-based cloud computing providers.

A Rusty Key -- On 3 October 2014, the Washington Post Editorial Board proposed that Apple and Google build in a “Golden Key” that would give law enforcement, and only law enforcement with a court order, access to phones. That isn’t something I’m opposed to in concept; the problem is that it’s technologically impossible.

I don’t know a single security expert or cryptographer who believes a secure golden key is viable, especially not one that would be accessible to a range of law enforcement officials, over time, across countless cases. It simply can’t be done. The closest equivalent is the fourteen holders (and seven backups) of the Internet DNS signing keys, a system that works only because of their geographic distribution and infrequent meetings. A law enforcement equivalent could never handle the volume of ongoing requests.

And that ignores the international implications of creating a backdoor to all phones, one that could be abused by any government.

Apple clearly sees its security as a competitive advantage, as I wrote months before Tim Cook’s message. Especially since Google, by the nature of its business model, can never maintain privacy as well as Apple (despite Apple’s recent stumbles), no matter how strong Google’s security capabilities. (Remember that privacy and security are not the same thing!) But all that marketing doesn’t change the fact that Apple closed a long-known security flaw first, and leveraged the marketing later.

Attorney General Eric Holder claimed Apple and Google are “thwarting” law enforcement’s ability to stop child abuse. I have witnessed such abuse. Perhaps not on the scale of an FBI agent, but I know what horrors lurk in the world, and I have directly seen the consequences. These are things I don’t talk about outside the circle of former coworkers who have been there themselves. No matter how statistically rare it is (and most abuse is by a family member), like all parents I fear for my own children. Being on what felt like a 24/7 suicide watch for the first years of their lives — fretting about SIDS, the consumption of inedible objects, power outlets, and other infant dangers — is hard to forget. If something happened to them, I wouldn’t care about what laws or social norms impeded my ability to protect them, and I know the feeling of helplessness when you lack the resources or ability to save someone else’s child. These are powerful, emotional forces.

I fully understand the drive and motivations the law enforcement community has to maintain access to our devices. That access speeds up or breaks open cases. It enables police, at times, to better protect us from the worst the world has to offer. I understand how they can perceive Apple and Google as interfering with their ability to collect data they are legally entitled to access in the course of their duties. Data that could, at times, save lives.

But law enforcement needs to understand that technology companies aren’t trying to protect the bad guys, but stop them. That until iOS 8, I had to walk my clients through the iOS security loopholes that made it difficult to protect corporate and personal data. That such backdoors are already used to suppress free speech throughout the world, sometimes with fatal consequences. That without this encryption, we are all less secure.

Society is in the midst of a major upheaval powered by technology. One where the lines of privacy, civil rights, and the role of government are shifting as we fundamentally alter our social and communications structures. We need to decide if, as we make this transition, we will provide our governments pervasive access to all of our information, which simultaneously reduces our collective ability to defend ourselves from criminals. And we need to realize that if we err on the side of stronger inherent security, some criminals, even the worst of them, will occasionally get away.

 

 

Comments about Apple and Google Spark Civil Rights Debate

"However, this hardware is rate-limited, so once you hit about six characters in your passcode, it can take many years (even centuries) to break the code via brute force."

If you have enabled "Erase Data" under Passcode, doesn't it erase the iPhone data (or throw away the encryption key) after 10 incorrect attempts to enter a passcode? Or does this allow brute-force code break attempts?
Michael E. Cohen (TidBITS Staffer)  2014-10-11 12:46
The ten tries are handled by iOS, and assumes that someone is just entering the passcode attempts on the lock screen. This feature is designed to thwart common thieves.

What Rich is describing involves making the code-break attempts directly on the embedded processor, something that the law enforcement forensic specialists do in a lab with specialized hardware tools.
Thank you.
Thank you for a much needed article!

You show a lot of restraint dealing with the excessive demands (and idiotic arguments) of the law enforcers you mention. They are acting as if we were all presumed guilty unless proven otherwise.

There are very good reasons why the "presumed innocent" principle has been around for thousands of years.

The US used to be the land of the free. Giving that up for a vague promise of increased security is like committing suicide out of the fear of dying.
ronman  2014-10-14 00:44
An excellent, well-written article that shows a clear understanding of both sides of all of the issues.
Chap Harrison  2014-10-14 00:45
Note - having trouble parsing the final sentence. Should "knowing that" be either omitted or replaced with "then"?
Adam Engst (TidBITS Staffer)  2014-10-14 12:17
My fault - I was editing late last night before pushing the issue out and I left two extra words in. Fixed now.
They have only themselves to blame!

Where was the FBI and all the oh-so-honest-and-well-intentioned law enforcement officials' objection to Patriot Act, mass wiretapping, torture, abduction and even murder of US citizens during all the post-9/11 hysteria?

Where did they warn that government abuse of such powers would in the future limit their capabilities to protect citizens?

Nowhere. The result is nobody trusts the government because we know we've been lied to before and had our constitutional rights violated repeatedly.

You reap what you sow. No use crying about it now.
Also, why are they going after Google or Apple? If they can prove probable cause in front of a judge they'll get a court order. If the phone owner/user refuses to open the device to the authorities they will be in contempt and go to jail. Simple as that.

Asking Apple to somehow give them backdoor access is like the police commandeering a door manufacturer to break down a door because the home owner didn't leave it wide open.
john Springer  2014-10-14 02:47
Bingo. Right answer.
Also, excellent point by Rich here.

Inserting such backdoors will probably threaten the lives of thousands of activists in authoritarian states all over the world.

How is it that their lives are worth sacrificing just because the FBI director has painted some vague hysterical scenario where only his backdoor will save your child from an evil perp?
john Springer  2014-10-14 02:49
This is the most informative article on phone encryption issues I have ever read. I hope it gets widely read in legal and governmental bodies.
It's all FUD. After reading everything I can find on what Snowden released I don't believe any of them. I think it's all talk on all sides.

I appreciate the opinion piece. It's a rational one that no one seems to be making. How convenient for law enforcement to use the most evil thing as their example when virtually millions of accounts have been hacked to date. There's your security hole. It's highly MORE likely that someone who's cracking accounts is also involved in other nefarious deeds.
Bud Brown  2014-10-14 08:50
Let's not lose sight of the fact that law enforcement has always needed a warrant to look at your "stuff."

The beauty of this system is they can't use the sham FISA courts; they have to appear before an actual judge and get a legitimate warrant. Then, with that warrant, they can compel you to open up your iPhone for their inspection.
Anonymous  2014-10-14 09:01
Excellent article. It's particularly fortunate that the author's skill set supports the clear headed contentions of this piece.
Thanks for publishing.
paulguinnessy  2014-10-14 10:01
I haven't seen the feds pull out one single case in which they could claim that access to a phone helped them find a victim (they did bring one up, but then backtracked on it a few hours later).

If Apple and Google complied with the wishes of government agencies and built a back door for them, this won't change anything for the better.

Any competent developer can build a messaging app with encryption. Pedophiles and terrorists would just build them and use them.

The end result is criminals still get encrypted communication, while law abiding citizens have their data accessible by law enforcement and hackers who can break the back door.
Thanks for this opinion piece. We need more articles like this, to help debunk the so-called "protectors". Who will protect us from the protectors?

For more on these topics, I recommend Glenn Greenwald's "No Place to Hide".
Alan Sanders  2014-10-15 20:03
As far as I'm concerned, proponents of the "I don't have anything to hide" attitude toward gov't surveillance can go jump in a lake. In a democracy, it's the gov't that is accountable to the people—not the people who are accountable to the gov't. If we do not in fact live in a democracy, it's about time we stopped pretending that we do! The American people need to start standing up for their rights instead of burying their heads in Facebook and Twitter.