This article originally appeared in TidBITS on 2016-01-20 at 8:41 a.m.

Why Apple Defends Encryption

by Rich Mogull

The Intercept recently reported that Apple CEO Tim Cook, in a private meeting with White House officials and other technology leaders, criticized the federal government’s stance on encryption and technology back doors (see “Tim Cook Confronts the White House Over Encryption [1],” 14 January 2016). As it was a private meeting, we don’t know exactly what happened, and The Intercept is admittedly biased on this issue, but such statements would certainly align with Cook’s previous public positions [2]. This is just the latest of Apple’s spats with the U.S. government over encryption — I first wrote about them in “Apple and Google Spark Civil Rights Debate [3]” (10 October 2014).

Cook’s dustup with the White House prompted Daring Fireball’s John Gruber to ask [4]:

This came up during last night’s Republican primary debate — not about tech companies refusing to allow backdoors in encryption systems, but about Apple specifically. Tim Cook is right, and encryption and privacy experts are all on his side, but where are the other leaders of major U.S. companies? Where is Larry Page? Satya Nadella? Mark Zuckerberg? Jack Dorsey? I hear crickets chirping.

These aren’t nebulous questions from privacy activists or Apple enthusiasts. We are in the midst of fundamentally redefining the relationship between governments and citizens in the face of technological upheavals in human communications. Other technology leaders are relatively quiet on the issue because they lack the ground to stand on, not due to personal preferences or business compromises, but because of their business models and a lack of demand from us, their customers.

(Apologies to our international readers, but this article is U.S.-centric since the issues vary depending on where you live. That said, a number of other governments, like those of the UK [5] and France [6], are dealing with exactly the same situation.)

Why Governments Want to Read Your Mind -- When we talk about encryption and privacy from government surveillance, we’re really discussing three separate, but related, issues:

* Law enforcement’s ability to access evidence with lawful authority
* Intelligence agencies’ ability to monitor foreign threats
* Citizens’ rights to privacy and security in their data and communications

Encryption plays a massive role in each of these issues, but it’s only part of the story.

Civilian law enforcement agencies that serve “the law,” not government leaders, are actually a relatively recent development in the history of the world. The concept is “policing by consent,” which comes from the Peelian Principles [7] of policing. Law enforcement is separate from the military and from intelligence agencies. We give police extraordinary, but not unlimited, powers to allow them to enforce the law and protect citizens. In the United States, multiple agencies at multiple levels (local, state, and federal) interoperate with the judiciary to create a series of checks and balances on power.

In America, law enforcement agencies at all levels have a long history of accessing personal information as part of their investigations. Police are accustomed to obtaining evidence from nearly any source, with the appropriate legal authority, via a process called “lawful access.” The police can crack a safe, tap your phone, read your mail, access your financial records, and more, as long as they have the right authority. That may require asking a judge for a warrant (to read your physical mail); in other situations, the information is available using techniques with lower evidentiary standards (tracking your car with GPS, since public movements are… public). Regardless, all these situations are within the framework of the law. There are even laws that mandate telecommunications companies create back doors for lawful access [8].

The problem for the police is that new technologies block their ability to access information they need (or at least think they need) to do their job. Mobile phones, for example, have become one of the best information sources in law enforcement history since they consolidate a suspect’s communications and personal data into one tidy package. Sure, police can get text, call, and location histories through a phone carrier, but it’s much faster and easier to pull it all from the phone, which also likely contains Facebook posts, encrypted iMessages, email conversations, and a lot more.

This is the first time in history we have civilian communications and information storage devices that law enforcement can’t access, even with a warrant. That’s a slight exaggeration, since you can be compelled to unlock your phone (your passcode can’t be tortured out of you, but a judge can toss you in jail until you give it up). And as I said, most — but not all — of the information on a phone is typically available in other places, but getting it elsewhere is far more time consuming than looking on the phone.

Law enforcement officers see strong encryption as a tool that impedes their ability to do their job, blocking access they have never before been denied.

Intelligence agencies are different. They aren’t supposed to monitor U.S. citizens (except for a few very narrow exceptions). Even the bulk data collection that’s come to light in recent years has limits and is predominantly focused on monitoring foreign communications, including those into and out of the United States. Intelligence agencies don’t enforce the law; they spy on other nations and potential threats. But in recent decades they’ve faced a massive legal and logistical problem — a large portion of the technology they need to use relies on products and services that originate in the United States.

If intelligence agencies need to tap a bad guy’s email communications in Europe, they often need to crack into the likes of Google, Microsoft, and other services. Or at least tap them internationally, since they are explicitly not allowed to tap domestically. But as those who understand how the Internet works know, the lines between domestic and international aren’t always clear.

And “tap” is a misnomer. Technology companies use encryption to protect information and transactions from attackers. That same encryption is powerful enough to impede intelligence agencies, or at least increase the costs for them to crack it. So they may know about some software vulnerabilities that let them break into systems. Or maybe they look at putting in a secret “back door.” Except there are no secrets, and every back door is a security vulnerability just waiting to be exploited [9]. Plus, deliberately introducing vulnerabilities or back doors in domestic systems is not only likely a violation of law, but also of the policies and operating procedures of the intelligence agencies themselves.

Both law enforcement and intelligence agencies face the same fundamental problem. Any tool that enables lawful access also enables unlawful access. There are no Golden Keys [10], only skeleton keys.
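The skeleton-key problem can be illustrated with a toy key-escrow scheme. This is a hypothetical sketch, not any real system’s design, and the cipher is deliberately simplistic: each message gets a fresh random key, but a copy of that key is wrapped under a single escrow key so an authority can decrypt on demand. The catch is that the escrow key works identically for anyone who obtains it.

```python
# Toy key-escrow sketch (NOT real-world crypto). Each message is
# encrypted with a fresh one-time pad; the pad is "wrapped" under a
# single escrow key so an authority can decrypt without the user.
# Anyone who steals ESCROW_KEY gains the same power: the key cannot
# tell a lawful holder from a thief.
import hashlib
import secrets

ESCROW_KEY = secrets.token_bytes(32)  # the single "golden key"

def _wrap_pad(pad: bytes, nonce: bytes) -> bytes:
    # Wrap or unwrap the message pad with a SHAKE-256 keystream keyed
    # by the escrow key (XOR is its own inverse).
    stream = hashlib.shake_256(ESCROW_KEY + nonce).digest(len(pad))
    return bytes(a ^ b for a, b in zip(pad, stream))

def encrypt(plaintext: bytes):
    pad = secrets.token_bytes(len(plaintext))  # per-message pad
    nonce = secrets.token_bytes(16)
    ciphertext = bytes(a ^ b for a, b in zip(plaintext, pad))
    # The wrapped pad rides along with the ciphertext.
    return ciphertext, nonce, _wrap_pad(pad, nonce)

def escrow_decrypt(ciphertext: bytes, nonce: bytes, wrapped: bytes) -> bytes:
    # The "authority" -- or anyone who exfiltrated ESCROW_KEY --
    # recovers the pad and the plaintext without the user's help.
    pad = _wrap_pad(wrapped, nonce)
    return bytes(a ^ b for a, b in zip(ciphertext, pad))

msg = b"meet at noon"
ct, nonce, wrapped = encrypt(msg)
assert escrow_decrypt(ct, nonce, wrapped) == msg
```

The design flaw is structural, not an implementation bug: the wrapped pad must be decryptable by someone other than the user, so the security of every message collapses to the secrecy of one key.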


To top it all off, existing laws in many of these areas are often unclear or obsolete, at a time when the ability to access our devices and services is effectively akin to reading our minds. These fights over our online rights are fundamentally redefining the relationship between citizens and governments.

Why Apple Takes a Stance Where Others Don’t -- Apple is far from the first company to find itself in government crosshairs. BlackBerry, for example, was forced to decrypt communications for the Indian government in 2010 [11] or shut down operations in the country. Those same tools have since expanded and are used in other countries [12] to monitor BlackBerry devices.

Let’s go back to Gruber’s list: Google, Microsoft, Facebook, and Twitter. I’ll add Amazon and Samsung. In each case, these companies’ business models don’t give them the same freedom to speak out as Apple enjoys.

Google is fundamentally an advertising company that collects data on its users. That information can’t be encrypted so only the user can see it, since that would prevent Google from accessing it and using it for targeted advertising. Even removing the ad issue, some of Google’s services fundamentally won’t work without Google having access to the underlying data. Google is taking a stronger stance with Android encryption, at least on the technology side (slowly, because Google doesn’t control most Android hardware). But Google isn’t vocal about this since all its data is accessible with a warrant. It isn’t in the company’s interest to call attention to this fact. However, I do know that Google does whatever it can to prevent spying and other monitoring, such as encrypting all communications between its data centers.

Microsoft is fundamentally a software company. The firm already offers strong encryption for PCs (BitLocker), but it isn’t consumer friendly and Microsoft owns only a tiny fraction of the mobile market. Its biggest customers, corporations and governments, are long used to monitoring Microsoft platforms for legitimate enterprise security reasons. Of all the companies here, Microsoft is in the best position to back up Apple, but Microsoft has such a long history of working with, and selling to, government that the company shies away from public conflict on this issue. Outside of the public eye, Microsoft is currently being held in contempt of court [13] as the company battles the Department of Justice to protect user data stored overseas in a case that could define the future of cloud computing.

Facebook and Twitter? All data in social networks is accessible via lawful access (heck, most of it is effectively public anyway). Like Google, these companies are essentially ad platforms that need access to our data. While Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey may support strong encryption as individuals, there isn’t anything their companies can do about it on a practical level. Amazon? It doesn’t sell the hardware that matters, and as primarily a retailer, it isn’t in a position to do or say anything that will make a difference. (The exception is in Amazon Web Services, which makes extensive use of encryption, including government-proof options.) Samsung? Samsung is a foreign company that the U.S. government would be unlikely to take seriously.

Massive tech companies like Cisco, IBM, Oracle, and even HP simply aren’t in the right part of the market to advocate on behalf of consumers, even if they wanted to. And like Microsoft, they have a lot of government contracts.

All these companies place a high priority on the security of their products and services, but, for the most part, they can’t build things that allow us to control our information and keep it private.

And again, to be perfectly clear, any time you enable monitoring, you reduce privacy and security. Every back door is a security vulnerability. This is the technological equivalent of a law of physics, not some technical problem we haven’t solved yet.

Apple is nearly unique among technology firms in that it’s high profile, has revenue lines that don’t rely on compromising privacy, and sells products that are squarely in the crosshairs of the encryption debate. Because of this, everything Apple says about encryption comes from a highly defensible position, especially now that the company is dropping its iAd App Network.

Not everything Apple makes is immune from monitoring. The company must still comply with government requests for data that can’t be encrypted and lives on Apple servers, particularly for services like iCloud Mail, iCloud Drive, and iCloud Photo Library, all of which are subject to lawful access. Wherever possible, Apple uses strong encryption that even it can’t access, as long as the technologies and user experience align.
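The kind of encryption even the provider can’t undo can be sketched as follows. This is a hypothetical illustration with a toy cipher, not Apple’s actual design: the key is derived on the device from the user’s passcode and never leaves it, so the server stores only ciphertext it has no means to decrypt.

```python
# Hypothetical sketch of device-side encryption a service provider
# cannot undo. The key is derived from the user's passcode on the
# device; the server sees only salt, nonce, and ciphertext. Toy
# cipher for illustration only.
import hashlib
import os

def derive_key(passcode: str, salt: bytes) -> bytes:
    # Slow, salted key derivation from the passcode (device-side).
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

def xor_stream(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # Symmetric toy cipher: XOR with a SHAKE-256 keystream.
    stream = hashlib.shake_256(key + nonce).digest(len(data))
    return bytes(a ^ b for a, b in zip(data, stream))

salt, nonce = os.urandom(16), os.urandom(16)
key = derive_key("1234-secret", salt)
blob = xor_stream(key, nonce, b"private note")  # what the server stores

# Server side: salt, nonce, and blob are useless without the passcode,
# so a warrant served on the provider yields only ciphertext.
assert xor_stream(derive_key("wrong guess", salt), nonce, blob) != b"private note"
assert xor_stream(key, nonce, blob) == b"private note"
```

The trade-off the article alludes to follows directly: if the user forgets the passcode, the provider cannot recover the data either, which is why this model fits some services and user experiences better than others.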

And Apple, like most of these companies, only provides data when all the right legal boxes are checked. To do otherwise would expose the company to lawsuits.

There’s probably even more to Apple’s stance on encryption than the company’s business model and desire to promote a competitive advantage. My opinion, without having ever talked with Tim Cook, is that this is at least partially social activism on his part. I suspect that this is an issue he cares about personally [14], and he has the soapbox of one of the most powerful and popular companies in the world under his feet.

We can hope that Larry Page, Satya Nadella, Mark Zuckerberg, Jack Dorsey, Jeff Bezos, and other tech CEOs will someday also speak out on these issues. But their personal feelings aside, they aren’t in the same position as Tim Cook and Apple to take a stand.

Now is the time when we get to decide whether we have a right to privacy and security, and what the limits of our government should be, in the digital age. It won’t happen because of public statements by tech leaders. No, it’s up to us to make our opinions about online privacy and security known to our elected representatives, in order to determine the limits of policing (and protecting) by consent.

In fact, you have an opportunity to weigh in right now. A bill has been introduced in New York State that would ban the sale of smartphones within the state [15] unless they can be decrypted and unlocked by the manufacturer. It’s astonishingly misguided, and for those who want to express their disbelief that elected representatives could be so ignorant of technology (and geography), you can set up an account with the New York State Senate, vote against it, and even leave comments. The state of California has introduced an equally asinine law [16], though there isn’t any sort of feedback mechanism other than contacting your state representative.

Then, just sit back and wait for the next ignorant statement or misguided piece of legislation, because these issues aren’t going to be resolved easily, quickly, or definitively.