Thoughtful, detailed coverage of the Mac, iPhone, and iPad, plus the best-selling Take Control ebooks.

Why Apple Defends Encryption

The Intercept recently reported that Apple CEO Tim Cook, in a private meeting with White House officials and other technology leaders, criticized the federal government’s stance on encryption and technology back doors (see “Tim Cook Confronts the White House Over Encryption,” 14 January 2016). As it was a private meeting, we don’t know exactly what happened, and The Intercept is admittedly biased on this issue, but such statements would certainly align with Cook’s previous public positions. This is just the latest of Apple’s spats with the U.S. government over encryption — I first wrote about them in “Apple and Google Spark Civil Rights Debate” (10 October 2014).

Cook’s dustup with the White House prompted Daring Fireball’s John Gruber to ask:

This came up during last night’s Republican primary debate — not about tech companies refusing to allow backdoors in encryption systems, but about Apple specifically. Tim Cook is right, and encryption and privacy experts are all on his side, but where are the other leaders of major U.S. companies? Where is Larry Page? Satya Nadella? Mark Zuckerberg? Jack Dorsey? I hear crickets chirping.

These aren’t nebulous questions from privacy activists or Apple enthusiasts. We are in the midst of fundamentally redefining the relationship between governments and citizens in the face of technological upheavals in human communications. Other technology leaders are relatively quiet on the issue because they lack the ground to stand on: not because of personal preferences or business compromises, but because of their business models, and a lack of demand from us, their customers.

(Apologies to our international readers, but this article is U.S.-centric since the issues vary depending on where you live. That said, a number of other governments, like those of the UK and France, are dealing with exactly the same situation.)

Why Governments Want to Read Your Mind -- When we talk about encryption and privacy from government surveillance, we’re really discussing three separate, but related, issues:

  • Our right to use technology that keeps our data and activity private from law enforcement agencies.

  • The legal right and capability of our intelligence agencies to monitor our online activities and communications.

  • Intelligence and law enforcement agencies either mandating or creating (legal or not) “back door” access mechanisms in commercial products and services to monitor activity.

Encryption plays a massive role in each of these issues, but it’s only part of the story.

Civilian law enforcement agencies that serve “the law,” not government leaders, are actually a relatively recent development in the history of the world. The concept is “policing by consent,” which comes from the Peelian Principles of policing. Law enforcement is separate from the military and from intelligence agencies. We give police extraordinary, but not unlimited, powers to allow them to enforce the law and protect citizens. In the United States, multiple agencies at multiple levels (local, state, and federal) interoperate with the judiciary to create a series of checks and balances on power.

In America, law enforcement agencies at all levels have a long history of accessing personal information as part of their investigations. Police are accustomed to obtaining evidence from nearly any source, with the appropriate legal authority, via a process called “lawful access.” The police can crack a safe, tap your phone, read your mail, access your financial records, and more, as long as they have the right authority. That may require asking a judge for a warrant (to read your physical mail); in other situations, the information is available using techniques with lower evidentiary standards (tracking your car with GPS, since public movements are… public). Regardless, all these situations are within the framework of the law. There are even laws that mandate telecommunications companies create back doors for lawful access.

The problem for the police is that new technologies block their ability to access information they need (or at least think they need) to do their job. Mobile phones, for example, have become one of the best information sources in law enforcement history since they consolidate a suspect’s communications and personal data into one tidy package. Sure, police can get text, call, and location histories through a phone carrier, but it’s much faster and easier to pull it all from the phone, which also likely contains Facebook posts, encrypted iMessages, email conversations, and a lot more.

This is the first time in history we have civilian communications and information storage devices that law enforcement can’t access, even with a warrant. That’s a slight exaggeration, since you can be compelled to unlock your phone (your passcode can’t be tortured out of you, but a judge can toss you in jail until you give it up). And as I said, most — but not all — of the information on a phone is typically available in other places, but getting it elsewhere is far more time consuming than looking on the phone.

Law enforcement officers see strong encryption as a tool that impedes their ability to do their job, blocking access they have never previously been denied.

Intelligence agencies are different. They aren’t supposed to monitor U.S. citizens (except for a few very narrow exceptions). Even the bulk data collection that’s come to light in recent years has limits and is predominantly focused on monitoring foreign communications, including those into and out of the United States. Intelligence agencies don’t enforce the law; they spy on other nations and potential threats. But in recent decades they’ve faced a massive legal and logistical problem — a large portion of the technology they need to use relies on products and services that originate in the United States.

If intelligence agencies need to tap bad guys’ email communications in Europe, they often need to crack into the likes of Google, Microsoft, and other services. Or at least tap them internationally, since they are explicitly not allowed to tap domestically. But as those who understand how the Internet works know, the lines between domestic and international aren’t always clear.

And “tap” is a misnomer. Technology companies use encryption to protect information and transactions from attackers. That same encryption is powerful enough to impede intelligence agencies, or at least increase the costs for them to crack it. So they may know about some software vulnerabilities that let them break into systems. Or maybe they look at putting in a secret “back door.” Except there are no secrets, and every back door is a security vulnerability just waiting to be exploited. Plus, deliberately introducing vulnerabilities or back doors in domestic systems is not only likely a violation of law, but also of the policies and operating procedures of the intelligence agencies themselves.

Both law enforcement and intelligence agencies face the same fundamental problem. Any tool that enables lawful access also enables unlawful access. There are no Golden Keys, only skeleton keys.
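To see why there are no Golden Keys, here is a minimal Python sketch (a toy XOR construction built from SHA-256, not a vetted cipher, and every name in it is invented for illustration): whatever escrowed key grants the police lawful access grants identical access to anyone else who obtains it.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from a key by hashing the key
    with a counter (toy construction, for illustration only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    ks = keystream(key, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

decrypt = encrypt  # XOR is its own inverse

# A mandated "back door" means some escrowed key can decrypt everyone's traffic.
ESCROW_KEY = b"golden-key-held-in-escrow"

message = b"meet at noon"
ciphertext = encrypt(ESCROW_KEY, message)

# Lawful access: police holding the escrow key recover the message...
assert decrypt(ESCROW_KEY, ciphertext) == message
# ...but so does ANY party who steals or leaks that same key.
stolen_key = ESCROW_KEY  # one breach, and every message is exposed
assert decrypt(stolen_key, ciphertext) == message
```

The math cannot distinguish a warrant from a thief: the key works for whoever holds it, which is the sense in which every back door key is a skeleton key.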

Thus:

  • While we have a legal right not to incriminate ourselves to law enforcement, law enforcement has historically had access to anything we have ever recorded or communicated. Until now.

  • Intelligence agencies are tasked with spying on foreigners in order to protect us, but the lines between “foreign” and “domestic” are blurrier than ever.

  • Back door access for law enforcement (and, sometimes, intelligence agencies) has previously been legally mandated. But every back door is not only a vulnerability, but also potentially a tool for abuse by criminals or hostile governments.

To top it all off, existing laws in many of these areas are often unclear or obsolete, at a time when the ability to access our devices and services is effectively akin to reading our minds. These fights over our online rights are fundamentally redefining the relationship between citizens and governments.

Why Apple Takes a Stance Where Others Don’t -- Apple is far from the first company to find itself in government crosshairs. BlackBerry, for example, was forced to decrypt communications for the Indian government in 2010 or shut down operations in the country. Those same tools have since expanded and are used in other countries to monitor BlackBerry devices.

Let’s go back to Gruber’s list: Google, Microsoft, Facebook, and Twitter. I’ll add Amazon and Samsung. In each case, these companies’ business models don’t give them the same freedom to speak out as Apple enjoys.

Google is fundamentally an advertising company that collects data on its users. That information can’t be encrypted so only the user can see it, since that would prevent Google from accessing it and using it for targeted advertising. Even removing the ad issue, some of Google’s services fundamentally won’t work without Google having access to the underlying data. Google is taking a stronger stance with Android encryption, at least on the technology side (slowly, because Google doesn’t control most Android hardware). But Google isn’t vocal about this since all its data is accessible with a warrant. It isn’t in the company’s interest to call attention to this fact. However, I do know that Google does whatever it can to prevent spying and other monitoring, such as encrypting all communications between its data centers.

Microsoft is fundamentally a software company. The firm already offers strong encryption for PCs (BitLocker), but it isn’t consumer friendly and Microsoft owns only a tiny fraction of the mobile market. Its biggest customers, corporations and governments, are long used to monitoring Microsoft platforms for legitimate enterprise security reasons. Of all the companies here, Microsoft is in the best position to back up Apple, but Microsoft has such a long history of working with, and selling to, government that the company shies away from public conflict on this issue. Outside of the public eye, Microsoft is currently being held in contempt of court as the company battles the Department of Justice to protect user data stored overseas in a case that could define the future of cloud computing.

Facebook and Twitter? All data in social networks is accessible via lawful access (heck, most of it is effectively public anyway). Like Google, these companies are essentially ad platforms that need access to our data. While Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey may support strong encryption as individuals, there isn’t anything their companies can do about it on a practical level. Amazon? It doesn’t sell the hardware that matters, and as primarily a retailer, it isn’t in a position to do or say anything that will make a difference. (The exception is in Amazon Web Services, which makes extensive use of encryption, including government-proof options.) Samsung? Samsung is a foreign company that the U.S. government would be unlikely to take seriously.

Massive tech companies like Cisco, IBM, Oracle, and even HP simply aren’t in the right part of the market to advocate on behalf of consumers, even if they wanted to. And like Microsoft, they have a lot of government contracts.

All these companies place a high priority on the security of their products and services, but, for the most part, they can’t build things that allow us to control our information and keep it private.

And again, to be perfectly clear, any time you enable monitoring, you reduce privacy and security. Every back door is a security vulnerability. This is the technology equivalent of a law of physics, not some technical problem we haven’t solved yet.

Apple is nearly unique among technology firms in that it’s high profile, has revenue lines that don’t rely on compromising privacy, and sells products that are squarely in the crosshairs of the encryption debate. Because of this, everything Apple says about encryption comes from a highly defensible position, especially now that the company is dropping its iAd App Network.

Not everything Apple makes is immune from monitoring. The company must still comply with government requests for data that can’t be encrypted and lives on Apple servers, particularly for services like iCloud Mail, iCloud Drive, and iCloud Photo Library, all of which are subject to lawful access. Wherever possible, Apple uses strong encryption that even it can’t access, as long as the technologies and user experience align.

And Apple, like most of these companies, only provides data when all the right legal boxes are checked. To do otherwise would expose the company to lawsuits.

There’s probably even more to Apple’s stance on encryption than the company’s business model and desire to promote a competitive advantage. My opinion, without having ever talked with Tim Cook, is that this is at least partially social activism on his part. I suspect that this is an issue he cares about personally, and he has the soapbox of one of the most powerful and popular companies in the world under his feet.

We can hope that Larry Page, Satya Nadella, Mark Zuckerberg, Jack Dorsey, Jeff Bezos, and other tech CEOs will someday also speak out on these issues. But their personal feelings aside, they aren’t in the same position as Tim Cook and Apple to take a stand.

Now is the time when we get to decide whether we have a right to privacy and security, and what the limits of our government should be, for the digital age. It won’t happen because of public statements by tech leaders. No, it’s up to us to make our opinions about online privacy and security known to our elected representatives, in order to determine the limits of policing (and protecting) by consent.

In fact, you have an opportunity to weigh in right now. A bill has been introduced in New York State that would ban the sale of smartphones within the state unless they can be decrypted and unlocked by the manufacturer. It’s astonishingly misguided, and for those who want to express their disbelief that elected representatives could be so ignorant of technology (and geography), you can set up an account with the New York State Senate, vote against it, and even leave comments. The state of California has introduced an equally asinine bill, though there isn’t any sort of feedback mechanism other than contacting your state representative.

Then, just sit back and wait for the next ignorant statement or misguided piece of legislation, because these issues aren’t going to be resolved easily, quickly, or definitively.

 

 

Comments about Why Apple Defends Encryption
(Comments are closed.)

Jeff Porten  An apple icon for a Friend of TidBITS 2016-01-20 19:44
One quibble: for 150 years, communications and storage were protected by the Fourth Amendment, and the relatively harder process of getting access to paper. Electronic surveillance started with copper wire, and the technological possibility of tapping it.

Between the 1930s and 1970s, law enforcement rhetoric changed from "we do this because it's possible" to "we do this because it's impossible to do our jobs without it." The latter is what's firmly entrenched today, but historically, a *political* argument could be made, "do what you can, within the limits of what's possible."

Resurrecting that argument is probably going to be the legal and political battle of the next 20 years.
Ian Crew  2016-01-20 22:09
The thing is that a law like this would only catch the stupid criminals. These laws always treat encryption like some big, mysterious thing that's exclusively under the control of the device manufacturers. As any of us who have ever taken an introductory computer science theory course at the university level are very aware, nothing could be further from the truth: the mathematical theory and algorithms for writing secure encryption are really pretty simple, and are widely and well known. See https://en.wikipedia.org/wiki/RSA_(cryptosystem), for example. So if laws like this are passed, it'd be a simple matter for someone else--especially someone beyond the jurisdiction of these laws--to create secure messaging and storage apps that don't have a back door.

This is yet another example of how the lack of technical knowledge amongst our lawmakers is a big problem.
Just a small addition to your statement: it's not just that people can create these apps; they have created them already (and they're not that hard to create, either):
http://www.kitguru.net/gaming/security-software/jon-martindale/daesh-has-its-own-encrypted-chat-app/

Maybe the intelligence agencies should worry more about putting backdoors on those...
Norm M  2016-01-21 17:47
This is the main argument, in my opinion. The bad guys can always encrypt securely, so adding backdoors just removes our privacy for nothing! The article should have made this point! Even if encryption were banned from the App Store (a terrible idea), criminals could still get apps that encrypt without backdoors loaded onto their phones by someone using Xcode (or by jailbreaking).
Adam Engst  An apple icon for a TidBITS Staffer 2016-01-22 10:28
I'll add a bit about that - the article isn't so much about why back doors are stupid as why Apple is the one defending encryption in public.
Don't forget the other big player in this controversy: the Electronic Frontier Foundation. The EFF was created by some of the most important pioneers in the industry, and they do a great service.
Even before electronic communication, letters could be sent in code. So it's not really the first time that police can't access communications, even with a warrant.
Giorgio from Italy  2016-01-21 10:27
There is an elephant in the room here: encryption protects us not only from the government but also from others, like competitors, criminals, data miners, etc.
Compromising encryption for the sake of government needs will compromise it for criminal needs as well.
How difficult would it be for a criminal to exploit the same hole the government will use to decrypt data?
Is that what we want, an insecure security?
Michael  2016-01-21 11:44
So when a backdoor is breached by a nefarious party, who's going to be held accountable and liable? Is the governing body that mandated it going to stand up and take the blame? Governments are highly hypocritical agencies, and I foresee them taking Apple to task for the breach. The same is happening in the EU now as they look back on poorly written tax laws and want to point blame at those who benefited from them rather than chastise those who wrote them.
AnoneemousOne  2016-01-21 16:18
You know something is wrong when the US Government tries to sell the notion that to defeat terrorists, the first solution is to take privacy and digital security away from Americans.
Darren P Meyer  2016-01-21 18:05
There's also a cultural component. LE has generally had access to things written or recorded, but not informal conversations (for practical reasons—there's no record of them). Now our informal conversations are often saved in text messages. LE's technical abilities are at odds with our cultural expectations about ephemeral conversation.
Rich,

I think it's a strategic error to lump encrypted *storage* and encrypted *communications* into the same discussion. There's a long history of case law (and the 5th amendment) backing up the right to encrypt your own data, but as soon as you transmit that data over a network that is even partially funded through tax dollars, any expectation of privacy goes right out the window (at least, that's the way I've always behaved). The courts have a right to issue and execute warrants for such data. If you want privacy, build your own secure network to transmit and receive encrypted data. Just be sure it doesn't make use of any resource that belongs to "we the people" (radio frequencies, wires, etc.)
Nicholas Barnard  2016-01-22 20:02
Well. If data is encrypted in flight, the gov't is more than welcome to issue a warrant to obtain the encrypted data, but it's not decrypted (then potentially re-encrypted) until it's on a device that people have locked.

As was said above, letters in the past may have been written in code.
Tommy Craft  2016-01-22 13:10
This is a great article, and I can't wait to share it amongst my circle of friends. I did have one quibble though:

From the article: "...since you can be compelled to unlock your phone (your passcode can’t be tortured out of you, but a judge can toss you in jail until you give it up)."

However, that isn't true. Per the Virginia State Court: "...if the phone has a “touch-to-unlock” feature, suspects must use their finger or thumb to unlock the device (or otherwise provide a fingerprint that will let police do so) but, thanks to the Fifth Amendment, they can’t be compelled to turn over the phone’s passcode too." -https://gigaom.com/2014/11/04/suspect-must-use-finger-to-unlock-phone-as-debate-shifts-over-device-privacy/

I seem to recall the EFF releasing a similar statement, that 5th amendment protection extends to the pass code, but not the fingerprint.
Nicholas Barnard  2016-01-22 20:05
I was also going to say the same thing.

It'd be nice to be able to tweak the requirements for requiring a passcode. AFAIK it's required at startup and when the device hasn't been unlocked for three days, and is there a limit on how many bad fingerprint reads are allowed?
Simon  An apple icon for a TidBITS Contributor 2016-01-23 01:33
I can't really see why one would make such a distinction. The fingerprint is just a more convenient way to enter the passcode. If you cannot compel someone to give their passcode, why should you be able to force their finger onto the reader?

Self-incrimination clauses like the one in the 5th amendment were introduced in many countries in order to (among other things) ensure that no torture is applied. The idea that people could have their fingers forcefully held to the fingerprint reader against their will in order to produce evidence against themselves brings us eerily close to those times again, doesn't it?
Sally Shears  2016-02-18 16:14
Yes, it's a small number, on mine it's four bad finger touches. After that, phone requires the passcode.
Stephen Duncan  2016-01-26 21:24
They already have the right to take your fingerprints on paper or a scanner. The phone is no different.
Simon  An apple icon for a TidBITS Contributor 2016-01-27 15:29
It is absolutely different.

They have the right to take fingerprints for identification purposes.

But forcing your finger onto the Touch ID reader has nothing to do with identification. It's about compelling you to produce evidence against yourself. And that's something the 5th amendment explicitly protects you from.
The secret services' spying schemes have nothing to do with terrorists or criminals and never have. The sheer scale of them shows this. Logic dictates that the most common subject spied upon is the intended target, and that's ordinary civilians. This should be glaringly obvious.

Terrorists and criminals are rarely considered a threat to a state (although they are of course presented as such), but its civilians are. So states have always wanted to know what their subjects were thinking and whether any threats could emerge from that. States of old tried to suppress freedom of expression so ideas could not reach critical mass. When that didn't work anymore, they switched to managing freedom of expression. When that didn't work anymore, they switched to managing people's ability to come to an informed opinion. In order to do that, you have to know what people think in detail. In this day and age, that means Big Data to the rescue!

9/11 removed the last vestiges of protection from that.
David Byrum  An apple icon for a TidBITS Supporter 2016-01-25 22:01
Thanks for an informative and well written article about a timely topic. As others have said I too will be passing this along to friends to read and consider. In this day and age, it is a bit uplifting to read an articulate and factual article! Thanks for that!
Adam Engst  An apple icon for a TidBITS Staffer 2016-01-26 09:23
Thanks for the kind words!
William W Morgan  2016-01-29 17:48
It is the agreement of Homo sapiens that it is okay to control the life and property of others that is the cause of all our societal problems. It requires individuals to take actions inconsistent with the laws of nature under which we evolved as a thinking species.

It requires a "system of laws" telling an individual what he may do, may not do, and must do. And when this creates problems, the "solution" is "more laws, 'better' laws."

There are only two ways this can end. One is that we stop making those unnatural laws. But why? Who will decide? What about existing laws? Does a thinking species really need such laws?

Well, only one way, really. And that is simply an understanding: that it is best, and in individual self-interest, that we try to avoid interfering in and controlling the lives of others, since once that starts there is no end. Better to spend the resources of our lives doing those things that make a good and better life for ourselves, without doing things that interfere in this process for others.

The other possibility is that we continue with our current system and, eventually, all human action is prescribed. Thinking not required, not allowed.

What can be the purpose of such a life? Obey the rules and the Ruler? Is this the destiny of all thinking species? The result of 3.8 billion years of evolution from single-cell life to multi-trillion-cell thinking life?
B. Jefferson Le Blanc  2016-01-30 10:43
Benjamin Franklin foresaw this problem over 250 years ago (before the United States was born):

"They that can give up essential liberty
to obtain a little temporary safety
deserve neither liberty nor safety."

The context was different but the principle is the same. As governments have grown larger and more powerful, tyranny has become ever more subtle and insidious. And I'm not talking black helicopters, here, either. The current highly unequal state of the criminal justice system in the US is a consequence of exactly the kind of compromises Franklin was talking about. Fear of crime turned out to be fear of Black crime and laws were fashioned by our elected officials that effectively punish African Americans far more severely than whites for similar offenses. This also has to do with how the laws have been enforced, with the built-in but rarely acknowledged racial bias of the ruling majority. Nowhere is this bias more evident than in the now well publicized pattern of police violence against unarmed black civilians.

Foreign sponsored terrorism is the tool of choice for fear mongers today, and we are seeing demands for harsh and unequal justice in the treatment of immigrants, both legal and illegal. Fear is a natural human response to threats both actual and perceived, and it has always been used by demagogues to enhance their own power and influence.

However, not everyone who tries to exploit our fears and anxieties does so for obvious personal gain. In the overgrown security industrial complex within the US government many simply cannot see the forest for the trees. To mix metaphors, to them every problem is a nail and their tool of choice is a hammer. Unfortunately, among those well insulated bureaucrats, checks and balances are feeble at best. When top government officials can claim that waterboarding is not torture, all our freedoms hang by a thread.

So in this instance, in my opinion, Tim Cook is on the side of the angels. And, with all due respect to the EFF, we should be grateful to have such a titan of industry on our side, with the wealth and influence of the world's largest and most profitable corporation behind him. Long may he serve.
Great perspectives in this blog, thanks Rich…
It's hard not to view this court order as a follow-up to last year's meeting between technology leaders and the White House over encryption. When you have a single company that strongly disagrees with building a back door into its software, the government will compel that company to do just that. What makes such action possible is pretty much summed up in John Gruber's statements, quoted in your article. The fact of the matter is that no other technology companies protested at the referenced meeting.
You are correct that most companies' business models do not permit encryption of customers' data. Implementing end-to-end encryption by default for all, or even most, user data streams would conflict with the advertising model and presumably curtail revenues. Apple might be the only company with a different business model, hence its strong support for device encryption.
One would think that Microsoft shares this business model with Apple, due to its market share, but that's not the case. Microsoft has jumped on the advertisement revenue stream, which is quite evident in the latest version of Windows. While Google, Facebook, Twitter, etc., currently generate most of their user data stream from users' browsing habits, Windows 10 is positioned to generate a user data stream from users' computer usage and to become #1 in this respect. It's not a new business model; Google has been doing it for a long time with Android, Chromebooks, and the Chrome browser. Microsoft is just following Google's lead.
Microsoft has also retrofitted Windows 7 and 8.x with the "telemetry" functions of Windows 10, via "important updates." The telemetry function calls home rather frequently to Microsoft servers, seemingly built on the "command-and-control server" model, a.k.a. the puppet masters that govern malware. Maybe the model in this case is intended to control people instead of malware. In either case, Windows 10 is probably a wet dream for law enforcement agencies.
It's not hard to see that Apple is the last company that didn't give in to the government's constant pressure, while others yielded with little or no resistance. That makes Apple very vulnerable to future actions by the courts and various government agencies such as the IRS, DOJ, SEC, etc. Tim Cook may have strong beliefs about privacy, but it remains to be seen whether the board of directors, stockholders, etc. will believe the same when Apple's income statement drops substantially due to various pending cases. In other words, the beating will continue even if Apple fights the current court order and wins. And that's a big if…
Jose Jimenez  2016-02-18 11:13
Do people really believe that Apple is not cooperating fully with the US government (and other governments) when it comes to providing Apple customer device data? Any claims otherwise by Apple are just a smoke screen to make it appear that Apple has its customers' interests at heart.

Apple CEO Tim Cook's recent comments about Apple's reluctance to assist the US government to decode the contents of a specific iPhone could very easily be just scripted per US government demands. Apple could easily work with the US government in a covert fashion and the Apple-US government alliance would provide the necessary stories to avert disclosure of such an alliance.

The US government has a tremendous amount of control over companies located in the US. If a company such as Apple really refuses to do something for the US government the government can put a lot of under the table pressure on the company. For example, the US government could easily delay FCC certification for Apple devices which would stop the introduction of these devices and cost Apple a lot of money. Apple executives could also be audited by the IRS.

So, please don't believe what companies like Apple say about protecting their customers' device data since their only real interest is the company and not its customers.
Or, you know, maybe uphold a $450 million settlement against Apple in an ebook case. Like, the same day Cook posts his open letter. That kind of pressure?