Last week Apple published an Open Letter on Privacy, in which CEO Tim Cook emphasized how Apple products are designed with customer privacy as a top priority. The letter and accompanying site detail some of Apple’s privacy-related practices and technologies, as well as the policies that the company mandates and follows with advertisers, partners, and governments. Apple also posted the same content to its various international sites. It’s worth a look.
In part, the site and the letter are a public relations move. Apple has taken a flogging in recent weeks in the wake of attackers accessing several celebrities’ iCloud photos. (Apple claims the attacks were due to compromised passwords and security questions, not inherent security flaws; see “iCloud Flaw Not Source of Celebrity Photo Theft,” 2 September 2014.) As a result, Apple is currently trying to reposition itself as a company worthy of consumers’ trust. Of course, the site also highlights new privacy-related technologies in iOS 8: it’s no coincidence this open letter was published alongside a massive Apple product launch.
At one level, the site shows how Apple is in the same boat as other technology companies: Apple is constantly storing and processing sensitive data for hundreds of millions of people, and must comply with lawful government orders to hand over information. However, Apple’s privacy pages also highlight how it is different from other technology giants, primarily because Apple earns most of its money selling products, rather than by selling advertisers access to its customers.
Do these differences matter to you? The answer depends on how you (and those you know) use technology and online services.
A few key points emerge:
- Apple believes users should control how their information is collected and used. One way this manifests in iOS is by requiring explicit permission before Apple (or anyone else) can collect data — and users can later revoke those permissions. The stance is a bit risky: pesky prompting is a bad user experience, and people quickly learn to click (or tap) through alerts without reading them. (Think: when did you last read the iTunes End User License Agreement?) However, at least users can review permissions granted to iOS apps and revoke them if they like. Not comfortable with a Twitter client tapping into your photos? Deny it access in Settings. Same if you don’t want Siri knowing your location.
- Apple collects only information it believes it needs for a good user experience, anonymizes it when it can, and retains as little of that information as possible. If user data must be stored, where possible Apple encrypts it (say, on iCloud) and does not associate it with Apple IDs (for instance, Siri). If Apple no longer needs the information to provide services, it deletes the data. Apple also repeatedly emphasizes that it does not (and, in many cases, cannot) scan messages, photos, email, or documents on its services for any purpose, including compiling marketing profiles, mapping social or business connections, or selling products.
- Apple tries to limit the ways nefarious (or merely unscrupulous) app makers and third parties can abuse its products and services. For instance, Apple contractually prohibits app makers from collecting the Advertising IDs of iOS users who limit ad tracking. (Further, iOS users can reset their Advertising ID at any time.) While Apple cannot police the practices of every third-party developer, other things are within the company’s control. In iOS 8, Apple is randomizing the internal identifiers (MAC addresses) used by Wi-Fi hardware. Why? Those identifiers are disclosed every time a device connects to a Wi-Fi network (or even scans for available networks); marketers, hotspot operators, and even governments have used them to track individuals without consent.
- If users set a passcode, iOS devices encrypt most of the user information they store. Apple introduced hardware encryption back in 2009 on the iPhone 3GS: all users had to do was set a passcode. Although that initial effort had severe shortcomings (see “iPhone 3GS Hardware Encryption Easy to Circumvent,” 7 August 2009), Apple has continued to improve on-device data protection. With iOS 8, Apple claims it is “not technically feasible” even for Apple to retrieve much of the information stored on a device protected with a passcode. This capability shouldn’t be viewed as a privacy panacea — attackers (and governments) can still brute-force passcodes to access data if they can get their hands on your iOS device. However, it does take Apple out of the business of extracting data from iOS 8 devices when law enforcement goes to the company with a warrant and someone’s iPhone. This device encryption has already raised concerns among law enforcement agencies, which argue that inaccessible on-device encryption makes mobile technology more appealing to criminals and could impede time-sensitive investigations, such as abduction cases.
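The brute-force risk in that last point is easy to quantify. Apple’s iOS Security guide describes a delay of roughly 80 milliseconds per passcode attempt enforced by the device’s hardware. As a back-of-envelope sketch (assuming that per-attempt delay, and ignoring iOS’s escalating lockouts and the optional erase-after-ten-failures setting), here is why passcode length matters:

```python
# Rough worst-case times to exhaust an iOS passcode keyspace, assuming
# the ~80 ms per-attempt delay described in Apple's iOS Security guide.
# Illustrative figures only: real attacks face escalating lockout delays.

ATTEMPT_DELAY_S = 0.08  # ~80 ms enforced per passcode attempt

def worst_case_hours(keyspace: int) -> float:
    """Hours to try every possible passcode at one attempt per delay."""
    return keyspace * ATTEMPT_DELAY_S / 3600

four_digit_pin = 10 ** 4   # simple 4-digit passcode: 10,000 combinations
six_alphanum   = 36 ** 6   # 6 characters, lowercase letters plus digits

print(f"4-digit PIN: {worst_case_hours(four_digit_pin) * 60:.0f} minutes")
print(f"6-char alphanumeric: {worst_case_hours(six_alphanum) / 24 / 365:.1f} years")
```

At that rate a four-digit PIN falls in under 15 minutes, while a six-character alphanumeric passcode holds out for over five years — a longer passcode, not just the encryption itself, does much of the real work.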
Apple says it does not allow any government agency from any country to access its servers “and never will.” Leaving aside whether agencies like the NSA already have clandestine access, the claim is precarious since the company operates in many countries — including the United States — that can legally compel such access. Apple just opened an iCloud data center in mainland China, subcontracting through state-controlled China Telecom. “Never” is a strong word: it will be interesting to see how Apple responds to any government demands for access to its servers. After all, not even Google will build data centers in China.
How Apple Compares — Apple seeks to differentiate its products and services from companies like Google and Microsoft on one primary point: selling access to advertisers. Google algorithmically monitors its users’ email, Web searches, contacts, and other online activity to compile information for advertisers. (Yahoo and Microsoft do much the same, although Microsoft no longer scans email.) Google users can tailor their ad preferences for a bit of control over what ads they see, but they cannot opt out of Google scanning their data, doing everything it can to determine their location, and compiling profiles that users cannot view, correct, or delete. Many of Google’s products and services may be free to users, but as Tim Cook noted in his open letter, “When an online service is free, you’re not the customer. You’re the product.” Google’s customers are advertisers. During the first half of 2014, 90 percent of Google’s revenue came from advertising.
This isn’t to say Google wants Android or its services to be insecure. Google has always made security a high priority, and has redoubled its efforts in recent years following cyberattacks and mass surveillance revelations. Android has offered user-data encryption on devices as an option since 2011, and Google will be making device encryption a default setting in Android L, due out in a few months.
The distinction is that while Google, Yahoo, Microsoft, Amazon, Facebook, and others want your data to be secure, they don’t want it to be private. They need to know as many specific things about their users as possible to feed their business models. Conversely, Apple claims to access only the data it needs for specific features and services, to use it only with users’ consent, and (where possible) to extend that stance to data that third parties can collect. If you don’t have a problem with an app (or a company) tracking your every move, that can be done on an iOS device as easily as on an Android phone. But that’s not how Apple’s apps and services behave by default.
To be fair, since Apple designs and markets products for the masses, little that Apple might learn about individual customers could help it sell more products. While there’s no reason to suspect Apple’s public commitment to user privacy is insincere, the company is certainly spinning this basic fact of its business model into marketing gold.
Of Courts and Canaries — Last year, Apple published its first transparency report on government information requests, which included a so-called “warrant canary”: “Apple has never received an order under Section 215 of the USA Patriot Act.” The untested idea is that while Apple can be barred from disclosing whether it has received an order to disclose information from the mostly secret Foreign Intelligence Surveillance Court (FISC), it cannot be compelled to lie about having not received such an order. If the language were to vanish from an updated version of a document, the reader would be free to infer that Apple had received at least one such secret court order.
Some hay has been made of this warrant canary going missing in Apple’s transparency report covering the first half of 2014; in fact, Apple dropped it in the latter half of 2013 following changes that Google, Microsoft, Facebook, Apple, and other companies worked out with the U.S. government that enable aggregate reporting on both FISA requests and National Security Letters (NSLs). In its place, Apple now says “To date, Apple has not received any orders for bulk data.” The implication is that while Apple may well have received FISC orders to disclose information, any such orders have not involved the kind of massive handover or continual monitoring implied by things like the NSA’s PRISM program. If this language disappears in the future, readers could infer Apple has been ordered to hand over data en masse.
Nonetheless, if any part of this highly public privacy site and policy proves to be untrue, the damage to Apple’s brand and reputation would be immense — Apple is putting its neck on the line.
Another thing is clear: Apple now considers privacy consciousness a feature of its product ecosystem, just like high-quality displays, good cameras, speedy graphics, and elegant design. Apple doesn’t think the Googles, Facebooks, Microsofts, and Amazons of the world can compete.