The Good News about the CIA Targeting Apple
On 10 March 2015, The Intercept revealed how the U.S. Central Intelligence Agency targets Apple hardware and software. Based on top-secret documents exposed by Edward Snowden, the article implies that Apple’s Xcode development environment, iOS encryption keys, and Software Update may all have been compromised by the CIA in the name of broad surveillance. Fortunately, the truth is more nuanced.
While I have no doubt the CIA (and every other intelligence agency and criminal organization in the world) targets Apple along with all the other major technology companies, that doesn’t mean they necessarily succeed, or that any successes last long. It’s also important to note that most of the attacks listed in the article are better suited for targeting individuals, not the mass spying that has so dominated the headlines.
Apple has been a particular target since the company started taking privacy and security both seriously and personally — even beginning to architect parts of its products and services to resist government-level attacks. But Apple isn’t perfect. The company still struggles to find the right balance between security, privacy, design, and usability in the face of advanced criminal- and government-level attacks.
The Intercept’s piece was mostly a condemnation of government tactics, but failed to address what matters to Apple’s customers — whether Apple’s products remain secure and safe to use. Overall, the picture is quite encouraging, but the article highlights the complexities of modern security, privacy, and intelligence gathering. We are still in the early days of what is likely to be a generational issue as society continues to adjust to the digital age.
Because That’s Where the Intelligence Is — As a taxpayer, if the CIA and every other U.S. intelligence agency doesn’t target Apple products, I want a refund.
Our instinct is to express outrage at U.S. intelligence agencies targeting the products of U.S. companies, but this is far from the first time it has happened, is far from the last time it will happen, and is absolutely essential for those agencies to do their jobs. As the entire world relies completely on technology for all forms of communication, tapping into that technology is critical for intelligence gathering.
As much as it might pain me to live in a world that requires spies, that’s the reality. When U.S. companies dominate a tech sector, it’s only natural that intelligence agencies will target their products. We in the security industry have long suspected that agencies have planted employees in manufacturers and supply chains or engaged in direct operations, in most cases targeting products destined for specific countries. Over the past two years, those rumors and jokes at the bar at security conferences have migrated to the headlines.
While The Intercept’s article focuses on Apple, it also mentions intelligence agency attempts to target Microsoft and the chip manufacturers who make the Trusted Platform Module (not a Microsoft-specific product) used to help power encryption in various platforms. This is on top of previous leaks indicating potential government attacks that have compromised all sorts of encryption hardware, software, and services.
Put bluntly, intelligence agencies must target U.S. companies since those companies’ products — whether we’re talking the iPhone or Windows — are used by all their international targets.
I actually find The Intercept’s article highly encouraging. The fact that government researchers needed to expend such intense effort to access devices and services shows that the security technologies Apple and others provide for us are effective (if not perfect) and the technology companies aren’t simply rolling over for the intelligence community, which would be far more concerning.
Hard Targets — Much has happened in the security world since 2012, when the documents revealed by Edward Snowden were created, including major changes to the iOS security model. Let’s look at the main products mentioned in the documents, and what the risks are.
Keep in mind, I’m not dismissing these risks. But by every reading they seem targeted, temporary, and self-limiting.
Xcode is the tool all Mac and iOS developers use to create applications. The government researchers claimed that they created a cracked version of Xcode that could embed attack code into any apps created using that version, without the developer knowing about it. If someone were able to compromise the canonical version of Xcode distributed by Apple, it could affect every application, anywhere. It’s the sort of thing that keeps security-conscious developers up at night.
Compromising a developer tool chain, especially a compiler, is as bad as it gets. Imagine the capability to insert malicious code into major, popular applications without the developer knowing about it. But believe it or not, there is an upside here. Based on the researchers’ presentations, they don’t have access to the canonical version of Xcode itself, but instead need to target a particular developer and swap out his copy of Xcode on his particular computer. That’s the sort of attack I’d rather see, since it means intelligence agencies can target developers in foreign countries on an individual basis (to get to the users of a particular developer’s apps), and not compromise the entire ecosystem of Apple development.
Even if the CIA were able to get to the canonical version of Xcode, every subsequent Xcode update could eliminate the malicious version (depending on how often the developer updates), unless the Software Update mechanism were also compromised.
In fact, software update mechanisms are another high-value target, and one the documents indicate government researchers have worked on. But as with Xcode, unless Apple’s entire Software Update infrastructure were compromised, the attack is limited to specific targets.
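For a developer worried about exactly this scenario, one practical spot check is to confirm that the local copy of Xcode still carries a valid signature anchored to Apple’s root certificate. Here is a minimal sketch using the macOS Security framework; it assumes Xcode lives at the usual /Applications/Xcode.app path, and it is only an illustration, not a guarantee against every possible compromise.

```swift
import Foundation
import Security

// Sketch: check that the local Xcode bundle is still signed by Apple.
// Assumes Xcode is installed at /Applications/Xcode.app; adjust as needed.
let xcodeURL = URL(fileURLWithPath: "/Applications/Xcode.app") as CFURL

var staticCode: SecStaticCode?
guard SecStaticCodeCreateWithPath(xcodeURL, [], &staticCode) == errSecSuccess,
      let code = staticCode else {
    fatalError("Could not load the Xcode bundle for inspection")
}

// "anchor apple" requires the signing chain to end at Apple's own root,
// which a swapped-in, re-signed copy of Xcode would not satisfy.
var requirement: SecRequirement?
guard SecRequirementCreateWithString("anchor apple" as CFString, [], &requirement) == errSecSuccess else {
    fatalError("Could not build the code-signing requirement")
}

let status = SecStaticCodeCheckValidity(code, [], requirement)
print(status == errSecSuccess ? "Signature intact and anchored to Apple"
                              : "Signature check failed (status \(status))")
```

A similar check can be run from Terminal with codesign --verify --deep --strict /Applications/Xcode.app. It wouldn’t catch every conceivable attack, but a re-signed or tampered copy of Xcode would fail the requirement.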
iOS encryption is so effective that law enforcement agencies want the government to compel Apple to provide a back door. It’s no wonder intelligence agencies want to crack it. According to The Intercept, researchers targeted the Group ID (GID) key shared by all Apple devices with the same processor. While this key could help an attacker install malicious software on the device, it can’t be used to recover data on the device if the user sets a good passcode.
There is some nuance here, in that a malicious system software update could potentially expose all the data on a device under some circumstances (the user doesn’t know it was installed, and keeps using the device). Cracking the GID key can’t, for example, allow law enforcement to extract data without the device being unlocked and accessed. If that were true, Apple (as holder of the GID key) could access any user’s data for any law enforcement agency, which the company has explicitly and vociferously claimed is impossible.
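To make that reasoning concrete, here is a toy sketch — not Apple’s actual data-protection design — of why holding a hardware key alone isn’t enough: the key guarding user data is derived from both a device-embedded secret and the user’s passcode, so an attacker with only the hardware side can’t reconstruct it. Everything in the example (the HKDF construction, the names) is illustrative; Apple has described its real scheme as entangling the passcode with a device-unique UID and deliberately slowing the derivation to make guessing expensive.

```swift
import Foundation
import CryptoKit

// Toy illustration only, not Apple's actual data-protection design.
// The point it models: the key guarding user data depends on BOTH a
// device-embedded secret and the user's passcode, so a shared hardware
// key (like the GID key) by itself cannot reproduce it.

// Hypothetical stand-in for a secret fused into the device at manufacture.
let deviceSecret = SymmetricKey(size: .bits256)

func protectionKey(passcode: String, deviceSecret: SymmetricKey) -> SymmetricKey {
    // Entangle the passcode with the device secret. A real design would also
    // use a slow, iterated derivation to make brute-force guessing expensive.
    HKDF<SHA256>.deriveKey(inputKeyMaterial: deviceSecret,
                           salt: Data(passcode.utf8),
                           info: Data("illustrative-class-key".utf8),
                           outputByteCount: 32)
}

let withPasscode = protectionKey(passcode: "correct horse battery staple", deviceSecret: deviceSecret)
let withoutPasscode = protectionKey(passcode: "", deviceSecret: deviceSecret)

// Even with the device secret in hand, the missing passcode yields a different key.
let same = withPasscode.withUnsafeBytes { Data($0) } == withoutPasscode.withUnsafeBytes { Data($0) }
print(same) // false
```

In Apple’s published security descriptions, the device-unique UID plays roughly the role of the device secret here; the shared GID key from the documents sits even further from per-user data.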
In each case, the attacks are serious, but they also show just how hard it is to break into Apple devices, especially iOS devices. None of the attacks seem well suited for mass surveillance, since that would increase the potential for exposure. As the researchers themselves stated:
The Intelligence Community (IC) is highly dependent on a very small number of security flaws, many of which are public, which Apple eventually patches.
In short, the intelligence community wants to keep their attacks secret so they can be used in highly targeted ways, against specific people. I can live with that.
For Apple, It’s Personal — Since those 2012 presentations, Apple has taken even stronger stances to improve security and privacy. In a Macworld article last year, I mostly attributed this to economic reasons, but in ongoing background conversations with people at Apple, it has become clear they take privacy personally.
As an industry analyst, I tend to think of corporations as being driven purely by the bottom line, but that isn’t always the case. Even eliminating any altruistic motivations, Apple is a company where the employees’ entire lives revolve around both the creation and the use of their products and services.
I’m pretty sure the thought of being spied upon gives Apple executives and employees the same itchy feeling that the rest of us feel. Worse, I think they feel that they will have failed technically if their products turn out to be easily hacked — there’s little that developers hate more than being shown how their code could be subverted. No one wants to see the sort of condemnations Apple faced when a simple programming mistake led to a serious flaw in SSL.
This shows through in Apple’s product decisions. iCloud Keychain can be configured to be essentially NSA-proof (see “How to Protect Your iCloud Keychain from the NSA,” 1 March 2014). Apple even physically destroys the hardware smart cards needed to access key management appliances on the off chance that an employee is actually serving multiple masters.
Both FaceTime and Messages support end-to-end encryption. Apple can technically manipulate and change that process, but not easily since it would require what appears to be a major architectural update. And the company’s stance on iOS encryption is clear.
Think about it from Apple’s position. For reasons beyond its control, Apple is now the biggest technology target in the world, from criminal organizations, foreign intelligence agencies, and even its own government. And what is being targeted are the tools and services that all Apple employees rely on every moment of every day. It’s hard to find greater motivation to double down on security and privacy, and the fact that even CIA researchers struggle to maintain offensive capabilities against Apple products shows that these security investments pay off.
That’s strong motivation, especially since it’s the job of spies to spy, even when it means finding flaws in products made at home, used abroad. Apple security isn’t perfect — new flaws will continue to be revealed, likely more often than any of us would like — but now that Apple has upped its security game to block even the most capable intelligence agencies in the world, the main beneficiary is the average user.
Thanks Rich. As I've always said, you're my go-to guy when it comes to a non-hysterical, well-explained, and well-reasoned discussion of these kinds of security threats and issues.
Thanks Shawn, means a lot.
Great article, Rich! I completely agree with Shawn's view. Very balanced perspective on a very complex and scary subject. Keep it coming.
This makes me wonder how secure the Caching Server in OS X Server is. It provides caching of Apple's software updates. Perhaps that can be compromised and provide 'bad' updates. Clients do not set any preferences to use that server -- it happens automagically.
An interesting question - OS X Server might thus be a target in its own right.
The latest episode of the Risky Business security podcast (risky.biz) discusses this towards the beginning, as part of the week's news. Both host and guest thought the media reports were hyperbolic and said that if you looked at the report it seems clear that the Feds were primarily looking at ways to crack phones they'd taken physical possession of. They didn't think much of the story....
Excellent explanation of the How & What angles, and I'm in complete agreement with your attitudes toward Why.
I think the most probable success for attackers would be with third (fourth?) party frameworks, such as ones that serve ads, or provide error reporting to the developer. Many of these are from small companies who may or may not have a good grasp on basic security in the first place, and they provide code to many developers. Compromise one of those, and you've easily caused every app that uses it to tell the bad guys everything that can be grabbed by the app. Many of those apps will be given access to Contacts, Calendar, location, microphone, photos, social accounts, because that's why you installed them in the first place.
It would be good for developers to disclose which third party code they use, if any. It would make it easier for users to avoid some potential holes, and make it easier for researchers to vet individual frameworks.
I grew up in a country with a "strongman rule" so this is old news to me. The "new country" is no different than the old one.
I have noticed long delays in iOS where the screen does not accept input, and have worried that one of my apps is doing bad things in the background.
Luckily, it's far, far more likely that it's just bad programming. Unless, of course, you're an international man of mystery, wanted on seven continents. In that case, they are out to get you. :-)
Hi Rich
It's a shame that you feel that way about surveillance. Why don't you give me your email address and password and let me have a look at your correspondence? I guess you would like that. Same for your bank passwords. I'll just have a look around. Think twice about surrendering your privacy.
While it’s good news that the CIA has had limited success in targeting Apple’s platforms, that does not excuse irresponsible behavior. Breaking into other people’s computers to steal their information is a serious felony in the US and pretty much every other industrialized nation. Why is it OK for the CIA to do this? It undermines trust, which is essential to democratic self-government and global markets. To seek an unfair advantage over your friends and trading partners is not OK. It’s sociopathic and corrupt regardless of what you imagine others might be doing.
The NSA should be helping us to protect our information, not waging an arms race to spy on everyone.
I'm glad Apple has come down on the side of protecting their customers. Bravo!
This is a very sane and helpful article. While hysteria may generate clicks, it doesn't make me want to regularly read what's on a website. This does.
"In short, the intelligence community wants to keep their attacks secret so they can be used in highly targeted ways, against specific people. I can live with that."
So can I. However, the problem today is that EVERYONE is being targeted by the secret, uncontrolled collection and retention of massive amounts of data.