XcodeGhost Exploits the Security Economics of Apple’s Ecosystem
On 17 September 2015, researchers at Palo Alto Networks published an analysis of XcodeGhost, a new piece of malware circulating in Apple's ecosystem in China. Originally discovered by Chinese developers, XcodeGhost uses an interesting vector: instead of going after iOS or OS X directly, the attackers targeted developers who downloaded unofficial copies of Apple's Xcode development toolkit. Those developers then unwittingly used modified versions of Xcode that inserted malicious code into apps later released to the App Store.
It’s the kind of attack that might never have made headlines had it not affected dozens of Chinese apps, including some of the most popular in China. It’s important to realize that the attack is largely limited to Chinese developers and Chinese apps; although a few non-Chinese apps were infected (see the list in the linked article), most iOS users elsewhere in the world don’t need to worry. Also, now that the news is out, Apple and developers are clearing infected apps from the App Store.
Reuters dubbed the situation the App Store’s “first major attack.” This is a tad misleading, since the App Store itself wasn’t attacked, but XcodeGhost was clearly successful, and shows, possibly for the first time, attackers effectively exploiting the security economics of the Apple ecosystem.
It’s All About the Money — Security is, in many ways, far less about technology and far more about economics. The days of bored teenagers trying to take down the Internet are long over, and now there is real money to be made in cyberattacks. Whether it’s corporate or international espionage, or attacks on the finances of normal people, it all comes down to the cost of the attack versus the potential gain.
What kind of gain are we talking about? Wired is reporting that a new firm called Zerodium is offering $1 million for a zero-day exploit that can break into a device running the just-released iOS 9. This isn’t like a company paying for security breaches in its own software so they can be fixed; Zerodium is essentially a digital arms dealer peddling breaches to major corporations and government organizations. (Here’s how we imagine the press conference going.)
Apple recognized early on that if it prevented a malware ecosystem from gaining a profitable foothold, the vast majority of its customer base would never experience a serious attack. Microsoft learned this lesson the hard way, and the company still suffers the consequences of underinvesting in security over a decade later. Once an ecosystem evolved around security vulnerabilities in Windows, Microsoft couldn’t completely disrupt the economics, even as Windows became a highly secure platform. As a result, we still see more malware on Windows than Macs, even though modern versions of Windows are more “secure.”
Apple’s efforts started for real with iOS. By making iOS a closed ecosystem, with apps available only through the App Store, and only then after review, Apple gave itself the inherent advantage of centralized control. Apple also completely controls all iOS-compatible hardware, giving the company yet another set of screws to tighten. As a result, iOS is the most secure consumer computing platform available, as proven by ongoing government frustrations that passcode-protected devices can’t be accessed by law enforcement.
In conversations with Apple over the years, it has become clear that the company deliberately focuses on security economics, even over specific technical defenses. Apple doesn’t believe all attacks can be stopped, and certainly not those from governments or well-funded criminal organizations, but if you make the cost of attack higher than the potential gain, you knock out entire categories of bad guys and reduce the impact on users. The App Store, code signing, and sandboxing all work together to raise the cost of attacks.
OS X lags iOS in terms of security, but is a better example of the economic equations at work. Apple doesn't control the ecosystem as tightly, so it developed Gatekeeper to limit the chance that a user will download and install software from untrusted sources (see "Gatekeeper Slams the Door on Mac Malware Epidemics," 16 February 2012). As a result, OS X isn't as locked down as iOS, but as general purpose computers, Macs play a different role than iOS devices, and Gatekeeper's protections are "good enough" for most users. That's why I have argued that Gatekeeper disrupted any chance of a mass malware market.
Other protections, like OS X’s built-in XProtect antivirus checking and even OS X 10.11 El Capitan’s new System Integrity Protection, reinforce Apple’s Gatekeeper approach. XProtect provides a mechanism to block malware that might become sufficiently widespread, without requiring users to install other antivirus software, and System Integrity Protection keeps malware from gaining a deep foothold on a compromised system, strengthening the usefulness of XProtect.
All these technologies are trivial for a well-funded attacker to bypass when targeting an attractive mark, but perfect security isn’t Apple’s objective (yet). Instead, Apple is trying to make the effort needed to create a widespread attack too expensive to be worthwhile.
XcodeGhost Attacks the Economics — That's why XcodeGhost is so interesting. It targets developers who, due to bandwidth limitations in China, seek out unofficial downloads of Apple's Xcode development software. The malware infects every app compiled by those developers, and those apps are then uploaded to the App Store and downloaded by users, potentially infecting millions of devices.
While clever, particularly in the way it took advantage of how the Chinese government restricts access to Apple's servers, XcodeGhost is neither the first time that iOS devices have been attacked through Macs, nor the first time that an attack has gone higher up the development chain to target Xcode. In 2014, the Wirelurker family of malware was discovered to attack iOS devices from Macs via USB, using a variety of techniques to generate malicious iOS apps, infect installed iOS apps, and install third-party apps on non-jailbroken iOS devices through enterprise provisioning. And earlier this year, The Intercept reported on top-secret documents that claimed that the U.S. Central Intelligence Agency had modified Xcode to sneak surveillance backdoors into any app created with the tool. And while the CIA apparently didn't have a plan for how to get developers to use its modified version of Xcode, there have been "watering hole" attacks that targeted iOS developers.
XcodeGhost's approach sidesteps problems that are very hard — and thus very expensive — to solve. Attacking the App Store directly is considered nearly impossible, and sneaking an app that contains malware past Apple's review, while definitely possible, is both chancy and unlikely to trick enough users for long enough to be worthwhile.
By targeting Xcode and playing off its unofficial distribution network, the organization behind XcodeGhost both cut costs by sidestepping most of Apple’s security apparatus and was able to leverage a relatively small number of developer infections to attack millions of users.
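The attack worked only because developers had no easy way to confirm that an unofficial Xcode download matched Apple's original. The basic countermeasure is simple: compare a cryptographic digest of the downloaded file against a known-good value published through a trusted channel (or, on a Mac, check the code signature with `codesign`). Here is a minimal sketch of the digest comparison; the function names and the idea of a vendor-published digest are illustrative, not an actual Apple mechanism:

```python
import hashlib

def sha256_of_file(path, chunk_size=1 << 20):
    """Compute the SHA-256 digest of a file, reading in chunks so even
    a multi-gigabyte Xcode disk image fits in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_download(path, expected_hex):
    """Return True only if the file's digest matches the value published
    by the vendor over a trusted channel."""
    return sha256_of_file(path) == expected_hex.lower()
```

A single flipped bit anywhere in the download changes the digest completely, so a trojanized Xcode would fail this check no matter how carefully the malicious code was hidden.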
However, XcodeGhost’s window of opportunity is small, not least because both Apple and the developers of infected apps want to eliminate these apps from the App Store as quickly as possible. Apple won’t stop there. The company can, with some technical work, force Xcode redownloads from a trusted source for a short-term solution.
Longer term, Apple needs to enhance the security of the entire developer ecosystem to reduce the chances of such attacks, just as the company has addressed the economics of direct malware. For example, Apple could embed digital certificate pinning and better app signing into Xcode, which would reduce the chances that a compromised version of Xcode could be used. These encryption-based technologies can help detect both modified applications and modified communications channels. (Last week, Google detected forged certificates created by Symantec thanks to pinning, which hardcodes specific certificates into a browser or operating system, and detects when unexpected versions are encountered.)
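Conceptually, pinning is just a hardcoded comparison: the client ships with a fingerprint of the certificate (or public key) it expects, and refuses any connection presenting a different one, even if that certificate otherwise chains to a trusted root. A stripped-down sketch of the idea; the constant and function names are made up for illustration, and real pinning happens inside the TLS stack rather than in application code:

```python
import hashlib

# Hypothetical fingerprint of the server's DER-encoded certificate,
# baked into the client at build time (a placeholder, not a real value).
PINNED_FINGERPRINT = "c0ffee00" * 8

def fingerprint(der_bytes):
    """SHA-256 fingerprint of a DER-encoded certificate."""
    return hashlib.sha256(der_bytes).hexdigest()

def certificate_matches_pin(der_bytes, pinned=PINNED_FINGERPRINT):
    """Accept the connection only if the presented certificate matches
    the pin; a forged certificate from any CA fails this check."""
    return fingerprint(der_bytes) == pinned
```

This is why pinning caught the Symantec-issued certificates: they were valid in the ordinary chain-of-trust sense, but their fingerprints didn't match what Chrome had hardcoded for Google's domains.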
Of course, Apple needs to address the bandwidth problem that caused Chinese developers to look for unofficial downloads in the first place. Apple should also be looking to harden OS X further from targeted attacks, since it’s the weakest link in the security chain, and one that can be leveraged at the developer level to pivot to a wider attack.
Speaking of which, iOS developers — particularly those who work on popular apps — need to be aware that they are now significant targets. Even Apple may look less closely at app submissions from long-time developers, so the bad guys are going to do everything they can to get their code into apps that Apple has approved many times, and that already have many users. That’s the sweet spot for attacks these days, at least until bad guys become as good at developing popular apps themselves, start making more money legitimately, turn to the side of good, and are then compromised by their former criminal peers. Stranger things have happened.
In the end, that old dictum of journalism — “follow the money” — is what anyone concerned with security needs to think about as well. No technology can be perfectly secure, but by looking for places where a relatively small effort can be leveraged into a significant attack, security engineers can make attacks ever more expensive and thus limit them to highly specific situations. That has been Apple’s focus for some time now, but the company needs to apply that lens to its entire ecosystem, from the moment code is written to the point where an app is launched by a user.
"It’s important to realize that the attack appears limited to China; Apple users elsewhere in the world don’t need to worry."
There are a number of English-language apps on the list that Fox-IT released, including WinZip - maybe not widely used, but not unknown either - so I'd disagree that non-Chinese users "don't need to worry".
Yeah, I should waffle that more - I think Apple and developers will be shutting down all known infected apps pretty quickly, but it's not quite as complete as implied.
Nothing wrong with practising security conscious selectivity when considering an application to download and use. I have avoided Chinese apps for several years as there is no guarantee they are secure. Same thing with Russian apps and a few others. One can follow current affairs to get a good idea of which app countries of origin to avoid. It has not made it difficult for me to find the right app for the task at hand.
LOL @ this.
"...and sneaking an app that contains malware past Apple’s review, while definitely possible, is both chancy and unlikely to trick enough users for long enough to be worthwhile."
That's exactly what just happened. Wide scale. Nothing chancy. Thanks to Apple not checking that app submissions come from an unmodified Xcode and not carefully inspecting and testing code for malicious behavior.
I guess having the combined GDP of several nations in cash in your bank account isn't enough to hire a few engineers to QC app updates. There goes Apple's security rep.
I'm thoroughly disappointed Apple has been had so easily. This didn't come out of thin air. Any security expert could have warned them about this vector. But no, let any modified version of Xcode upload. And sure, let everything be updated as presto as possible because having a gazillion updates per day and pushing them to users within seconds is the sexy material that makes headlines. Quality be damned.
I miss the old Apple.
I don't think Apple "let" any modified version of Xcode upload. The site from which this ghost version of Xcode came was not an Apple site. Perhaps you're not familiar with torrents, but unauthorized versions of software and other digital content have been around almost as long as the Internet itself. It's China, after all, not Apple that does not permit the bandwidth necessary for developers to download Xcode in an expeditious manner. Not that Apple is without faults of one kind and another, but there's no need to manufacture blame where it does not exist, unless you've got an ax to grind and don't care how you grind it.
As for Apple's QC engineers, this stuff came from a previously unknown vector so they would have had no reason to be looking for it. Yes, it was found—by a company whose business it is to find security problems. Every major business, including Google, Adobe and Microsoft as well as Apple, depends on outside sources to find these problems because no one company, however rich, can do it all. Admittedly Apple is not always as cooperative with these third parties as they might be. But that's another story, not relevant in this case.
That's exactly what we're suggesting with digital certificate pinning and better app signing. I strongly suspect that Apple simply didn't imagine that there would be any reason to get Xcode from a non-official site before this - that's a lack of imagination, for sure, but Apple has for a very long time now assumed that everyone has more bandwidth than is necessarily the case. So it's not surprising this assumption would exist in the security team as well.
It's not just about bandwidth. Apple could have imagined that a "bad guy" somewhere would try to get a dev to use his modified version of Xcode. A nice twist on the old man in the middle.
So just to be clear, it required neither China nor poor bandwidth for this to work. What made it possible was Apple being careless about what kind of binaries they accept into the store.
Gatekeeper is designed to vet downloaded code, no matter where you got it from. Even if you got it from The Torrent From Hell, Gatekeeper can and will tell you if it's been corrupted.
This worked; Gatekeeper does identify the hacked Xcode as, well, damaged and recommends you trash it and get a better one. The developers involved must have turned off Gatekeeper intentionally to actually be able to run the hacked Xcode.app. Gatekeeper won't let you override security on a hacked program. You must turn it off ("From Everywhere") to proceed.
I do not see what that has to do with bandwidth shortages, real or imagined. The two issues (download sites and Gatekeeper) are entirely independent. Yes, that is true even if a (genuine) Mac App Store-originated Xcode.app is redistributed over web sites.
Actually you can override Gatekeeper when you first launch an app. Right-click the app, choose Open, and you're off and running regardless of the source of the application. So no need to turn Gatekeeper off.
The "chancy" comment referred to a new app. Apple looks more closely at new apps, so it's more likely that such a malware-ridden new app would be caught in favor of a malware-ridden existing app. And the rest of that comment merely points out that the problem with sneaking a new malware-ridden app past Apple is that the app has to be downloaded by a lot of people for it to have significant effect - that's a high bar for a new app to hurdle.
I'm sure they do look at new apps more closely. Hence hackers will look for ways to compromise via updates to existing apps.
I think rather than just "look at" new apps or updates, what Apple should have done a long time ago is make sure that only binaries built with their official version of Xcode are let into the store. Any binary built with a modified Xcode should be rejected immediately. Such systems are known and widely used elsewhere.
Why Apple only now is starting to look into this is beyond me. But as a user who relied on Apple to guarantee "a more secure (walled) garden" I am thoroughly disappointed. And their rep among the general public suffered a massive blow thanks to this blunder.
4,000-plus affected apps and climbing. Can someone confirm whether the malicious code added by XcodeGhost also included self-propagation code capable of spreading to clean apps? What a nightmare!
Not much space to explain here, but I just wrote about this: http://brockerhoff.net/blog/2015/09/29/rb-app-checker-lite-and-ghosts/
Basically, I don't think there's a general way to detect this sort of thing just by examining the app binary — not even Apple can do it without running the app.
XcodeGhost just inserts categories on Cocoa classes (perfectly legal and not unusual), doing things like network access that are allowed by the app's entitlements.