iPhone 3GS Hardware Encryption Easy to Circumvent
A mere three days after I published an article touting the enhanced security of the iPhone 3GS – see “iPhone 3GS Offers Enterprise-Class Security for Everyone”, 2009-07-20 – security researcher Jonathan Zdziarski revealed a simple, only moderately technical technique for completely circumventing the iPhone’s passcode lock and encryption. As a result, the iPhone 3GS encryption can no longer be considered a security control for consumers or enterprises until Apple releases a fix.
Although encryption is one of the most fundamental tools available in the security arsenal, it can be difficult to implement properly. In this case, it isn’t that the encryption itself is flawed (although that happens), but that the implementation of the encryption leaves cracks for attackers.
Implementation issues that can undermine encryption include generating keys improperly, protecting them poorly, exchanging them insecurely – and even leaving doors wide open so that the encryption can be sidestepped entirely. Such weaknesses have enabled exploits against WEP (Wired Equivalent Privacy) in Wi-Fi (which also had cryptographic flaws), against early SSL implementations in Web browsers, and against stored passwords in most major operating systems.
It appears that Apple made a fundamental mistake in encrypting the iPhone 3GS. It’s a mistake we’ve seen before in other tools, but one Apple has managed to avoid elsewhere, such as Mac OS X’s FileVault.
A Flawed Implementation — Encryption works by taking data and running it through a mathematical algorithm that scrambles the contents. But unlike sticking it in a blender, you can reconstruct the original data by reversing the process – assuming you have the right key. (In symmetric cryptography, the same key is used to encrypt and decrypt; in asymmetric flavors, like public key encryption, one key encrypts and another related key decrypts.)
The longer and more complex the key, the better protected the data. While different algorithms use different key lengths, the standard encryption tools today usually use 128- or 256-bit keys for symmetric encryption. Since 256 bits of random data is a bit harder to remember than the average lock combination or telephone number, we usually protect the key itself with a password.
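To make that relationship concrete, here is a minimal Python sketch of deriving a full-length symmetric key from a short passcode. The PBKDF2 function is a standard technique for this; the salt, iteration count, and passcode shown are illustrative values, not Apple's actual parameters.

```python
import hashlib

# Derive a 256-bit symmetric key from a short passcode with PBKDF2.
# The salt and iteration count here are illustrative, not Apple's values.
passcode = "1234"
salt = b"per-device-salt"
key = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

assert len(key) * 8 == 256  # full-length key...
```

The catch, of course, is that the derived key is only as unpredictable as the passcode behind it.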
If you use a weak password, the attacker can potentially guess his or her way in and access your data, but that’s not the mistake Apple made. On the iPhone 3GS, your password is simply the passcode to unlock your phone, and the device can be configured to erase the encryption key – making your data inaccessible – if someone tries to brute force their way in.
If you have the iPhone configured properly, as I detailed in my previous article, the attacker gets only 10 tries to guess your passcode before your data is erased from the iPhone. It’s this very feature I considered “enterprise-class” when I wrote the initial article.
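That erase-after-10-failures policy can be modeled in a few lines of Python. This is a toy model to show the logic, not Apple's implementation, and the class and passcode are invented for illustration.

```python
# Toy model of a device that destroys its encryption key after
# 10 failed passcode attempts. Not Apple's implementation.
class LockedDevice:
    MAX_ATTEMPTS = 10

    def __init__(self, passcode):
        self._passcode = passcode
        self._failures = 0
        self.wiped = False  # True once the key has been destroyed

    def try_unlock(self, guess):
        if self.wiped:
            return False
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self.wiped = True  # key erased; data is now unrecoverable
        return False

device = LockedDevice("4821")
for guess in ("%04d" % n for n in range(10)):  # ten wrong guesses
    device.try_unlock(guess)
assert device.wiped  # further guesses are pointless
```

With 10,000 possible four-digit passcodes and only 10 tries, an attacker who has to go through the front door has a 0.1 percent chance of success.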
What Jonathan Zdziarski discovered is that if you can bypass the passcode, you gain complete access to the data. And this is fairly easy to do using the same jailbreaking tools people use to hack and personalize their phones.
Although I don’t know the full technical details, it seems that by jailbreaking the iPhone you can access the part of the iPhone that stores the passcode directly, and turn off its required use; or install a program to allow network access to the iPhone’s storage. Using either technique, you then gain full access to the data on the iPhone.
A Known Problem — This isn’t the first time we’ve seen this kind of encryption mistake. Since users have to work with passwords rather than raw encryption keys, how those passwords are set up and stored can open doors for attackers.
For example, with early versions of Microsoft’s Encrypting File System (EFS) you could use special tools to reset a user’s password if you had physical access to their system. That let an attacker simply log in without a password and access the data.
Microsoft fixed this by using two different passwords that were synchronized by the operating system. One is the normal password for logging in, while the other allows access to the encrypted data.
If you changed your password using the normal method, the two would stay in sync. But if you used a hacking tool to change the login password, it would break the synchronization, preventing access to the encrypted data. Apple’s FileVault works in a similar way.
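A toy model of that two-credential design might look like the sketch below. The names and the XOR-based key wrapping are invented for illustration; real systems like EFS and FileVault use proper key-wrapping schemes, and a real password-to-key derivation would use PBKDF2 rather than a bare hash.

```python
import hashlib

def derive(password: str) -> bytes:
    """Illustrative password-to-secret derivation (real systems use PBKDF2)."""
    return hashlib.sha256(password.encode()).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

data_key = bytes(range(32))                     # key that encrypts the files
wrapped_key = xor(data_key, derive("hunter2"))  # key wrapped by the password
login_hash = derive("hunter2")                  # checked only at login

# Normal password change: the OS unwraps with the old password and
# re-wraps with the new one, keeping the two credentials in sync.
wrapped_key = xor(xor(wrapped_key, derive("hunter2")), derive("new-pass"))
login_hash = derive("new-pass")
assert xor(wrapped_key, derive("new-pass")) == data_key  # still recoverable

# Hacking tool: overwrite the login hash without re-wrapping the key.
login_hash = derive("attacker")
# The attacker can now log in, but cannot unwrap the data key:
assert xor(wrapped_key, derive("attacker")) != data_key
```

The point is that resetting the login credential through the back door buys the attacker a login prompt, not the data.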
While this is speculation, it seems the iPhone 3GS makes a similar mistake. Jailbreaking the iPhone appears to allow access to the memory location that stores either the passcode, or the setting to use the passcode. With this removed, you gain full access to the iPhone.
It also appears that you can jailbreak the iPhone and install a tool like SSH, which you can then access over the network to pull the data off the device. The iPhone doesn’t realize normal access is being circumvented, and automatically decrypts the data without requiring the passcode.
Testing the Hack, and Discovering a New Problem — Just to make sure, I tested the jailbreaking process using a computer that had never been authorized to access the iPhone. (To sync a passcode-protected iPhone with iTunes, you need to enter the passcode in iTunes.) The process worked smoothly – I was never prompted to enter a passcode – and with a little more effort I could have modified my jailbreak package to install and run SSH automatically.
Actually, the process went a little too smoothly, and in the process I discovered a second vulnerability in the iPhone. While minor, I reported this to Apple and will not be releasing more information until it’s patched.
Until Apple resolves these issues, the encryption on the iPhone is little more than a speed bump to anyone with moderate technical skills and physical access to the device. If you lose your iPhone, it’s now even more important to wipe it remotely with MobileMe as soon as possible, since that completely destroys the key and protects your data.
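The reason a remote wipe is effective is that, with encryption in place, destroying the key is as good as destroying the data – the device doesn't need to scrub every flash block. Here is a toy illustration; the repeating-key XOR cipher is stand-in shorthand for brevity (the iPhone itself uses AES), and the data shown is invented.

```python
from secrets import token_bytes

def toy_cipher(data: bytes, key: bytes) -> bytes:
    # Repeating-key XOR: insecure in practice, but enough to show
    # that ciphertext is useless without the key.
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

key = token_bytes(32)
plaintext = b"contacts, mail, photos"
ciphertext = toy_cipher(plaintext, key)
assert toy_cipher(ciphertext, key) == plaintext  # key present: recoverable

key = None  # remote wipe: only the key is erased, not every flash block
# With a random 256-bit key gone, brute-forcing the ciphertext is
# computationally infeasible, so the data is effectively destroyed.
```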
Since this isn’t an unknown implementation mistake, Apple should have a clear roadmap to fix the issue and make the iPhone 3GS a secure device for non-business users and enterprises alike.
Apple had hired a security expert to beef up security on the Mac OS X desktop, and now they're also looking for an iPhone security specialist.
I hope they get somebody who worked at RIM; otherwise the iPhone is going to be the laughingstock of high-security smartphones. It's hard to believe Apple could make such a grave mistake, but on-board encryption is a new thing for them. They'd better beef up the iPhone twice as much as any smartphone out there or no business is going to touch it.
Actually, I'm not so sure it was a mistake so much as an implementation they didn't think through. The encryption key is there to allow easy wiping of the data on the phone, and it does that quite well. What it doesn't do is actually protect the phone's data from a malicious attacker with the phone in hand. While it *should* do this, remote wiping and wiping after failed unlock attempts seem to be all Apple was after.
I am confident that 3.1 or so will have better protections, but keep in mind that the security on the iPhone is already considerably better than on almost any other phone, where bypassing the 'lock' is trivial and there's no easy way to wipe personal data.