In a recent announcement, Apple stated that they no longer unlock iOS 8 devices for law enforcement:
“On devices running iOS 8, your personal data such as photos, messages (including attachments), email, contacts, call history, iTunes content, notes, and reminders is placed under the protection of your passcode. Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data. So it’s not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8.”
This is a significantly pro-privacy (and courageous) posture Apple is taking with their devices, and while it comes about seven years late, it is more than welcome. In fact, I am very impressed with Apple’s latest efforts to beef up security all around, including iOS 8 and iCloud’s new two-factor authentication. I believe Tim Cook to be genuine in his commitment to user privacy; perhaps I’m one of the few who can see just how gutsy this move with iOS 8 is.
It’s important to take a minute, however, to note that this does not mean the police can’t get to your data. What Apple has done here is create for themselves plausible deniability in what they will do for law enforcement. If we take this statement at face value, what has likely happened in iOS 8 is that photos, messages, and other sensitive data, which were previously encrypted only with hardware-based keys, are now being encrypted with keys derived from a PIN or passcode. No doubt this improves security for everyone by marrying encryption to the PIN (something they ought to have been doing all along). While it’s technically possible to brute force a PIN code, that doesn’t make it technically feasible, and that distinction lets Apple off the hook in terms of legal obligation. Add a complex passcode into the mix, and it gets even uglier: an attacker is left choosing among a number of dictionary-style attacks to get into your encrypted data. By redesigning the file system in this fashion (if this is the case), Apple has afforded themselves the ability to say, “the phone’s data is encrypted with a PIN or passphrase, and so we’re not legally required to hack it for you guys, so go pound sand”. I am quite impressed, Mr. Cook! That took courage… but it does not mean that your data is beyond law enforcement’s reach.
In a recent blog post, I outlined a number of measures Apple took with iOS 8 to prevent many forensic artifacts from being dumped off the device by existing commercial forensics tools. The services those tools talked to completely bypassed the user’s backup encryption password, affording the consumer virtually no protection from the many law enforcement forensics tools that took advantage of them. Apple closed off many of these services in iOS 8. This was a great start to better securing the operating system, but not everything has been completely protected.
In addition to what’s been fixed, I also outlined some things that haven’t yet been. What’s left are services that iTunes (and Xcode) talk to in order to exchange information with third party applications, or to access your media folder. Apple wants you to be able to access your photos and other information from your desktop while the phone is locked – for ease of use. Unfortunately, this also opens up the capability for law enforcement to use the same mechanism to dump that data.
Existing commercial forensics tools can still acquire these artifacts from your device, even when it’s running iOS 8. I have tested this with my own private forensics tools as well, and confirmed it: I dumped all of my third party application data (including caches, databases, screenshots, etc.), as well as my camera reel and other media… all within a few minutes, and all from my locked iPhone running the iOS 8 GM.
There is one big caveat, but it’s not a big problem for law enforcement. This technique requires access to a trusted pairing record on a desktop / laptop machine that is paired with your phone, and as of iOS 8 requires physical access to the phone. What does this mean? It means that if you’re arrested, the police will seize both your iPhone and all desktop / laptop machines you own, and use the pairing records on the desktop to dump and access all of the above data on your iPhone. This can also be done at an airport, if you are detained.
How does it work? While your photos and messages might indeed now be encrypted with a key derived from your PIN, the pairing records stored on your desktop have a “backup copy” of your keybag keys (the escrow bag), which can be used to unlock the encryption on your phone – without a PIN. Again, this was added so that iTunes could talk to your phone while it is still locked.
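To make the mechanism concrete: a pairing record is just a property list stored on the paired computer, and one of its fields holds the escrow bag. The sketch below builds and inspects a hypothetical record in Python; the field names follow publicly documented pairing records, but the values here are placeholders, not real keys:

```python
import plistlib

# Hypothetical pairing record, modeled on the plists iTunes stores on the
# host (e.g. /var/db/lockdown/<UDID>.plist on OS X). Values are placeholders.
record_bytes = plistlib.dumps({
    "HostID": "EXAMPLE-HOST-ID",   # identifies the paired computer
    "EscrowBag": b"\x00" * 16,     # wrapped copy of the device's keybag keys
})

record = plistlib.loads(record_bytes)

# Possession of this file is effectively possession of access: the escrow
# bag lets a paired host unlock the device's keybag without the user's PIN.
print(sorted(record))  # → ['EscrowBag', 'HostID']
```

The point of the illustration is that the escrow bag travels with the pairing record on the desktop, which is exactly why seizing the desktop is as good as knowing the PIN.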
Fortunately, there are some precautions you can take to ensure your privacy. One small trick is to shut down your iPhone whenever you go through airport security or customs. Why? Because Apple has included a kill switch that prevents your pairing records from being able to unlock your iPhone once it’s been shut down: the pairing record vulnerability only works if you’ve unlocked your phone since it was last rebooted. Secondly, make sure you’re using strong encryption on your desktop and laptop machines, and make sure those machines are all shut down when not in use… especially when going through airport security. There are a number of forensics tools capable of dumping the memory (and therefore the encryption keys) of your encrypted disk if you’ve left your computer asleep or in hibernate mode. Shut it down.
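A related precaution follows from the pairing-record mechanism itself: delete the stored records from machines you travel with, so a seized computer can’t be used against the phone until a fresh pairing (which requires the phone to be unlocked) is approved. On OS X the records are assumed to live in /var/db/lockdown; the sketch below simulates the store in a temporary directory rather than touching the real path:

```shell
# Simulate the pairing-record store; the real path on OS X is assumed to be
# /var/db/lockdown, with one <UDID>.plist per device that has trusted this host.
store=$(mktemp -d)
touch "$store/0000EXAMPLE.plist"   # stand-in for a real pairing record

ls "$store"                        # the record that could unlock the phone

# The precaution: wipe stored records so a seized machine is useless until
# the phone is unlocked again to approve a new pairing.
rm -f "$store"/*.plist
ls "$store"                        # no output: no records remain
```

Removing the real records does mean re-pairing (and re-trusting) the next time you sync, which is a small price for closing off this avenue.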
While setting a backup password is critical to protecting the rest of your private data on an iPhone, it won’t help you here, because none of these interfaces honor the consumer’s backup password. Your data is not encrypted when dumped from these services. If you don’t lock your device with a PIN, of course, all of this data is at risk all of the time, so be sure to use a PIN too.
Apple could stand to greatly improve this by either requiring the user to enter their backup password before iTunes can talk to the device while it’s locked (and encrypting all of this data with that password), or by simply offering the user the option (via iTunes) to make the iPhone inaccessible altogether while locked. Many users would gladly check that box for improved security.
Apple has done a great job of breaking a number of law enforcement forensics tools and features with the release of iOS 8. Some existing features are still likely to work, however, and your third party application data and media folder are still potentially at risk from anyone with access to these commercial tools, or someone with the know-how to use open-source tools such as libimobiledevice.
On a philosophical note, some seem genuinely upset about Apple’s latest decision, arguing that law enforcement is “entitled” to your data in order to fight crime. The other side of the coin is this: should manufacturers be required to weaken the strength of their encryption (and the security of their products) just to make law enforcement forensics possible? Wouldn’t that amount to engineering back doors into all products? If you still feel this way, consider also that by improving the security of their products, Apple has improved it for everyone – CEOs, the President (who’s been seen using an iPad to receive daily briefings), congressmen, judges, our own military, and many others. If you’re going to weaken security to make forensics possible, you’re weakening it for everyone, opening the door for foreign governments and cyber criminals to attack all of us. For the sake of privacy and overall security, the only logical solution is to make products as secure as possible, and let good detective work do the crime solving, rather than an easy button.