The encryption on the iPhone is clearly doing its job. Good encryption doesn’t discriminate between attackers; it simply protects data – that’s its job, and right now it’s frustrating both criminals and law enforcement. The government has recently argued that we must find a “balance” between protecting your privacy and providing law enforcement a method to procure evidence with a warrant. If we don’t, the Department of Justice and the President himself have made it clear that such privacy could easily be legislated out of our products. Some think having a law enforcement backdoor is a good idea. Here, I present an example of what “warrant friendly” security looks like. It already exists. Apple has been using it for some time. It’s integrated into iCloud’s design.
Unlike the desktop backups your iPhone makes, which can be encrypted with a backup password, the backups sent to iCloud are not encrypted this way. They are indeed encrypted, but differently – in a way that allows Apple to provide iCloud data to law enforcement with a subpoena. Apple has advertised iCloud as “encrypted” (which is true) and secure. It still advertises this today, in fact, the same way it has for the past few years:
“Apple takes data security and the privacy of your personal information very seriously. iCloud is built with industry-standard security practices and employs strict policies to protect your data.”
So with all of this security, your iCloud data sure sounds like it should be secure, and also warrant friendly – on the surface, a great “balance between privacy and security”. Then, the unthinkable happened.
In September 2014, an estimated 100 high-profile celebrities – mostly women – had their dignity stripped from them as their private nude photos were stolen from their iCloud accounts and posted on the Internet. Two years later, the individual responsible was finally arrested, after what appears to have been an extensive investigation.
Now consider this: Apple is clearly expert at creating secure devices – the iPhone is so secure that it’s frustrating law enforcement at the federal level. The security of iOS devices has become so strong that it even got the President’s attention this week, prompting him to accuse Americans of “fetishizing their devices”. There’s no question that Apple can create a secure product.
So what’s the difference between iCloud and the iPhone? The iPhone, as the DOJ puts it, is “warrant proof”, whereas the data stored in iCloud is warrant friendly, and was designed with this in mind. Data in iCloud is encrypted and heavily protected by Apple, but the keys are escrowed in such a way that Apple retains complete access to the content, so that it can service law enforcement requests for data.
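To make that design difference concrete, here is a minimal sketch – in Python, with hypothetical names, since Apple’s actual implementation is not public – contrasting a backup encrypted under a key derived from the user’s backup password with one encrypted under a provider-held escrow key:

```python
# Sketch only: illustrates the design difference, not Apple's actual code.
# Requires the third-party "cryptography" package (pip install cryptography).
import os
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_like_desktop_backup(backup_password: str, data: bytes) -> bytes:
    """Key is derived from a secret only the user knows. Whoever steals
    the ciphertext still needs the backup password to read it."""
    salt = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", backup_password.encode(), salt, 200_000)
    nonce = os.urandom(12)
    return salt + nonce + AESGCM(key).encrypt(nonce, data, None)

def encrypt_like_icloud_backup(provider_escrow_key: bytes, data: bytes) -> bytes:
    """Key is held by the provider. The data is 'encrypted', but the
    provider (or anyone who takes over the account) can read it back."""
    nonce = os.urandom(12)
    return nonce + AESGCM(provider_escrow_key).encrypt(nonce, data, None)
```

In the first model, a subpoena (or a break-in) yields only ciphertext without the backup password; in the second, whoever controls the account – Apple or an attacker – can read everything.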
The iCloud’s design for “warrant friendliness” is precisely why the security of the system was also weak enough to allow hackers to break into these women’s accounts and steal all of their most private information: an iCloud backup is not encrypted with the user’s backup password the way a desktop backup is. Had it been, the data would have been unreadable to a criminal hacker without knowledge of this extra password – a password that the customer is accustomed to entering only into the iTunes GUI (and not iCloud), and which would therefore be much harder to phish. It is a password most people store on the keychain, because they don’t even remember what it is, and likely couldn’t supply it in a phishing scam. These victims weren’t logging in to restore a backup, and a phish for your backup password would have been just as obvious as a phish for your credit card numbers.

The same day all of this occurred, a brute force tool named iBrute was also released, which allowed iCloud accounts to be brute forced without triggering any backoff or other security measures on Apple’s servers. This technique, too, would have turned up only junk data had user data been encrypted with their backup password.
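The iBrute episode also exposed a second, simpler failure: the login endpoint enforced no throttling at all. As a rough sketch of the kind of server-side backoff that was missing (hypothetical names and thresholds, not Apple’s code):

```python
# Sketch of server-side login throttling, the countermeasure whose absence
# iBrute exploited. Hypothetical policy; real lockout rules vary by service.
import time

MAX_FREE_ATTEMPTS = 5       # failures allowed before throttling kicks in
failed_attempts: dict = {}  # account name -> consecutive failed logins

def check_login(account: str, password_is_correct: bool) -> bool:
    attempts = failed_attempts.get(account, 0)
    if attempts >= MAX_FREE_ATTEMPTS:
        # Exponential backoff: each further failure doubles the wait,
        # turning a rapid online guessing loop into an impractical one.
        time.sleep(min(2 ** (attempts - MAX_FREE_ATTEMPTS), 3600))
    if password_is_correct:
        failed_attempts.pop(account, None)  # reset the counter on success
        return True
    failed_attempts[account] = attempts + 1
    return False
```

Even with throttling in place, though, the deeper design point stands: a correctly guessed account password should have bought an attacker only ciphertext, not a full backup.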
In other words, the data stored in iCloud is stored in a weaker way so that Apple can service law enforcement requests, and as a direct result, hackers not only could get into the same data, but did. And they did it using a pirated copy of a law enforcement tool – Elcomsoft Phone Breaker.
Photos weren’t the only data stolen from these individuals. @SwiftOnSecurity posted the EXIF information extracted from these files, which showed a significant number of geotags. This is GPS information telling a stalker or rapist exactly where and when these women took these photos, as well as all of the other photos in their camera roll. With this data (which many probably didn’t even realize the phone kept), a physical attacker knows the places these women frequent, where they may live, the hotels they stay at, and so on. The leak of this data to the public put the physical safety of some of these women in jeopardy – and probably still does, perhaps without them even knowing. The iCloud backups also reveal all of their contacts, address books, and even iMessage history. In short, the complete identities of these women – and personal information about everyone they communicate with regularly – were exposed to the public.
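To illustrate how little effort extracting these geotags takes, here is a minimal sketch using Python and the Pillow imaging library (the file name is a placeholder; any geotagged JPEG will do):

```python
# Sketch: reading GPS geotags out of a photo's EXIF metadata.
# Requires the third-party Pillow package (pip install Pillow).
from PIL import Image
from PIL.ExifTags import GPSTAGS

GPSINFO_IFD = 0x8825  # standard EXIF tag for the GPS information block

def read_geotag(path: str):
    exif = Image.open(path).getexif()
    gps = exif.get_ifd(GPSINFO_IFD)
    if not gps:
        return None
    # Map numeric GPS tag IDs to readable names (GPSLatitude, GPSLongitude, ...)
    return {GPSTAGS.get(tag, tag): value for tag, value in gps.items()}

print(read_geotag("photo.jpg"))  # placeholder file name
```

Anyone who downloaded the leaked photos could have run something like this over the entire set in seconds.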
The fact that this door was held open for law enforcement (and still is) continues to pose a significant public safety risk for anyone storing their data in iCloud. Until such time as Apple decides to encrypt iCloud data with keys that the user controls, users will remain at risk.
Apple and Tim Cook expressed a lot of sorrow and regret over this hack, and to their credit have taken steps to improve the security of iCloud. Even with new front-end security mechanisms such as two-factor authentication (2FA), however, forensics tools have evolved and continue to pose a threat to anyone with around $500 (or a pirated copy). Software solutions such as Elcomsoft’s (which are designed as law enforcement tools, not hacking tools) have been updated to support 2FA tokens. Stealing these tokens is as easy as stealing a phone, or even just socially engineering the cellular carrier. If your computer is compromised with malware, these tokens can even be stolen from your desktop, so an attacker doesn’t need to intercept a 2FA token at all. iCloud remains eminently vulnerable to the next big attack – all because it is designed to be “warrant friendly”. Most of the time, users don’t even understand 2FA well enough to turn it on, leaving most people still openly exposed.
Apple can’t “fix” the security problem and still service law enforcement requests. That was their point when they added better encryption to iOS 8: they want to protect us from today’s more sophisticated attackers, but by doing the encryption right, they’ve locked themselves out of being able to help law enforcement. Apple is now being ordered to break all of this security, which would leave your iPhone exposed on the same order of magnitude as iCloud. I suspect that Apple may face legal liability if it actually made iCloud secure; this may be why it isn’t secure – because Apple is certainly interested in creating secure products otherwise.
This is the kind of compromise we can expect should the government force Apple to backdoor the iPhone. The argument that the device could still be secure against criminal hackers has been flat-out disproven by watching this play out in iCloud. Compromising security = compromising privacy. As one Twitter follower put it, privacy and security are interdependent; they’re the same side of the proverbial coin. You can’t balance a one-sided coin. Compromising security for the sake of prosecuting crimes will inevitably create more crime, as it has with iCloud.