I’ve heard a number of people make an argument about Apple’s authentication front-ending the services I described in my paper, including the “file relay” service, which has opened up a discussion about the technical definition of a backdoor. The primary concern I’m hearing, including from Apple, is that the user has to authenticate before gaining access to this service, which one would normally expect to preclude a service from being a backdoor by some (but not all) definitions. This is a valid point, and in fact I acknowledge it thoroughly in my paper. Let me explain, however, why this argument about authentication is more complicated and subtle than it seems.
Most authentication schemes are layered from weakest to strongest, and are also isolated from one another; certain credentials get you into certain systems, but not into others. You may have separate passwords for Twitter, Facebook, or other accounts, and they only interoperate if you’re using a single sign-on mechanism (for example, OAuth) to reuse that same set of credentials on other sites. If one gets stolen, then, only the services associated with those credentials can be accessed. These authentication mechanisms are often protected by even stronger ones. For example, your password might be stored in Apple’s keychain, which is protected with encryption tied directly to your desktop password. Your entire disk might also be encrypted using full disk encryption, which protects the keychain (and all of your other data) with yet another (usually stronger) password. So you end up with a hierarchy of authentication mechanisms, each protected by a stronger mechanism, and sometimes even stronger ones on top of that.

Apple’s authentication scheme for iOS, however, is the opposite of this: the strongest forms of authentication are protected by the weakest, creating a significant security problem in their design. The way Apple has designed the iOS authentication scheme, the weakest forms of authentication have complete control to bypass the stronger forms. This allows services like file relay, which bypasses backup encryption, to be accessed with the weakest authentication mechanisms (a PIN or pair record), when end users are relying on the stronger “backup encryption password” to protect them.
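The inverted hierarchy can be sketched as a toy model. All of the names below are my own illustration (this is not Apple’s actual API or protocol); the graph simply encodes the relationships described above:

```python
# Toy model of the inverted authentication hierarchy described above.
# All names are illustrative -- this is NOT Apple's actual API.
#
# On the OS X-style layering, stronger credentials protect weaker ones
# (full-disk-encryption password -> keychain -> individual passwords).
# On iOS, the ordering is inverted: the weakest credential (the PIN)
# can mint the stronger one (a pair record), and a pair record can
# reach data the backup password was meant to protect (via file relay).

UNLOCKS = {
    "pin":             {"pair_record"},   # a PIN lets anyone create a new pairing
    "pair_record":     {"file_relay"},    # a pairing grants file relay access
    "file_relay":      {"device_data"},   # file relay bypasses backup encryption
    "backup_password": {"device_data"},   # the credential users think guards the data
}

def reachable(credential):
    """Everything an attacker holding `credential` can ultimately access."""
    seen, frontier = set(), {credential}
    while frontier:
        nxt = set()
        for cred in frontier:
            for target in UNLOCKS.get(cred, ()):
                if target not in seen:
                    seen.add(target)
                    nxt.add(target)
        frontier = nxt
    return seen

# The weakest credential reaches the same data as the strongest:
assert "device_data" in reachable("pin")
assert "device_data" in reachable("backup_password")
```

In a properly layered design, the `"pin"` node would reach nothing that `"backup_password"` protects; here, transitively, it reaches everything.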
Consider a pair record: it is a stronger form of authentication than a PIN, as it stores a set of keys on your desktop machine. Stealing a file off of someone’s desktop is somewhat more involved than simply looking over their shoulder to see what PIN they type in (or looking at smudges, or what have you). Yet the weaker form of authentication (the PIN) allows anyone to generate a new pairing with any device. So in order to break through the stronger “pairing” form of authentication, all you have to do is bypass the weaker one.
The backup password is probably the strongest form of authentication you can set on your iPhone, and (when working the way it should) it ensures that all of the data that comes off the phone is encrypted. Where this falls apart is that it can be bypassed (through file relay) by both weaker forms of authentication. You’d have to work hard to steal someone’s backup password; e.g., install a key logger on their computer and wait for them to (ever) type it in, which could take a year or longer. But the subtlety of this “backdoor” service is that you don’t have to. While the user is relying on this strong form of authentication (something they know, and don’t often enter publicly), all you have to do is beat one of the weaker forms: grab the pairing file from their desktop (which can be done a number of ways), or just their PIN, the weakest form of authentication (which can also be stolen a number of different ways). What makes this attack even more dangerous is that it can, in many cases, be done wirelessly, even if the user has turned off WiFi Sync. Worse still, pairing records are persistent and opaque to the user, and only get deleted when you wipe or restore your iPhone. So now you have wireless, persistent acquisition of a target, bypassing the stronger forms of authentication, without the user’s knowledge that a new pairing even exists (or was compromised). In my opinion, that’s a recipe for a “backdoor”, even if it wasn’t intentionally designed as one by Apple.
We’ve been using passwords as a form of authentication for quite some time; the idea of “something you know” is drilled into consumers’ minds far more than the “something you have” notion. Together, both make a good authentication mechanism (i.e., two-factor). If you’re not going to use two-factor authentication, however, you’d expect consumers to understand how a password works much better than how a pair record works. Unfortunately, it’s very easy to steal “something you have”; a number of law enforcement forensics tools already do this, and can be purchased (or pirated) by almost anybody, regardless of whether or not you’re actually law enforcement. Stealing something you know is much more difficult, which is why I suspect Apple added a backup password to begin with. Unfortunately, it doesn’t do what consumers think it does (and may not even do what Apple’s security team thought it did).
In addition to the technical threats, socially engineering your way around the weakest forms of authentication is made easier in this paradigm; it’s easier to get someone to push the “Trust” button when connected to a malicious piece of hardware (such as a hotel alarm clock, or an embedded device someone slips into the lighter adapter of your car) than it is to fool someone into typing in their backup password. So unlike OS X, where layers of strong, password-based encryption protect the weaker forms of authentication (such as a simpler password or OAuth), iOS works backwards: the PIN can bypass the pairing authentication, and the pairing authentication can (thanks to this service) bypass the backup encryption password. This is completely backwards, and because of it the entire hierarchy falls apart with four simple digits or a single file.
Good security practices that include threat modeling typically involve understanding your adversary. Adversaries generally take the “path of least resistance” to your data. Using weaker authentication mechanisms to control stronger ones gives your adversary a much larger attack surface, allowing them to gain control of otherwise more secure services protected by more complex forms of authentication. While iOS’ security has become much better than it used to be, the trade-offs Apple has made for usability have somehow led us to a backwards authentication hierarchy, leaving us with an authentication scheme that can best be described as “it’s complicated”.
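One way to make the “path of least resistance” point concrete is with rough, entirely made-up effort scores for acquiring each credential, per the scenarios described earlier. The numbers are arbitrary illustrations; only their ordering matters:

```python
# Illustrative only: made-up "effort" scores for an adversary acquiring
# each credential, based on the attack scenarios described above.
EFFORT = {
    "backup_password": 10,  # key logger plus waiting, possibly months
    "pair_record":      3,  # lift one file from an already-paired desktop
    "pin":              1,  # shoulder-surf, or read screen smudges
}

# Because the weaker credentials can stand in for the stronger one,
# the adversary's real cost is the *minimum* over all of them:
cost_with_inverted_hierarchy = min(EFFORT.values())

# In a properly layered design, only the strong credential would do:
cost_with_proper_layering = EFFORT["backup_password"]

assert cost_with_inverted_hierarchy < cost_with_proper_layering
```

The point of the sketch: the user’s effective security is not the strength of the backup password they chose, but the strength of the cheapest credential that can substitute for it.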
When I wrote my paper, “Identifying Backdoors, Attack Points, and Surveillance Mechanisms in iOS Devices”, I attempted to deconstruct leaked documents outlining NSA collection of iPhone data. The attacks outlined in the NSA documents specifically made use of the target’s desktop computer in order to effect further access into “38 features” on an iPhone. The goal was to explain how iOS itself could have been abused in the fashion described in the documents. What the actual techniques were, we’ll never know, and I’m sure that the intelligence community has since moved on to much more deeply nested exploits. Having worked with a number of intelligence researchers, however, I do believe the core competency at the time would have been consistent with the attacks I described in my paper, all things being equal. If the Occam’s Razor “path of least resistance” philosophy holds, these undisclosed Apple services were very likely used, at least in part, and I posit a number of reasons why in the paper. Apple would do well to shut down file relay, or to guarantee that backup encryption is no longer bypassed if the service is to hang around. They would do much better, however, to correct the design flaws in their authentication hierarchy to support the more secure paradigm they’ve shown in their most excellent OS X operating system.