Apple has long enjoyed a security architecture that rests, in part, on the entanglement of its encryption with a device's physical hardware. This pairing has proven highly effective at thwarting a number of different attacks, enabling mobile payments, secure encryption, and a host of other secure services on an iPhone. One capability iOS lacks, however, is a way for third party developers to validate the hardware a user is on, which prevents third party applications from taking advantage of this mechanism. As a result, APIs can be easily spoofed, and sessions and services are often susceptible to a number of different forms of abuse. Hardware validation can be particularly important when dealing with crowd-sourced data and APIs, as was the case a couple of years ago when a group of students hacked Waze's traffic intelligence. These kinds of Sybil attacks allow thousands of phantom users to be created from a single instance of an application, or even allow an API to be spoofed altogether without any connection to the hardware. Other attacks, such as MiTM, also threaten applications running under iOS; for example, stealing session keys or OAuth tokens to access a user's account from a different device or API. What can Apple do to thwart these attacks? Hardware entanglement through the Secure Enclave.
The Secure Enclave (SEP) performs a number of functions at the core of hardware entanglement, such as the passcode entanglement used by Apple's encryption scheme. The SEP has direct hooks into the encryption routines embedded in silicon, and prevents a number of cryptanalytic attacks by binding encryption to the hardware; it also has its own encrypted storage. The SEP could perform digital signing on behalf of applications in a similar fashion, and that signing could help authenticate a third party session. So how could this work?
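The essential property of enclave-backed signing is that applications only ever hold an opaque handle; the private key itself never crosses the enclave boundary. Here's a minimal sketch of that contract in Python, using the `cryptography` package as a stand-in for real SEP hardware (the `EnclaveKeyStore` class and handle names are illustrative, not an Apple API):

```python
# Toy model of "signing on behalf of applications": callers receive a
# handle and a public key, but the private key stays inside the enclave.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

class EnclaveKeyStore:
    """Stand-in for SEP-backed keys: apps get handles, never key bytes."""
    def __init__(self):
        self._keys = {}

    def create_key(self, handle):
        self._keys[handle] = Ed25519PrivateKey.generate()
        return self._keys[handle].public_key()   # public half may leave

    def sign(self, handle, message):
        return self._keys[handle].sign(message)  # private half never does

store = EnclaveKeyStore()
public_key = store.create_key("session-identity")
signature = store.sign("session-identity", b"challenge from server")
public_key.verify(signature, b"challenge from server")  # raises if invalid
```

Anyone holding the public key can verify a signature, but producing one requires asking the enclave, which is exactly what ties the operation to a specific device.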
Here’s one example: a public/private key pair identity could be generated on the device and embedded in the SEP. This key pair would be signed by an Apple key and would represent the device’s identity (it, too, could be entangled with the device’s UID to protect the private key). When establishing a new session, the device’s public key is sent to the server (for example, Waze’s servers). The public key’s signature is verified against an Apple public key, which authenticates the device as a genuine Apple device. A challenge consisting of a session key, a nonce, and a timestamp is encrypted with the device’s public key and sent to the device over whatever transport encryption is in use. The third party application passes the encrypted challenge to the Secure Enclave through an API; the Enclave decrypts it and returns the decrypted content to the application. The application then uses the session key to establish a session with the server, signs the challenge containing the nonce and timestamp, and returns it to the server. The server verifies the signature and the session can commence. Additional challenges can be issued at any time; the initial session key can be trusted, however, because only the Secure Enclave on the owner’s device could have decrypted it (and authenticated the challenge).
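The handshake above can be sketched end to end. This is a simulation under stated assumptions, not an Apple implementation: the `SecureEnclave` class stands in for hardware, RSA is used for both the encrypted challenge (OAEP) and signatures (PSS) purely to keep the sketch short, and the Apple root is an ordinary key pair rather than a real certificate chain.

```python
# Sketch of the hardware-entangled session handshake: Apple certifies a
# device key held in the "enclave"; the server challenges it; only the
# genuine device can decrypt and sign the challenge.
import json, os, time
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa, padding

OAEP = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
PSS = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)

class SecureEnclave:
    """Stand-in for the SEP: key material never leaves this object."""
    def __init__(self):
        self._key = rsa.generate_private_key(public_exponent=65537,
                                             key_size=2048)

    def public_identity(self):
        # Only the public half is ever exported, as PEM bytes.
        return self._key.public_key().public_bytes(
            serialization.Encoding.PEM,
            serialization.PublicFormat.SubjectPublicKeyInfo)

    def decrypt(self, ciphertext):
        return self._key.decrypt(ciphertext, OAEP)

    def sign(self, message):
        return self._key.sign(message, PSS, hashes.SHA256())

# 1. Provisioning: Apple signs the device's public key as its identity.
apple_root = rsa.generate_private_key(public_exponent=65537, key_size=2048)
sep = SecureEnclave()
identity_pem = sep.public_identity()
apple_signature = apple_root.sign(identity_pem, PSS, hashes.SHA256())

# 2. Session setup: the server verifies the identity is Apple-signed
#    (verify() raises InvalidSignature on a forged identity).
apple_root.public_key().verify(apple_signature, identity_pem,
                               PSS, hashes.SHA256())

# 3. The server issues a challenge: session key + nonce + timestamp,
#    encrypted to the device's certified public key.
session_key, nonce = os.urandom(16), os.urandom(16)
challenge = json.dumps({"session_key": session_key.hex(),
                        "nonce": nonce.hex(),
                        "timestamp": int(time.time())}).encode()
device_public = serialization.load_pem_public_key(identity_pem)
encrypted_challenge = device_public.encrypt(challenge, OAEP)

# 4. Only the enclave can decrypt; the app signs the challenge contents.
decrypted = sep.decrypt(encrypted_challenge)
response_signature = sep.sign(decrypted)

# 5. The server verifies the response against the certified device key.
device_public.verify(response_signature, challenge, PSS, hashes.SHA256())
print("session established")
```

A phantom client fails at step 2 (no Apple-signed identity), and a client that merely scraped the API fails at step 4 (it cannot decrypt the session key), which is what makes the Sybil and replay paths expensive.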
Apple already has infrastructure to do something like this within the SEP; it’s used to better protect Touch ID authentication and to manage enrollments for third party apps. That key material is not signed by Apple, however, and so there doesn’t appear to be any way to authenticate the hardware, which is at the core of how a Sybil attack could be thwarted.
If Apple were concerned about device tracking, they could further abstract this by generating new key pairs whenever advertising IDs are reset, for example, although the chain of trust would have to be tweaked to accommodate this.
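The rotation idea is small enough to sketch: on an advertising-ID reset, the enclave mints a fresh key pair and Apple re-certifies it, so servers see a new, unlinkable identity that still chains back to Apple. Again a hedged toy using the `cryptography` package; `certify()` and `pem()` are illustrative helpers, and a bare signature stands in for a real certificate:

```python
# Key rotation sketch: each identity verifies against Apple's key, but
# nothing links the old identity to the new one.
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa, padding

PSS = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)

def pem(private_key):
    """Export the public half of a key pair as PEM bytes."""
    return private_key.public_key().public_bytes(
        serialization.Encoding.PEM,
        serialization.PublicFormat.SubjectPublicKeyInfo)

def certify(apple_root, device_public_pem):
    """Apple 'certifies' a device identity by signing its public key."""
    return apple_root.sign(device_public_pem, PSS, hashes.SHA256())

apple_root = rsa.generate_private_key(public_exponent=65537, key_size=2048)

old_identity = rsa.generate_private_key(public_exponent=65537, key_size=2048)
old_cert = certify(apple_root, pem(old_identity))

# Advertising ID reset: mint a new identity, discard the old one.
new_identity = rsa.generate_private_key(public_exponent=65537, key_size=2048)
new_cert = certify(apple_root, pem(new_identity))

# Either identity verifies against Apple's key; the identities themselves
# share no material, so servers cannot correlate them.
apple_root.public_key().verify(new_cert, pem(new_identity),
                               PSS, hashes.SHA256())
assert pem(old_identity) != pem(new_identity)
```

The tweak to the chain of trust is that Apple's signing service has to be reachable at reset time, since each rotation needs a fresh certification.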
There are certainly a number of other, similar dances Apple could use to establish hardware-entangled sessions. The idea here is to use the Secure Enclave both for digital signing and for PKI when establishing a new session, with device identities authenticated by an Apple signing cert. This would effectively tie a remote session to a device, making the server more resilient against Sybil attacks and certain forms of MiTM attacks.
Only Apple could effectively build this into their hardware architecture. The result, however, would be very beneficial to Apple: a more trusted platform for third party developers to build on, more robust remote sessions, and an overall more secure experience for the user.