How Apple Can Make Their FBI Problems Go Away

An adversary has an unknown exploit, and it could be used on a large scale to attack your platform. Your threat isn’t just your adversary, but also anyone who developed or sold the exploit to the adversary. The entire chain of information from conception to final product could be compromised anywhere along the way, and sold to a nation state on the side, blackmailed or bribed out of someone, or just used maliciously by anyone with knowledge or access. How can Apple make this problem go away?

The easiest technical solution is a boot password. The trusted boot chain has been impressively solid for the past several years, since Apple minimized its attack surface after the 24kpwn exploit in the early days. Apple’s trusted boot chain consists of a multi-stage boot loader, with each phase of boot checking the integrity of the next. Having been stripped down, it is now a shell of the hacker’s smorgasbord it used to be. It’s also a very direct and visible execution path for Apple, and so if there ever is an alleged exploit of it, it will be much easier to audit the code and pin down points of vulnerability.
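The staged verification described above can be sketched in a few lines. This is a deliberately simplified model, assuming a chain where each stage carries a digest of the stage it loads; the real chain verifies RSA signatures against Apple's root CA rather than bare SHA-256 hashes, and the stage names here are illustrative, not Apple's actual image names.

```python
import hashlib

# Hypothetical, simplified model of a staged trusted boot chain.
# Real iOS boot checks RSA signatures against Apple's root CA, not
# bare digests; stage names are illustrative only.
STAGES = ["SecureROM", "LLB", "iBoot", "kernel"]

def make_images():
    """Stand-in firmware images for each boot stage."""
    return {name: f"{name}-image-bytes".encode() for name in STAGES}

# Each stage ships knowing the digest of the stage it is responsible
# for loading next.
MANIFEST = {STAGES[i]: hashlib.sha256(make_images()[STAGES[i + 1]]).hexdigest()
            for i in range(len(STAGES) - 1)}

def boot(images):
    """Walk the chain; halt the moment any stage fails verification."""
    for i, stage in enumerate(STAGES[:-1]):
        nxt = STAGES[i + 1]
        if hashlib.sha256(images[nxt]).hexdigest() != MANIFEST[stage]:
            return f"halt: {stage} rejected {nxt}"
    return "booted"

images = make_images()
print(boot(images))            # intact chain hands off all the way to the kernel

images["iBoot"] = b"tampered"  # a single modified stage breaks the chain
print(boot(images))            # the stage above it refuses to hand off
```

The point of the structure is that auditing it means auditing a short, linear path: every hand-off is one explicit check, which is why an alleged exploit of it would be comparatively easy to pin down.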

Apple’s trusted boot process sits in front of the operating system, and can keep the OS dark until proper authentication has been completed. It can also prevent ram disks (containing firmware updates, backdoors, or forensics tools) from booting until the device has been authenticated by the user. If the user were given an optional setting to add a boot password to their device, they could set this boot password to be longer and more complex than their day-to-day screen password. Without first unlocking the boot loader with this password, the device would not be able to boot any firmware – even firmware signed by Apple (in the event of a successful All Writs Act order some day).

Something like this could be accomplished by encrypting part of the boot chain. Perhaps encrypt the device tree, or even part of iBoot itself, and keep the key in the Secure Enclave, encrypted with a key derived from the boot password XORed with the hardware UID, using a PBKDF2 iteration count tuned to take around 80ms, as Apple already does for passcode keys. Entangling the UID means the derivation can only be attempted on the device itself, making brute forcing impractical and preventing direct manipulation of the boot sequence. Ideally, an extra phase between the low-level boot loader and iBoot would prompt for the boot password, decrypt the boot key, decrypt iBoot, then check its signature, load it, and execute it.
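The derivation step can be sketched as follows. This is a rough model under stated assumptions: the real Secure Enclave entangles the passcode with the fused hardware UID inside its AES engine, which software can never read out; here the UID is modeled as a per-device random secret, and the iteration count is calibrated empirically so one guess costs roughly 80ms on the machine running the sketch.

```python
import hashlib
import os
import time

# Hypothetical sketch of the derivation described above. The real UID
# is fused into the Secure Enclave's AES engine and is not readable by
# software; os.urandom here merely stands in for that per-device secret.
DEVICE_UID = os.urandom(32)
SALT = os.urandom(16)

def calibrate_iterations(target_ms=80):
    """Find a PBKDF2 iteration count costing ~target_ms per guess here."""
    iters = 10_000
    while True:
        t0 = time.perf_counter()
        hashlib.pbkdf2_hmac("sha256", b"timing-probe", SALT, iters)
        if (time.perf_counter() - t0) * 1000 >= target_ms:
            return iters
        iters *= 2

def derive_boot_key(password: bytes, iterations: int) -> bytes:
    """Entangle the password with the device UID, then stretch it."""
    entangled = bytes(p ^ u for p, u in
                      zip(hashlib.sha256(password).digest(), DEVICE_UID))
    return hashlib.pbkdf2_hmac("sha256", entangled, SALT, iterations)

iters = calibrate_iterations()
key = derive_boot_key(b"correct horse battery staple", iters)
print(len(key), "byte key after", iters, "iterations")
```

Because the UID never leaves the device, an attacker cannot extract the encrypted boot key and grind through passwords on a GPU farm; every guess must run through the device's own 80ms derivation.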

Of course, the question is what this costs the user experience. Here, it is virtually nothing. The user is already out of luck if they forget their device passcode; they have to restore the device. The same would be true for a boot password. Users who can navigate the menus to set it up must have at least half a brain to use it in the first place. Should the user forget their boot password, they could be given the option of wiping the device and restoring it to an unlocked state (done by simply wiping the key from the Secure Enclave, as well as from effaceable storage on the device). There would be no bricking of devices here, only a wipe and restore if you forget your boot password.
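The reason wiping works instantly is crypto-erase: the encrypted data is never touched, only the key is destroyed. A minimal sketch of the idea, using a SHA-256 counter-mode keystream purely for illustration (the device's AES engine and effaceable storage do the real work):

```python
import hashlib
import os

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """XOR data against a SHA-256 counter-mode keystream (sketch only;
    real devices use hardware AES, not this construction)."""
    out = bytearray()
    for block in range((len(data) + 31) // 32):
        pad = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        chunk = data[block * 32:(block + 1) * 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

key = os.urandom(32)                  # lives in "effaceable storage"
secret = b"user data sitting on flash"
ciphertext = keystream_xor(key, secret)

# Key present: the same operation decrypts.
assert keystream_xor(key, ciphertext) == secret

# "Wipe": destroy the 32-byte key, not the (much larger) data. Without
# it, the ciphertext is unrecoverable, which is why a forgotten boot
# password costs a restore, never a bricked phone.
key = None
```

Erasing 32 bytes from the Secure Enclave and effaceable storage takes a fraction of a second, while the gigabytes of ciphertext on flash become permanently meaningless.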

Unlike the device’s operating system, which has proven vulnerable to dozens of attacks every release cycle, the device’s boot loader has such a stripped-down code base that its attack surface is the head of a pin by comparison. This would essentially render the FBI’s new forensics toy obsolete – no matter what exploit they’re using – because the operating system itself would never even be loaded. With the OS dark, there’s no way to attack the firmware, or even the Secure Enclave, without first exploiting this trusted boot chain. Did I mention it hasn’t been successfully compromised in years?

Apple could make their FBI problems go away without even needing to know what the FBI is doing to compromise iOS. Cramming all that extra code into the boot loader won’t be easy, so don’t expect to see something like this tomorrow. It should definitely be on their list of important things to do, though.

Apple has always been highly protective of the user experience, but iOS has been out for almost ten years now; we’re no longer school children with it. With adversaries that include nation state actors, the balance between a simple, pretty user interface and protecting user data must shift slightly toward the latter, especially on devices that carry such a large footprint of forensic artifacts.