When I originally gave my talk, it was to a small room of hackers at a hacker conference with a strong privacy theme. With two hours of content to fit into 45 minutes, I had no time to demo a POC, and I also felt that demonstrating the personal data you can extract from a locked iOS device might be construed as an attempt to embarrass Apple or to be sensationalist. After the talk, I asked a number of people I know who attended whether they felt I had made any accusations or outrageous statements, and they told me no: I presented the information and left the audience to draw its own conclusions. They also mentioned that I was very careful with my wording, so as not to alarm people. The paper itself was published in a reputable forensics journal, where it was peer-reviewed, edited, and accepted as an academic paper. Both my paper and my presentation raised some very important security and privacy concerns, and the last thing I wanted to do was fuel the fire for conspiracy theorists who would interpret my talk as an accusation that Apple is working with the NSA. The fact is, I've never said Apple was conspiring secretly with any government agency; that's what some journalists have concluded, and with no evidence, mind you. Apple might be, sure, but then again they also might not be. What I do know is that a number of laws require compliance with requests for customer data, and that Apple has a very clearly defined public law enforcement process for extracting much the same data off of passcode-locked iPhones as the mechanisms I've discussed do. In this context, what I deem backdoors (which Apple claims are for their own use), attack points, and so on become suspicious, yes, but more importantly abuse-prone: they can be, and have been, used by government agencies to acquire data from devices that they otherwise wouldn't be able to access with forensics software.
Because this deals with our private data, it should all be open to public scrutiny; yet some of these mechanisms had never been disclosed by Apple until after my talk.
I know this not only because these mechanisms can be found in a number of different forensics software packages out there, but because I've also trained and assisted a number of law enforcement agencies in iOS forensics, have worked closely (but selectively) with government and military on certain important cases and projects, and have testified as an expert in court against murderers and other criminals. I've been very forthcoming about my training of government and military since around 2008, and have trained internationally in Canada, the UK, and here in the US. I'm well aware of the tools and techniques people are using in the field, and I'm not even saying there's anything wrong with tapping these services: as an LE agency or forensics software manufacturer, you're going to use every technique available to access evidence on a device. What is wrong, however, is that the public had no idea these services existed on the device at all. That these services exist, and are so poorly protected that they can be used and abused, deserves to be under public scrutiny until it's addressed by Apple. If Apple was not aware that these services were being used in this context, then they are now.
For reference, here are a few common definitions of a backdoor:

- A hidden entrance to a computer system that can be used to bypass security policies (MS definition).
- An undocumented way to get access to a computer system or the data it contains.
- A way of getting into a guarded system without using the required password.
The mechanism I demo in the POC below (file_relay) meets all of these definitions, and my use of the term backdoor has been consistent with them since my original paper and talk. In fact, in my talk I was very careful to refer to file_relay's 44 data sources as "undocumented services," even though they qualify as backdoors by these definitions. The service was finally disclosed by Apple after my presentation, and after Apple had denied putting any backdoors into their products. Other services I've outlined are not necessarily backdoors, but make great attack points due to their sloppy engineering. For example, pcapd was originally intended only for developers, but it has never been moved to the developer disk image, so it can be activated on every iOS device Apple has sold, even when the device is not in developer mode. The lack of access controls on the house_arrest service can likewise be used to obtain privileged, stateful application data off the device, which I'll also demonstrate.
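To make the file_relay interaction concrete, here is a rough sketch of the kind of property-list request a client hands the service once connected: a dictionary naming the data "sources" it wants dumped. The source names below are a sample of those exposed by the open-source libimobiledevice tooling; the length-prefixed framing shown here is the convention used by lockdown-brokered plist services, and the code is an illustration of the message format, not a working extraction tool (it never touches a device).

```python
import plistlib
import struct

# A file_relay client asks for named data "sources"; the service responds by
# streaming back a gzip'd archive of the matching files, regardless of the
# user's backup-encryption setting. Sample source names from libimobiledevice.
SOURCES = ["AppleSupport", "Network", "UserDatabases", "CrashReporter"]

def build_file_relay_request(sources):
    """Serialize the request dictionary as an XML property list, prefixed
    with a big-endian 32-bit payload length, the framing convention used by
    plist-speaking iOS services."""
    payload = plistlib.dumps({"Sources": sources}, fmt=plistlib.FMT_XML)
    return struct.pack(">I", len(payload)) + payload

packet = build_file_relay_request(SOURCES)
print(packet[4:].decode())  # the XML plist body that would go over the wire
```

The point of the sketch is how little is asked of the client: there is no authentication step specific to file_relay, and nothing in the request involves the user's passcode or backup password.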
In spite of my warnings to the media (via email and telephone inquiries) not to pitch this as a conspiracy theory, they have still managed to completely derail the original intention of this research, and so I think a quick proof-of-concept video will help to clear up any misunderstandings about what this technique can and can't do. I've also outlined the threat models that will and won't work for this attack. My goal has been to identify a number of unnecessary, undisclosed services running on the device that can be abused in many attack scenarios to bypass user encryption and acquire personal data from the device. It is NOT my goal to embarrass Apple or to sensationalize these techniques at all; these interfaces have been around for quite a while in iOS, as I outlined in my slides, and there is open source software available to talk to them (see: libimobiledevice.org). The reason I gave the talk, however, is that they seem to have run amok lately, and the number of services providing personal data (now up to around 44) has become downright personal. Both Apple and the general public should be completely aware of the security issues this presents.
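For the curious, the open-source stack mentioned above reaches the phone through the host-side usbmuxd daemon, whose plist protocol is publicly documented. A minimal sketch of the first message a client sends, assuming the documented framing (16-byte little-endian header; protocol version 1; message type 8 for plist payloads); the client name strings are arbitrary placeholders:

```python
import plistlib
import struct

def build_usbmux_listdevices(tag=1):
    """Build a usbmuxd 'ListDevices' request: a 16-byte little-endian header
    (total length including the header, protocol version 1, message type 8 =
    plist, request tag) followed by an XML plist payload. Written to the
    usbmuxd socket, this enumerates attached iOS devices."""
    payload = plistlib.dumps({
        "MessageType": "ListDevices",
        "ClientVersionString": "poc-sketch",  # placeholder identifier
        "ProgName": "poc-sketch",             # placeholder identifier
    }, fmt=plistlib.FMT_XML)
    header = struct.pack("<IIII", 16 + len(payload), 1, 8, tag)
    return header + payload
```

From there, the same plist-over-socket pattern is used to ask usbmuxd to connect to a device port, after which lockdownd brokers access to individual services; libimobiledevice wraps all of this in a clean C API.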
So with this in mind, I invite you to have a look at the quick and very basic POC video I’ve put together to show this technique, and why Apple needs to better secure it. Notice a few things from this video:
1. This is an iPhone 5s running iOS 7.1.2
2. The backup encryption checkbox is turned ON, and a password is set