There has been recent speculation about a plan to unlock Farook's iPhone simply so that investigators can walk through the evidence right on the device, rather than forensically image it, which would provide no information beyond what is already in an iCloud backup. Going through the applications by hand on an iPhone is dumpster-level forensic science, and let me explain why.
The device in question appears to have been powered down already, which has frozen the crypto as well as a number of processes on the device. While in this state, the data is inaccessible – but at least it's in suspended animation. At the moment, the device is incapable of connecting to a WiFi network, running background tasks, or giving third party applications access to their own data for housekeeping. This all changes once the device is unlocked. When a PIN is brute forced properly, the task actually runs from a separate copy of the operating system booted into memory. This creates a sterile environment in which the tasks on the device itself never start, but which still provides a platform to break into the device. This is how my own iPhone forensics tools used to work, as well as some commercial solutions that later followed my design. The device can be safely brute forced without putting data at risk. Using the phone is a different story.
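To make the "sterile environment" idea concrete, here is a minimal sketch of what an in-memory brute force looks like conceptually. Everything below is a hypothetical model, not Apple's actual key-derivation scheme: the `DEVICE_UID` constant, the PBKDF2 parameters, and the verifier comparison are all stand-ins. On a real device the derivation is entangled with a hardware UID and enforced by the kernel, but the key property is the same – the search runs entirely in the attacker's own memory, never executing or modifying anything on the device's file system.

```python
import hashlib

# Hypothetical stand-ins; the real UID is fused into the hardware and
# the real derivation is far more involved than a single PBKDF2 call.
DEVICE_UID = b"example-hardware-uid"

def derive_key(pin):
    # Stand-in for the device's PIN-to-key derivation.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), DEVICE_UID, 10_000)

def brute_force(verifier):
    # Enumerate the 4-digit PIN space entirely in memory -- nothing on
    # the device's own file system runs or changes during the search.
    for n in range(10_000):
        pin = f"{n:04d}"
        if derive_key(pin) == verifier:
            return pin
    return None

target = derive_key("4351")
print(brute_force(target))  # prints 4351
```

The point of the sketch is the separation of concerns: the brute-force platform supplies its own operating environment, so the only interaction with the device is the key check itself.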
After the device's PIN is deduced, if the agency were to boot the device and use that PIN, several things would unlock along with the encryption. The most forensically risky is that background tasks in applications will start. In iOS 9, applications can run in the background for various tasks, such as background refresh, VoIP, or even basic housekeeping. The mere act of unlocking the device could cause some of the third party applications on the device to run and potentially clean themselves up, destroying old cached data and even downloading new data. The simplest example would be a social networking application that refreshes its feed, but also removes older feed content that it doesn't think it needs anymore. This could be disastrous for any evidence sitting on the device.
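The feed-refresh example can be modeled in a few lines. This is an illustrative sketch, not any real app's code: the seven-day retention window and the cache structure are assumptions chosen to show the mechanism. Notice that the destruction of old data is a side effect of perfectly ordinary housekeeping, triggered without anyone touching the app.

```python
from datetime import datetime, timedelta

# Assumed retention policy for the hypothetical app's feed cache.
RETENTION = timedelta(days=7)

def background_refresh(cache, new_items, now):
    cache.extend(new_items)  # pull fresh content from the network
    # ...and silently discard anything older than the retention window,
    # destroying cached evidence in the process.
    cache[:] = [item for item in cache if now - item["fetched"] <= RETENTION]
    return cache

now = datetime(2016, 3, 1)
cache = [
    {"id": 1, "fetched": now - timedelta(days=30)},  # old cached evidence
    {"id": 2, "fetched": now - timedelta(days=2)},
]
background_refresh(cache, [{"id": 3, "fetched": now}], now)
print([item["id"] for item in cache])  # prints [2, 3]; the old entry is gone
```

One background refresh cycle and the thirty-day-old entry – potentially the only copy of that evidence – no longer exists.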
If an agency walks through the applications on the screen, they run an even stronger risk of completely destroying the evidence contained within these apps. For example, the Wickr and Telegram applications (along with many others) support self-destructing messages. The self-destruct mechanism can kick in by simply launching the app, which would result in the data being wiped from its internal database. This is not only possible, but likely, in any good secure messaging application – which would be the exact kinds of applications an investigator would be interested in.
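Here is a minimal model of how a launch-triggered self-destruct works. To be clear, this is an assumption-laden sketch and not Wickr's or Telegram's actual implementation: the expiry-timestamp scheme and the `on_app_launch` hook are hypothetical names for illustration.

```python
# Illustrative model of a self-destructing message store. Each message
# carries an expiry timestamp; merely opening the app triggers the purge.
def on_app_launch(messages, now):
    expired = [m for m in messages if m["expires_at"] <= now]
    messages[:] = [m for m in messages if m["expires_at"] > now]
    # A real secure messenger would securely erase the expired rows
    # from its database here, not merely drop them from memory.
    return len(expired)

msgs = [
    {"text": "meet at 9", "expires_at": 100},
    {"text": "ok", "expires_at": 500},
]
wiped = on_app_launch(msgs, now=200)
print(wiped, [m["text"] for m in msgs])  # prints: 1 ['ok']
```

The investigator taps the icon to "look at the evidence," the launch hook fires, and the expired message is wiped before anyone reads it.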
It is much more forensically sound to create a file system image of the device once it has been unlocked, while the operating system is still dark. This is why I hypothesized in another blog post that the FBI would eventually get a court order to force Apple to do this. There is presently no forensic tool on the market capable of imaging the file system of an iOS 8 or 9 device; all anyone would get by imaging it themselves is the same data they already have access to via an iCloud backup.
An agency would have to really crawl into the dumpster to think that using the device’s user interface is a viable solution to analyze evidence. Not only does it run the strong risk of destroying the only copy of evidence they might have, but any smart judge would throw out any case using such sloppy techniques.
Sadly, I would not put it past certain arms of law enforcement to do just this. For a year, we tolerated the use of crappy Chinese hardware, such as IP-BOX, which has zero forensic credibility and is literally a black box, to brute force the PINs on iOS 8 devices. In spite of the best practices I teach in my forensics training classes, many departments still practice unsafe methods for seizing iOS devices that include trying a bunch of PINs, shutting down the device, or even worse, leaving the device on without proper shielding. Some years ago, I listened to an FBI agent publicly discuss how they allowed a suspect to make a phone call, then watched the remote wipe kick in on the evidence as they held it in their hands.
Terrorism cases are serious business, but at some point it seems that forensic science has decided to go dumpster diving to solve cases that could otherwise be solved with a little more patience and better methodology. That's part of what's going on here, judging by the number of wrong turns this case has already taken.