You know the old saying: “shoot first, ask questions later.” It refers to the notion that careless law enforcement officers can be short-sighted in solving the problem at hand. You can’t question a dead man, and if you need answers, shooting the suspect first makes getting them rather difficult. The officer has blown their only chance of questioning the suspect by failing to take their training and good judgment into account. The same scenario applies to digital evidence. Many law enforcement agencies don’t know how to properly handle digital evidence, and end up making mistakes that effectively kill their one shot at getting the answers they need.
In the case involving Farook’s iPhone, two things went wrong, either of which, had it been avoided, could have resulted in evidence being lifted off the device.
First, changing the iCloud password prevented the device from being able to push an iCloud backup. As Apple’s engineers were walking the FBI through the process of getting the device to start sending data again, it became apparent that the password had been changed (suggesting they may have even seen the device try to authorize with iCloud). Had the backup succeeded, there would have been very little, if anything, recoverable from the phone that wouldn’t also be in the iCloud backup.
Second, and equally damaging to the evidence, the device was apparently either shut down or allowed to drain after it was seized. Shutting the device down is a common – but outdated – practice in field operations. Modern device seizure requires not only that the device be kept powered on, but also that all of the protocols leading up to the search and seizure be tuned so that it happens quickly enough to prevent the battery from draining before you even arrive on scene. Letting the device power down effectively shot the suspect dead by removing any chance of doing the following:
Talking directly to Siri, and asking her to display call records, contacts, email, and other information.
Capturing the network traffic traveling between the device and any providers of third party applications on the device, which could have not only yielded valuable data, but also information about which providers Farook’s phone stored data on (for subpoena).
Using a pairing record, should one be recovered as Farook’s laptop is reconstructed, to unlock the phone or download the data on it through a backup.
If the device was running iOS 9.0.1 or lower, a known lock screen bypass bug could have potentially allowed access to a significant amount of data on the device (data that is unlocked “after first user authentication”).
Dozens of known vulnerabilities exist for older firmware that may have made it possible to penetrate the device with a proof-of-concept (PoC) exploit; exploits that otherwise can’t be used once the device powers down and its encryption locks. Simply reading Apple’s release notes would have provided contact information for a number of researchers and universities who likely had PoC exploit code they would have loaned to the FBI.
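To make the pairing-record point above concrete, here’s a rough sketch of how a recovered pairing record can be reused, assuming the open-source libimobiledevice suite on a Linux forensic workstation (an assumption on my part; whatever tooling the FBI actually uses is unknown). The UDID and file paths are placeholders, and this only works against a phone that has stayed powered on and been unlocked at least once since boot:

```shell
# Hypothetical sketch: reuse a pairing record recovered from the suspect's
# computer (e.g. /var/db/lockdown/<UDID>.plist on OS X) so the phone will
# trust this workstation without the passcode. All values are placeholders.

UDID="0123456789abcdef0123456789abcdef01234567"   # placeholder device UDID

# Install the recovered record where usbmuxd/libimobiledevice looks for it
# (location can vary by usbmuxd version)
sudo cp "recovered-${UDID}.plist" "/var/lib/lockdown/${UDID}.plist"

# Confirm the device accepts the pairing record over USB
idevicepair validate -u "${UDID}"

# Pull a full backup; succeeds only if the device has been unlocked at
# least once since boot, so the backup keys are still available
idevicebackup2 backup --full -u "${UDID}" ./evidence-backup/
```

This is precisely why keeping the device powered on matters: power it down, and the pairing record becomes useless until the passcode is entered again.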
Just as good tactical training prevents law enforcement from unnecessarily shooting and killing suspects, poor device seizure practice is like an “accidental discharge” aimed at a device. Apple shouldn’t be forced to undo this mess, just as you wouldn’t expect a doctor to raise someone from the dead after you’ve shot and killed them.
What law enforcement needs more of is training; specifically, training in digital evidence collection and seizure. Best practices changed with iOS 8: devices are best left charged and in a faraday bag, so that the encryption remains unlocked and mechanisms such as background refresh and Siri remain active. Seizure protocol should also cover any desktop machines that may contain pairing records, dusting the device for latent prints, and even collecting or compelling a usable fingerprint before the fingerprint timer runs out. If a subject is shot dead at the scene, there may even be a chance of authenticating the device with their own finger (as morbid as that sounds), but only if the field agents have been trained in and are following such protocols. Even response time and a streamlined warrant process are critical when battery life is at stake. These and other important protocols need to be implemented in agencies, and I don’t see evidence that any of today’s modern best practices were used here.
Many have suggested that Apple locking their devices down will compromise national security. Quite the contrary: other agencies, such as the NSA, aren’t concerned about this case at all. Ex-NSA chief Gen. Michael Hayden has recently endorsed encryption as “good for America.” The NSA isn’t worried about this case because they’re used to compromising targets before someone goes in and shoots the witness. They plan ops so that they can extract the data they need while devices are still accessible and authenticated; the smart way to do it. The elusive exploits, such as the one needed here, are saved for those one-off cases when that’s not possible; they’re certainly not going to burn their exploits on a case that’s been turned into a media circus, where their capabilities against this phone (and likely newer ones) would become known.
The “necessity” prong of the AWA should clearly not apply when the capability to obtain the needed data existed, but the agency failed to acquire it before the fact through training, hiring, or procurement. That’s at least part of what seems to be happening here. The FBI could have had these capabilities, and acquiring them would not have exceeded their mandate; they simply chose not to invest in them. Maybe this is why they just slated another $85 million in funding for cyber this week.