Recently, FBI got a court order that compels Apple to create a forensics tool; this tool would let FBI brute force the PIN on a suspect’s device. But let’s look at the difference between this and simply bringing a phone to Apple; maybe you’ll start to see why this is so significant, not to mention underhanded.
First, let me preface this with the fact that I am speaking from my own personal experience, both in the courtroom and in law enforcement forensics circles, since around 2008. I’ve testified as an expert in three cases in California, and many others have pleaded out or had other outcomes not requiring my testimony. I’ve spent considerable time training law enforcement agencies around the world specifically in iOS forensics, met LEOs in the middle of the night to work on cases right off of airplanes, gone through the forensics validation and clearance processes, and dealt with red tape and all the other terrible aspects of forensics that you don’t see on CSI. It was a lot of fun, but it was also an incredibly sobering experience, as I have not been trained to deal with the evidence (images, voicemail, messages, etc.) that I’ve been exposed to the way LEOs have; my faith has kept me grounded. I’ve developed an amazing amount of respect for what they do.
For years, the government could come to Apple with a warrant and a phone, and have the manufacturer provide a disk image of the device. This largely worked because Apple didn’t have to hack into their own phones to do it. Up until iOS 8, the encryption Apple chose to use in their design was easily reversible when you had code execution on the phone (which Apple does). So all the way through iOS 7, Apple only needed to open the safe with its own key and provide FBI with a copy of the data.
This service worked like a “black box”, and while Apple may have needed to explain their methods in court at some point, they were more likely treated as a neutral third party lab, much as most forensics companies would be if you sent them a DNA sample. The level of validation and accountability here is relatively low, and the methods can often be opaque; that is, Apple could simply claim that the tech involved was a trade secret and get by without much more than an explanation. An engineer at Apple could hack up a quick and dirty tool to dump the disk, and nobody would ever need to see it, because Apple was providing a lab service and the tools themselves were considered more or less trade secrets.
Now let’s contrast that history with what FBI and the courts are ordering Apple to do here. FBI could have come to Apple with a court order stating they must brute force the PIN on the phone and deliver the contents. It would have been difficult to get a judge to sign off on that, since being forced to hack into your own devices quite boldly exceeds the notion of “reasonable assistance”. No, to slide this by, FBI was more clever. They requested that Apple develop a forensics tool, but leave the actual brute forcing to FBI.
This was apparently enough for the courts to look past the idea of “reasonable assistance”, but there are some unseen caveats that are especially dangerous here. What many haven’t considered is the significant difference – in the legal world – between providing lab services and developing what the courts will consider an instrument.
An instrument is the term the courts use to describe anything from a breathalyzer to a forensics tool, and in order to get judicial notice of a new instrument, it must be established that it is validated, peer reviewed, and accepted in the scientific community. It is also held to strict requirements of reproducibility and predictability, which require that third parties (such as defense experts) have access to it. I’ve often heard Cellebrite referred to, for example, as “the Cellebrite instrument” in court. Instruments are treated very differently from a simple lab service, like dumping a phone. I’ve done both of these for law enforcement in the past: provided services, and developed a forensics tool. Providing a simple dump of a disk image only involves my giving testimony about my technique. My forensics tools, however, required a much more thorough process that took significant resources, and they would for Apple too.
The tool must be designed and developed under much more stringent practices that involve reproducible, predictable results, extensive error checking, documentation, adequate logging of errors, and so on. The tool must be forensically sound and not change anything on the target, or it must document every change it makes, or that is made, in the process. Full documentation must be written explaining the methods and techniques used to disable Apple’s own security features. The tool cannot simply be some throw-together to break a PIN; it must be designed in a manner in which its function can be explained and its methodology reproduced by independent third parties. Since FBI is supposedly the one providing the PIN codes to try, Apple must also design and develop an interface / harness to communicate PINs into the tool, which means added engineering for input validation, protocol design, more logging, error handling, and so on. FBI has asked to do this wirelessly (possibly remotely), which also means transit encryption, validation, certificate revocation, and so on.
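To give a sense of scale, here is a minimal sketch in Python of the kind of input validation, logging, and error handling even the PIN harness alone implies. Everything in it is hypothetical: the names, the assumed 4-6 digit numeric PIN format, and the stubbed-out device call are stand-ins of my own, not anything Apple has built or would build.

```python
# Hypothetical harness sketch only; none of these names correspond to any
# real Apple interface. Assumes numeric PINs of 4 to 6 digits.
import logging
import re

logging.basicConfig(
    filename="pin_harness.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)

PIN_PATTERN = re.compile(r"\d{4,6}")

def validate_pin(candidate: str) -> str:
    """Reject malformed input before it ever reaches the device."""
    if not PIN_PATTERN.fullmatch(candidate):
        raise ValueError(f"malformed PIN candidate rejected: {candidate!r}")
    return candidate

def submit_pin(candidate: str) -> bool:
    """Stub for the device-facing call; a real instrument would have to
    log and document every interaction with the target."""
    logging.info("submitting a candidate PIN")
    return False  # actual device interaction would happen here

def run_harness(candidates):
    """Feed validated candidates to the device, logging every outcome."""
    for candidate in candidates:
        try:
            pin = validate_pin(candidate)
        except ValueError as err:
            logging.error(str(err))
            continue
        if submit_pin(pin):
            logging.info("PIN accepted")
            return pin
    logging.info("candidate list exhausted without a match")
    return None

if __name__ == "__main__":
    run_harness(["0000", "12a4", "123456"])
```

And that is before any of the wireless transport, encryption, and certificate handling FBI has asked for.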
Once the tool itself is designed, it must be tested internally on a number of devices with exactly matching versions of hardware and operating system, and peer reviewed internally to establish a pool of peer-review experts who can vouch for the technology. In my case, it was a bunch of scientists from various government agencies doing the peer review for me. The test devices will be imaged before and after, and their disk images compared to ensure that no bits were changed; changes that do occur from the operating system unlocking, logging, etc., will need to be documented so they can be explained to the courts. Bugs must be addressed. The user interface must be simple and robust in its error handling so that the tool can be used by third parties.
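The before-and-after comparison typically begins with something as simple as hashing the two disk images; anything that differs then has to be diffed and documented. A minimal sketch, assuming raw image files on disk (the file names are illustrative, and this stands in for no particular validation suite):

```python
# Hash a test device's disk image before and after a tool run to confirm
# that no bits were changed. File names are illustrative only.
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a large disk image through SHA-256 in 1 MB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

before = sha256_of("device_before.img")
after = sha256_of("device_after.img")

if before == after:
    print("images identical: no bits changed")
else:
    # Any difference must be diffed block by block, and every change
    # documented so it can be explained to the courts.
    print("images differ: document and explain every change")
```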
Once the tool is ready, it must be tested and validated by a third party. In this case, it would be NIST/NIJ (which is where my own tools were validated). NIST has a mobile forensics testing and validation process by which Apple would need to provide a copy of the tool (which would have to work on all of NIST’s test devices) for verification. NIST checks to ensure that all of the data on the test devices is recovered. Any time the software is updated, it should go back through the validation process. Once NIST tests and validates the tool, it would be cleared for FBI to use on the suspect’s device. Here is an example of what my own tools’ validation from NIJ looks like: https://www.ncjrs.gov/pdffiles1/nij/232383.pdf
During trial, the court will want to see what kind of scientific peer review the tool has had; if it is not validated by NIST or some other third party, or has no acceptance in the scientific community, the tool and any evidence gathered by it could be rejected.
Apple must be prepared to defend their tool and methodology in court; no really, the defense / judge / even juries in CA will ask stupid questions such as, “why didn’t you do it this way”, or “is this jailbreaking”, or “couldn’t you just jailbreak the phone?” (I was actually asked that by a juror in CA’s broken legal system, which lets the jury ask questions). Apple has to invest resources in engineers who are intimately familiar with not only their code, but also with why they chose the methodology they did as their best practice. If certain challenges don’t end well, future versions of the instrument may end up needing to incorporate changes at the request of FBI.
If evidence from a device ever leads to a case in a court room, the defense attorney will (and should) request a copy of the tool to have independent third party verification performed, at which point the software will need to be made to work on another set of test devices. Apple will need to work with defense experts to instruct them on how to use the tool to provide predictable and consistent results.
In the likely event that FBI compels the use of the tool for other devices, Apple will need to maintain engineering and legal staff to keep up to date on their knowledge of the tool, maintain the tool, and provide testimony as needed.
In other words, developing an instrument is far more involved than simply dumping a phone for FBI, which FBI could have ordered instead.
The risks are significant too.
This far exceeds the realm of “reasonable assistance”, especially considering that Apple is not a professional forensics company and has no experience in designing forensic methodology, tools, or forensic validation. FBI could attempt to circumvent proper validation by issuing a deviation (as they did at one point with my own tools), but this runs the risk of causing the house of cards to collapse if challenged by a defense attorney.
So in light of the demands of sound forensic science, the Department of Justice’s outrageous arguments seem quite inaccurate.
Quite the contrary: unless the Department of Justice is asking Apple to completely ignore sound forensic science and simply pump out a reckless (and possibly harmful) hacking tool, it would seem that false statements are being made to the court. Or perhaps they’re attempting to skirt the reality of this with the verbiage “after its purpose”, which requires disseminating the tool outside of Apple, as well as opening it up to work on other devices, and thereby relinquishing custody of it.
In the same vein, you’ll also notice that in demanding a tool, FBI has sneakily ensured that a more “open” copy of the software, one that will work on other devices, will have to be released in order for it to be tested, validated, and re-tested by a defense team. This guarantees that the hacking tool FBI is forcing Apple to write will be out in the public, where it will be in the hands of multiple agencies and private attorneys.
Not only is Apple being ordered to compromise their own devices; they’re being ordered to give that golden key to the government, in a very roundabout, sneaky way. What FBI has requested will inevitably force Apple’s methods out into the open, where they can be ingested by government agencies looking to do the same thing. They will also be exposed to private forensics companies, who are notorious for reverse engineering and stealing other people’s intellectual property. Should Apple comply and provide a tool, it will inevitably end up abused and in the wrong hands.
But will this case ever need to see a court room? Absolutely: they’ve already admitted they’re following leads and looking at (or at least for) other people. If a relative or anyone else involved is prosecuted, these tools will come up in court. Outside of this one case, you’re no doubt aware of the precedent this sets, and the likelihood that this tool won’t be used once but many times, with each use having to establish courtroom acceptance in different jurisdictions, face different defense challenges, hand the software to more parties for analysis and reproducible results, and so on.
You’re asking the wrong question. Consider this: even if a suspect never went to court, we’re talking about practicing sound forensic science. Everything I’ve outlined in this article is consistent with best practices in the field. For anyone to be okay with a simple, ugly hack job instead of a forensics tool would set an ugly precedent of skirting sound science and methodology in the handling of evidence. This would undoubtedly do damage to the reputation of the forensic process and lower the bar on all such standards. In other words, the reputation of forensic science is more important than whether or not this case ever sees a courtroom.