Month: February 2016

Shoot First, Ask Siri Later

You know the old saying: “shoot first, ask questions later”. It refers to the notion that careless law enforcement officers can be short-sighted in solving the problem at hand. It’s impossible to ask questions of a dead person, and if you need answers, you’ve made that very hard for yourself by shooting them; an officer who fails to exercise training and good judgment has blown his only chance of questioning the suspect. This same scenario applies to digital evidence. Many law enforcement agencies do not know how to properly handle digital evidence, and end up making mistakes that effectively kill their one shot at getting the answers they need.

In the case involving Farook’s iPhone, two things went wrong that may well have destroyed any chance of lifting evidence off the device.

First, changing the iCloud password prevented the device from being able to push an iCloud backup. As Apple’s engineers were walking the FBI through the process of getting the device to start sending data again, it became apparent that the password had been changed (suggesting they may have even seen the device try to authorize on iCloud). Had the backup succeeded, there would have been very little, if anything, recoverable from the phone that wasn’t already in the iCloud backup.

Second, and equally damaging to the evidence, the device was apparently either shut down or allowed to drain after it was seized. Shutting the device down is a common – but outdated – practice in field operations. Modern device seizure requires not only that the device be kept powered up, but also that all of the protocols leading up to the search and seizure be tuned so that it happens quickly enough to prevent the battery from draining before you even arrive on scene. Letting the device power down effectively shot the suspect dead by removing any chance of doing the following:

Read More

Apple’s Burden to Protect or Perpetually Create a “Weapon”

As the Apple/FBI dispute continues, court documents reveal the argument that Apple has been providing forensic services to law enforcement for years without its tools being hacked or leaked from Apple. Quite the contrary: information is leaked out of Foxconn all the time, and in fact some of the software and hardware tools used to hack iOS products over the past several years (IP-BOX, Pangu, and so on) have originated in China, where Apple’s manufacturing takes place. Outside of China, jailbreak after jailbreak has taken advantage of vulnerabilities in iOS, some with the help of tools leaked out of Apple’s HQ in Cupertino. Devices have continually been compromised, and even today Apple’s security response team releases dozens of fixes for vulnerabilities that have been exploited outside of Apple. Setting all of this aside for a moment, however, let’s take a look at the more immediate dangers of such statements.

By affirming that Apple can and will protect such a backdoor, Comey’s statement admits that Apple would face not only the burden of breathing this forensics backdoor into existence, but also that of taking perpetual steps to protect it once it’s been created. In other words, the courts are forcing Apple to create what would be considered a weapon under the latest proposed Wassenaar rules, and charging them with the burden of preventing that weapon from getting out – whether the code itself, or the weaknesses that Apple would have to keep baked into their products for the weapon to work.

Read More

On Ribbons and Ribbon Cutters

With most non-technical people struggling to make sense of the battle between the FBI and Apple, Bill Gates introduced an excellent analogy to explain cryptography to the average non-geek. Gates described encryption as a “ribbon around a hard drive”. Good encryption is more like a chastity belt, but since Farook decided to use a weak passcode, I think it’s fair here to call it a ribbon. In any case, let’s go with Gates’ ribbon analogy.

Where Gates is wrong is that the courts are not ordering Apple to simply cut the ribbon. In fact, had that been the case, I think more in the tech sector would have supported Apple simply breaking the weak passcode Farook chose to use. Apple’s encryption is virtually unbreakable when you use a strong alphanumeric passcode; by choosing a numeric PIN, you get what you deserve.
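To put numbers on that, here’s a quick back-of-the-envelope sketch. It assumes the roughly 80 milliseconds per guess that Apple’s iOS Security guide attributes to the hardware-entangled key derivation; the exact figure matters less than the orders of magnitude.

```python
# Rough brute-force estimates, assuming ~80ms of hardware-bound key
# derivation per guess (the figure cited in Apple's iOS Security guide).
SECONDS_PER_GUESS = 0.08

def worst_case_seconds(keyspace: int) -> float:
    """Worst-case time to exhaust a keyspace, one guess at a time."""
    return keyspace * SECONDS_PER_GUESS

print(f"4-digit PIN:          {worst_case_seconds(10 ** 4) / 60:.0f} minutes")
print(f"6-digit PIN:          {worst_case_seconds(10 ** 6) / 3600:.0f} hours")
print(f"10-char alphanumeric: {worst_case_seconds(62 ** 10) / (3600 * 24 * 365):.1e} years")
```

A four-digit ribbon falls in under a quarter of an hour; a strong alphanumeric passcode holds out for geological time.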

Instead of cutting the ribbon, which would be a much simpler task, the courts are ordering Apple to invent a ribbon cutter – a forensic tool capable of cutting the ribbon for the FBI, which promises to use it on just this one phone. In reality, there’s already a line forming behind Comey should he get his way. NY DA Cy Vance has stated that NYC has 175 iPhones waiting to be unlocked (which translates to roughly 1/10th of 1% of all crime in NYC for an entire year). Documents have also shown the DOJ has over a dozen more such requests pending. If the promise of “just this one phone” were authentic, there would be no need to order Apple to make a ribbon cutter; they’d simply tell Apple to cut the ribbon.

Read More

Dumpster Diving in Forensic Science

There has been recent speculation about a plan to unlock Farook’s iPhone simply so that investigators can walk through the evidence right on the device, rather than forensically image it – an image that would arguably provide little information beyond what is already in the iCloud backup. Going through the applications by hand on an iPhone sits at the dumpster level of forensic science, and let me explain why.

The device in question appears to have been powered down already, which has frozen the crypto as well as a number of processes on the device. While in this state, the data is inaccessible – but at least it’s in suspended animation. At the moment, the device is incapable of connecting to a WiFi network, running background tasks, or giving third party applications access to their own data for housekeeping. This all changes once the device is unlocked. When a PIN code is brute forced properly, the task actually runs from a separate copy of the operating system booted into memory. This creates a sterile environment where the tasks on the device itself don’t start, but which provides a platform to break into the device. This is how my own forensics tools used to work on the iPhone, as well as some commercial solutions that later followed my design. The device can be safely brute forced without putting data at risk. Using the phone is a different story.
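For illustration, here’s a hypothetical sketch of that design. None of these names are real APIs; `SterileEnvironment` stands in for a minimal OS image booted into RAM and the device-specific plumbing behind it.

```python
# Hypothetical sketch of the RAM-disk brute-force approach described
# above -- an illustration, not any actual forensic tool.
from itertools import product
from string import digits

class SterileEnvironment:
    """Stand-in for a minimal OS booted into memory. The user data
    partition is never mounted, so nothing on the device itself runs:
    no WiFi, no background tasks, no third-party housekeeping."""

    def __init__(self, secret: str = "1234"):   # demo secret, for illustration
        self._secret = secret

    def attempt_unlock(self, pin: str) -> bool:
        # On real hardware this would exercise only the key-derivation
        # engine, leaving the frozen device state untouched.
        return pin == self._secret

def brute_force_pin(env: SterileEnvironment, length: int = 4):
    for attempt in product(digits, repeat=length):
        pin = "".join(attempt)
        if env.attempt_unlock(pin):
            return pin
    return None

print(brute_force_pin(SterileEnvironment()))    # -> 1234
```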

Read More

On FBI’s Interference with iCloud Backups

In a letter emailed from FBI Press Relations in the Los Angeles Field Office, the FBI admitted to performing a reckless and forensically unsound password change that, it acknowledges, interfered with Apple’s attempts to re-connect Farook’s iCloud backup service. In attempting to defend its actions, the FBI made the following statement to downplay the loss of potential forensic data:

 “Through previous testing, we know that direct data extraction from an iOS device often provides more data than an iCloud backup contains. Even if the password had not been changed and Apple could have turned on the auto-backup and loaded it to the cloud, there might be information on the phone that would not be accessible without Apple’s assistance as required by the All Writs Act Order, since the iCloud backup does not contain everything on an iPhone.”

This statement implies only one of two possible outcomes:

Read More

10 Reasons Farook’s Work Phone Likely Won’t Have Any Evidence

Ten reasons to consider about Farook’s work phone:

Read More

Apple, FBI, and the Burden of Forensic Methodology

Recently, the FBI got a court order that compels Apple to create a forensics tool; this tool would let the FBI brute force the PIN on a suspect’s device. But let’s look at the difference between this and simply bringing a phone to Apple; maybe you’ll start to see why this is so significant, not to mention underhanded.

First, let me preface this with the fact that I am speaking from my own personal experience, both in the courtroom and in law enforcement forensics circles, since around 2008. I’ve testified as an expert in three cases in California, and many others have pleaded out or had other outcomes not requiring my testimony. I’ve spent considerable time training law enforcement agencies around the world specifically in iOS forensics, met LEOs in the middle of the night to work on cases right off of airplanes, gone through the forensics validation and clearance processes, and dealt with red tape and all the other terrible aspects of forensics that you don’t see on CSI. It was a lot of fun, but also an incredibly sobering experience; unlike LEOs, I was never trained to deal with the kinds of evidence (images, voicemail, messages, etc.) I’ve been exposed to, and my faith has kept me grounded. I’ve developed an amazing amount of respect for what they do.

For years, the government could come to Apple with a warrant and a phone, and have the manufacturer provide a disk image of the device. This largely worked because Apple didn’t have to hack into their phones to do it. Up until iOS 8, the encryption Apple chose to use in their design was easily reversible when you have code execution on the phone (which Apple does). So all the way through iOS 7, Apple only needed to insert the key into the safe and provide the FBI with a copy of the data.
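Conceptually, the difference looks something like the sketch below. This is an illustrative toy, not Apple’s actual key hierarchy: the point is only that a key derived from hardware alone can be recomputed by anyone running code on the device, while a key entangled with the passcode cannot.

```python
# Toy illustration of passcode entanglement -- not Apple's real design.
import hashlib

def kdf(*parts: bytes) -> bytes:
    """Stand-in for a hardware-bound key derivation function."""
    return hashlib.sha256(b"|".join(parts)).digest()

UID = b"device-unique-hardware-key"   # fused into the chip, usable on-device

# Pre-iOS 8 (conceptually): much of the filesystem keyed off hardware
# alone. Anyone with code execution on the device can recompute this.
legacy_key = kdf(UID)

# iOS 8 and later (conceptually): class keys entangled with the user's
# passcode. Code execution alone no longer yields the key.
def data_key(passcode: str) -> bytes:
    return kdf(UID, passcode.encode())
```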

Read More

Code is Law

For the first time in Apple’s history, they’ve been forced to confront the reality that an overreaching government can turn Apple into its own adversary. When we think about computer security, our threat models almost always look without, but rarely within. This ultimately shows in our designs, and Apple is no exception. Engineers working on encryption projects are at a particular disadvantage, as the use (or abuse) of their software falls gradually more to the mercy of legislation. The functionality of encryption-based software boils down to its design: is its privacy enforced through legislation, or is it enforced through code?

My philosophy is that code is law. Code should be the angry curmudgeon that doesn’t trust even its creator without the end user’s consent. Even at the top, there may be outside factors affecting how code is compromised, and at the end of the day you can’t trust yourself when someone’s got a gun to your head. When the courts can press the creator of code into becoming an adversary against it, there is ultimately only one design conclusion that can be drawn: once the device is provisioned, it should trust no one – not even its creator – without direct authentication from the end user.
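As a thought experiment, that conclusion might look something like the sketch below: new firmware is accepted only when both the vendor’s signature and the enrolled user’s authentication check out. Every name here is a hypothetical placeholder, not a real platform API.

```python
# Sketch of a "trust no one" provisioning rule -- hypothetical only.
import hashlib
import hmac

VENDOR_KEY = b"vendor-signing-key"          # stand-in for the vendor's key
USER_SECRET_HASH = hashlib.sha256(b"user-passcode").digest()  # set at provisioning

def vendor_signature_valid(image: bytes, sig: bytes) -> bool:
    expected = hmac.new(VENDOR_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, sig)

def user_authenticated(passcode: bytes) -> bool:
    return hmac.compare_digest(hashlib.sha256(passcode).digest(), USER_SECRET_HASH)

def accept_firmware(image: bytes, sig: bytes, passcode: bytes) -> bool:
    # A vendor signature alone is not enough: even the creator cannot
    # push code onto a provisioned device without the user's consent.
    return vendor_signature_valid(image, sig) and user_authenticated(passcode)
```

Under a rule like this, a court order against the vendor alone accomplishes nothing; the device demands the one secret the vendor never had.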

Read More

tl;dr technical explanation of #ApplevsFBI

  • Apple was recently ordered by a magistrate court to assist the FBI in brute forcing the PIN of a device used by the San Bernardino terrorists.
  • The court ordered Apple to develop custom software for the device that would disable a number of security features to make brute forcing possible (see the sketch after this list).
  • Part of the court order also instructed Apple to design a system by which PINs could be remotely sent to the device, allowing for rapid brute forcing while still giving Apple plausible deniability that it hacked a customer device in any literal sense.
  • All of this amounts to the courts compelling Apple to design, develop, and protect a backdoor into iOS devices.
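For a rough sense of why those security features matter, here’s one more back-of-the-envelope sketch, again assuming roughly 80ms of key derivation per guess; the numbers are illustrative.

```python
# Why the disabled protections matter, in rough numbers.
SECONDS_PER_GUESS = 0.08      # approximate hardware key-derivation cost
PIN_SPACE = 10 ** 4           # 4-digit numeric PIN

# Protections on: a device set to erase after 10 failed attempts gives
# an attacker at most 10 guesses out of 10,000.
print(f"Odds of guessing before erase: {10 / PIN_SPACE:.1%}")   # 0.1%

# Protections off (what the order compels): with the erase counter and
# escalating delays gone, only the per-guess derivation cost remains.
print(f"Time to try every PIN: {PIN_SPACE * SECONDS_PER_GUESS / 60:.0f} minutes")
```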

Read More