Month: April 2016

WSJ Describes Reckless Behavior by FBI in Terrorism Case

The Wall Street Journal published an article today citing a source saying that the FBI is planning to tell the White House that “it knows so little about the hacking tool that was used to open terrorist’s iPhone that it doesn’t make sense to launch an internal government review”. If true, this should be taken as an act of recklessness by the FBI with regard to the Syed Farook case: The FBI apparently allowed an undocumented tool to run on a piece of high-profile, terrorism-related evidence without having adequate knowledge of the specific function or the forensic soundness of the tool.

Best practices in forensic science dictate that any forensic instrument must be tested and validated, and accepted as forensically sound, before it can be used on live evidence. Such a tool must yield predictable, repeatable results, and an examiner must be able to explain its process in a court of law. Our court system expects this, and allows tools (and examiners) to face numerous challenges based on the credibility of the tool, which can only be established through rigorous analysis. The FBI’s admission that it has so little knowledge about how the tool works is an admission that it failed to evaluate the science behind the tool, or to have the tool’s core functionality evaluated in any meaningful way. How the tool managed to get into the device is the bare minimum I would expect anyone to know before shelling out over a million dollars for a solution, especially one that was going to be used on high-profile evidence.

Ideally, a tool should not make changes to a device at all; any changes it does make should be documented and repeatable. There are several other variables to consider in such an effort, especially when imaging an iOS device. Apart from changes made directly by the tool (such as overwriting unallocated space, or portions of the file system journal), simply unlocking the device can cause the operating system to make a number of changes, start background tasks that could lead to the destruction of data, or cause other unintended changes. Without knowing how the tool works, what portions of the operating system it affects, what vulnerabilities it exploits, what the payload looks like, where the payload is written, what parts of the operating system it disables, or a host of other important things, there is no way to effectively measure whether the tool is forensically sound. Simply running it against a dozen other devices to “see if it works” is not sufficient to evaluate a forensics tool – especially one that originated from a grey hat hacking group, potentially with very little actual in-house forensics expertise.
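To make the repeatability requirement concrete, here is a minimal Swift sketch (hypothetical, not drawn from any particular tool’s documentation) that hashes two separate acquisitions of the same evidence and flags any difference. The file names are placeholders; matching digests are a necessary, but by no means sufficient, indicator that a process is forensically sound.

    import Foundation
    import CryptoKit

    // Stream a (potentially very large) acquisition image through SHA-256
    // so the whole file never has to fit in memory.
    func sha256(ofFileAt url: URL) throws -> String {
        let handle = try FileHandle(forReadingFrom: url)
        defer { try? handle.close() }
        var hasher = SHA256()
        while let chunk = try handle.read(upToCount: 1 << 20), !chunk.isEmpty {
            hasher.update(data: chunk)
        }
        return hasher.finalize().map { String(format: "%02x", $0) }.joined()
    }

    // Hypothetical file names: two separate acquisitions of the same evidence.
    // Identical digests are the minimum bar for calling the process repeatable;
    // differing digests demand an explanation of exactly what changed and why.
    do {
        let first  = try sha256(ofFileAt: URL(fileURLWithPath: "acquisition-1.dmg"))
        let second = try sha256(ofFileAt: URL(fileURLWithPath: "acquisition-2.dmg"))
        print(first == second ? "digests match" : "digests differ:\n\(first)\n\(second)")
    } catch {
        print("hashing failed: \(error)")
    }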

Read More

Hardware-Entangled APIs and Sessions in iOS

Apple has long enjoyed a security architecture that rests, in part, on the entanglement of its encryption with a device’s physical hardware. This pairing has proven highly effective at thwarting a number of different types of attacks, and it enables mobile payments processing, secure encryption, and a host of other secure services running on an iPhone. One security feature that iOS lacks for third party developers, however, is the ability to validate the hardware a user is on, which prevents third party applications from taking advantage of such a great mechanism. As a result, APIs can be easily spoofed, and sessions and services are often susceptible to a number of different forms of abuse. Hardware validation can be particularly important when dealing with crowd-sourced data and APIs, as was the case a couple of years ago when a group of students hacked Waze’s traffic intelligence. These types of Sybil attacks allow thousands of phantom users to be created from a single instance of an application, or an API to be spoofed altogether without any connection to the hardware. Man-in-the-middle (MiTM) attacks are also a threat to applications running under iOS, for example by stealing session keys or OAuth tokens to access a user’s account from a different device or API. What can Apple do to thwart these types of attacks? Hardware entanglement through the Secure Enclave.
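As a rough illustration of the building block involved, the Swift sketch below generates a key pair inside the Secure Enclave (using the Security framework’s kSecAttrTokenIDSecureEnclave support) and signs a server-issued challenge with it, so a session credential only works from the device holding the key. The application tag, nonce, and error handling are illustrative assumptions, and without an Apple-provided attestation mechanism a server still cannot prove the key truly resides in hardware – which is exactly the gap described above.

    import Foundation
    import Security

    // Restrict the key to this device and require the Secure Enclave for private key use.
    let access = SecAccessControlCreateWithFlags(
        kCFAllocatorDefault,
        kSecAttrAccessibleWhenUnlockedThisDeviceOnly,
        .privateKeyUsage,
        nil
    )!

    // Generate a P-256 key pair; the private key is created in, and never leaves, the Secure Enclave.
    let attributes: [String: Any] = [
        kSecAttrKeyType as String:       kSecAttrKeyTypeECSECPrimeRandom,
        kSecAttrKeySizeInBits as String: 256,
        kSecAttrTokenID as String:       kSecAttrTokenIDSecureEnclave,
        kSecPrivateKeyAttrs as String: [
            kSecAttrIsPermanent as String:    true,
            kSecAttrApplicationTag as String: Data("com.example.session-key".utf8), // hypothetical tag
            kSecAttrAccessControl as String:  access
        ]
    ]

    var error: Unmanaged<CFError>?
    guard let privateKey = SecKeyCreateRandomKey(attributes as CFDictionary, &error),
          let publicKey = SecKeyCopyPublicKey(privateKey) else {
        fatalError("Secure Enclave key generation failed: \(String(describing: error))")
    }

    // Sign a nonce issued by the API server. The server verifies the signature with the
    // public key it enrolled earlier, binding the session to this one physical device.
    let challenge = Data("nonce-from-server".utf8) // hypothetical server-issued nonce
    guard let signature = SecKeyCreateSignature(
        privateKey,
        .ecdsaSignatureMessageX962SHA256,
        challenge as CFData,
        &error
    ) else {
        fatalError("Signing failed: \(String(describing: error))")
    }

    // The signature (and an export of the public key) would accompany the session request.
    _ = (publicKey, signature)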

Read More

Open Letter to Congress on Encryption Backdoors

To the Honorable Congress of the United States of America,

I am a proud American who has had the pleasure of working with the law enforcement community for the past eight years. As an independent researcher, I have assisted on numerous local, state, and federal cases, and have trained many of our federal and military agencies in digital forensics (including breaking numerous encryption implementations). Early on, there was a time when my skill set was unique, and I provided assistance at no charge to many agencies, whether they were flying agents out to my small town for help or meeting with detectives while I was on vacation. I have developed an enormous respect for the people keeping our country safe, and I continue to help anyone who asks in any way that I can.

With that said, I have seen a dramatic shift in the core competency of law enforcement over the past several years. While there are many incredibly bright detectives and agents working to protect us, I have also seen an uncomfortable number who have regressed to a state of “push button forensics”, often referred to in law enforcement circles as “push and drool forensics”; that is, rather than using the skills they were trained with to investigate and solve cases, many have developed an unhealthy dependence on forensics tools, which can produce the “smoking gun” for them, literally with the touch of a button. As a result, I have seen many open-and-shut cases that received only the most abbreviated of investigations, where much of the evidence was largely ignored for the sake of these “smoking guns” – including evidence on the mobile device itself, which oftentimes conflicted with the core evidence used.

Read More

The Dangers of the Burr Encryption Bill

The Burr Encryption Bill – Discussion Draft dropped last night; it proposes to weaken encryption standards for all United States citizens and corporations. The bill itself is a hodgepodge of technical ineptitude combined with pockets of contradiction. I would cite the most dangerous parts of the bill, but the bill in its entirety is dangerous, not just for its intended uses but also for all of the uses that aren’t immediately apparent to the public.

The bill, in short, requires that anyone who develops features or methods to encrypt data must also decrypt that data under a court order. This applies not only to large companies like Apple, but could also be used to punish developers of open source encryption tools, or even encryption experts who invent new methods of encryption. Its broad wording allows the government to hold virtually anyone responsible for what a user might do with encryption. A good parallel would be holding a vehicle manufacturer responsible for a customer who drives into a crowd. Only it’s much worse: the proposed legislation would allow the tire manufacturer, and even the scientists who invented the tires, to be held liable as well.

Read More

An Open Letter to FBI Director James Comey

Mr. Comey,

Sir, you may not know me, but I’ve impacted your agency for the better. I have been assisting law enforcement, including the Federal Bureau of Investigation, as a private citizen since the advent of the iPhone. I designed the original forensics tools and methods that were used to access content on iPhones, which were eventually validated by NIST/NIJ and ingested by FBI into your own internal version of my tools. Prior to that, FBI issued a major deviation allowing my tools to be used without validation, due to the critical need to collect evidence on iPhones. They later became the foundation for virtually every commercial forensics tool to make it to market at the time. I’ve supported thousands of agencies worldwide for several years, trained our state, federal, and military agencies in iOS forensics, assisted hands-on in numerous high-profile cases, and invested thousands of hours of continued research and development into a suite of tools I provided at no cost – for the purpose of helping to solve crimes. I’ve received letters from a former lab director at FBI’s RCFL, DOJ, NASA OIG, and other agencies citing numerous cases that my tools helped solve. I’ve done what I can to serve my country, and have asked for little in return.

First let me say that I am glad FBI has found a way to get into Syed Farook’s iPhone 5c. Having assisted with many cases, I understand from firsthand experience what you are up against, and I have enormous respect for what your agency does, as well as for others like it. Oftentimes it is that one finger that stops the entire dam from breaking. I would have been glad to assist your agency with this device, and I even reached out to my contacts at FBI with a method I’ve since demonstrated in a proof-of-concept. Unfortunately, in spite of my past assistance, FBI lawyers prevented any meetings from occurring. Nonetheless, I am glad someone has been able to reach you with a viable solution.

Read More