Apple’s new policy on law enforcement is ruffling some feathers at the FBI, and has become a point of debate among the rest of us. It has become one because it’s been viewed as just that – a policy – rather than what it really is: a design change. With iOS 8, Apple has finally brought their operating system up to what most experts would consider “acceptable security”. My tone here suggests that I’m saying all prior versions of iOS had substandard security – that’s exactly what I’m saying. I’ve been hacking on the phone since it first came out in 2007, and since then Apple’s data security has had a dismal track record. Even as recently as iOS 7, Apple’s file system left almost all user data inadequately encrypted (or protected), and often riddled with holes – or even services that dished up your data to anyone who knew how to ask. What you see happening with iOS 8 is a major improvement in security, achieved by employing proper encryption to protect data at rest. Encryption, unlike people, knows no politics. It knows no policy. It doesn’t care whether you’re law enforcement or a criminal. Encryption, when implemented properly, is indiscriminate about who it protects your data from. It just protects it. That is key to security.
Up until iOS 8, Apple’s encryption didn’t adequately protect users because it wasn’t designed properly (in my expert opinion). Apple relied, instead, on the operating system to protect user data, and that allowed law enforcement to force Apple to dump what amounted to almost all of the user data from any device – because it was technically feasible, and there was nobody to stop them from doing it. From iOS 7 and back, the user data stored on the iPhone was not encrypted with a key that was derived from the user’s passcode. Instead, it was protected with a key derived from the device’s hardware… which is as good as having no key at all. Once you booted up any device running iOS 7 or older, much of that user data could be immediately decrypted in memory, allowing Apple to dump it and provide a tidy disk image to the police. Incidentally, it also allowed a number of hackers (including criminals) to read it.
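The difference is easy to see in a sketch. What follows is a deliberately simplified illustration, not Apple’s actual key-derivation scheme: the device identifier, iteration count, and derivation functions are all hypothetical stand-ins. The point is that a key derived only from device hardware can be re-created by anyone who can run code on the booted device, while a key entangled with the user’s passcode cannot be reconstructed without that passcode – not even by the manufacturer:

```python
import hashlib
import os

# Hypothetical device-unique hardware identifier (stand-in for a UID
# burned into the hardware at manufacture).
DEVICE_UID = bytes.fromhex("00112233445566778899aabbccddeeff")

def key_from_hardware_only(uid: bytes) -> bytes:
    """Pre-iOS-8 style (simplified): the key depends only on the device.
    Anyone who can execute code on the booted device can re-derive it."""
    return hashlib.sha256(uid).digest()

def key_from_passcode(uid: bytes, passcode: str, salt: bytes) -> bytes:
    """iOS-8 style (simplified): the key is entangled with the user's
    passcode. Without the passcode, the key cannot be reconstructed."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode() + uid, salt,
                               100_000, dklen=32)

salt = os.urandom(16)
hw_key = key_from_hardware_only(DEVICE_UID)
user_key = key_from_passcode(DEVICE_UID, "1234", salt)

# The hardware-only key is always recoverable from the device alone:
assert hw_key == key_from_hardware_only(DEVICE_UID)
# The passcode-entangled key is not, without the right passcode:
assert user_key != key_from_passcode(DEVICE_UID, "0000", salt)
```

With the first scheme, “encrypted” data is effectively plaintext to anyone holding the device; with the second, the data at rest is only as reachable as the passcode is guessable.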
When Apple fixed their poor encryption in iOS 8, they fixed what I viewed as a major flaw in the security of their devices. Devices that not only you and I use, but that are also used by our President, our military, public officials, high-profile individuals (such as celebrities and CEOs), and diplomats from other countries. These devices are in the hands of world leaders and the military – so there’s really no question that the security has to be military grade. This is no longer just a product for hipsters. The fact that I could charge thousands per head to train a diplomatic security team in iOS forensics is evidence of the critical and urgent need for security on the devices our world leaders carry. If you had any idea just how many physical hardware modifications have gone into devices that public officials are allowed to use, you’d realize that governments know better than to trust prior versions of iOS.
So Apple fixed their security – so what? Well, they fixed it right… and that means that they fixed it so they, themselves, couldn’t break into it… which is the only way to do encryption right. They can’t break into their own phones, at least without using a password breaking tool. That is significant. So in fixing their security, Apple has now said to law enforcement, “we’re sorry, but we’d have to perform sophisticated attacks against our own products in order to even have a chance at dumping data for you.” What they haven’t said, but is very much also the truth, is “we’ve made our products secure enough so that we can’t even hack them … and can keep you safe from criminals, keep our public officials safe from spy agencies, and can keep our military safe from foreign governments – all looking to spy on, eavesdrop on, steal data from, and learn crucial intelligence to harm America (insert any other country here)”.
So who are our enemies, and just how high-tech are they? China and Russia are two of the most technically capable countries hostile toward the US, and each has massive intelligence agencies, just like the NSA, whose job is to hack our military, diplomats, officials, corporate executives, and anyone else they think is important. There are many others, and the threat is very real. You never hear about most of the Charlie Millers of the world, because they’re too busy hacking foreign dignitaries. Many of those guys work for other countries, and they hack us and our iPhones.
I have no doubt that the FBI (and other agencies) have used the data they used to be able to dump to solve crimes. In fact, I know this to be true: I’ve helped them do it a number of times myself. When my forensics techniques came out, the FBI issued a major deviation within the agency just to allow the use of my tools. As early as 2008, I was working with the lab director of one of the FBI’s regional computer forensics labs, who sent me an email citing these two specific instances where my tools were crucial:
As far as cases:
1) Received a Iphone on a terrorism matter. The phone contained photos of unknown associates, sms messages of value and phonebook information of value. Because of the iphone, law enforcement was able to identify previous unknown terrorists.
2) On a Child assault case, the subject, a friend of the family’s, assaulted a sleeping 13yr female. He took pictures of the entire incident with his iphone. After the assault he was nervous and threw his phone in a dumpster. The iphone was recovered from the dump. After charging, all the pictures were recovered. The victim did not report the incident for a couple of weeks, there was no physical evidence. The only evidence is the victim’s testimony and the iPhone.
So you must be thinking, “well then why are you against Apple helping law enforcement?” You clearly misunderstand the conversation. This isn’t about choosing to help law enforcement; I’ve done that for years. This is about the idea of embedding backdoors into technology in order to help law enforcement. We’re not talking about Apple choosing whether or not to turn over data to law enforcement that they are in possession of. We are talking about Apple choosing whether or not to insert a backdoor into their own products in order to obtain data that they do not currently possess. We are talking about weakening the security of Apple’s products now that they have finally pulled their heads out and gotten security right for the first time. Poor decisions in security affect not only the criminals and terrorists in that email; they affect everything on up to our national security.
This, I am against. This is what going too far looks like. What I did early on in my career in the forensics world to help law enforcement was hacking. I learned how to hack into devices and circumvent operating system mechanisms to get law enforcement the data. It wasn’t about backdoors, it was about the cat-and-mouse game of beating the technology. Hacking actually helps to improve the security of a product, when done by the right people… it exposes vulnerabilities (see some of my recent papers), which often leads to a fix, and improved overall security. This is a far cry from what Apple was doing (which I don’t even consider hacking), and that was using their own skeleton key to the iPhone to dump data for the cops. This skeleton key is one of the very things that had weakened security so much that I was able to get in there in the first place through hacking! Certainly if I could do it, some Russian hackers getting bankrolled by Putin’s intelligence agencies could do it (they have better toys).
Today, there are a number of commercial solutions on the market that can penetrate and dump older iPhone models. Depending on the firmware version and hardware model, your mileage varies from “some of the user data” to “everything on the device”. These commercial “law enforcement” tools are sold to any country with a fat wallet, to non-law-enforcement entities such as large businesses, and ultimately to anyone with the money to buy them; they are also pirated widely in the criminal community.
Now consider that criminals, such as the ones who released all of those stolen nude images of celebrities, have the same law enforcement forensics tools that our police do – all of it made possible by the weakened state of security in previous versions of iOS. Is that a good thing? On the one hand, we can catch a child predator who only ever used his phone and somehow left no online trace. On the other, dozens of people’s privacy is violated in the most intimate way. Some of the victims’ GPS data was pulled from their stolen files and circulated among sex websites, so that sexual predators could physically stalk these women… so we’re now talking about more than the fappening; we’re talking about real sexual assaults possibly taking place against these women. Have we really improved anything by having weak security?
There is plenty of what people consider “good” resulting from being able to dump data from the iPhone. In my seven years working in iOS forensics, I’ve also seen a lot of bad – real crime that has resulted from Apple’s past design decisions. For this, I’m furious at the old Apple, for letting their ignorance (or arrogance, depending) make certain crimes, by design, so easy to pull off.
We’ve also seen, first hand, through the Snowden leaks and other leaks, just how abusive our own government is towards its own people, and how little certain agencies regard our constitutional rights. While I have great respect for many in law enforcement, I’ve also seen that position of authority abused in a number of ways. Sure, it’s a small percentage of bad apples, and most cops I’ve worked with are upstanding guys… but it still happens.
Consider now that laptops have full disk encryption, and are often protected with complex passwords… law enforcement (to my knowledge) doesn’t have a backdoor, and unless they can find a password, they might as well throw the drive out as evidence. Yet cases are still getting solved. Murderers and rapists are still getting convicted. There is such a mountain of peripheral evidence out there that only a small handful of cases are even likely to have the iPhone as the sole smoking gun to begin with. Cops have iCloud data, iCloud backups, call records, voicemail records, text messages from the carrier (if obtained within the retention period), Gmail, email, web logs, trap and trace, proxy logs, not to mention copies of data from other people involved or from the victims themselves, desktop backups (if available), sometimes even the desktop itself (as many criminals don’t use encryption at all). Add to that, they’re eavesdropping on the whole damn Internet. There is a mountain of data with which to pursue and investigate most cases. While it has happened, it’s been very rare that I’ve consulted on a case where the data on the iPhone was the sole smoking gun.
There’s been an air of laziness in the forensics community because of the great technical capabilities that LE has been given over the past several years. Many forensic examiners themselves admit that they think others in their field are lazy. You don’t have to do any police work in a lot of cases…just push a button, and find the penis. Literally, I’ve known some examiners who have called their job a game of “find the penis”. How sad.
Many of these cases will still get solved with good old fashioned detective work. In fact, the loss of these capabilities, for at least some examiners, is more like pulling a toy away from a child so that they have no choice but to get up and read a book. Much of the FBI’s crying is, in my opinion, complaining about the fact that they’ll have to work harder to do their job. Fine. These guys used to kick ass at what they did – without tools like this. I’m all for getting some of the fat ones who’ve spent too much time behind a desk back on the treadmill and out in the field.
We know it’s technically possible to install back doors … the real question is what level of privacy are you willing to give up, just to assist law enforcement? Should we allow the police to control every webcam in our house? Should we install cameras in our bedroom and allow law enforcement to record everything that goes on in every house, and only play it back if there’s a crime reported? (Do you really trust them that much? Because guys I know wouldn’t trust their fellow detectives/agents that much). Should we force glove manufacturers at Macy’s to install latent fingerprint contact paper inside all of the fingers to their gloves, so that criminals can’t conceal their fingerprints when committing a crime? There are certainly an infinite number of things that we can do to destroy our own privacy just for the sake of helping law enforcement with their cases. Unless you’re an insane human being, none of these sounds good to you. You probably understand that back-dooring your webcams also means that a criminal could find his way in (and many have). You probably understand that having a camera in your bedroom will lend you no privacy, and force you to live in fear of being embarrassed. You probably understand that it makes no sense to ban gloves, just for the sake of exposing everyone’s fingerprints to the cops. Nor does it make sense to give the cops all your safe combinations, keys to your house, or any other of the analogies that work here.
These very same arguments are all identical to the argument we are having now about security on the iPhone. The FBI is not asking for Apple to “cooperate” in investigations. The FBI, whether they realize it or not, is asking Apple to back-door their own products so that Apple can still let them in, and so that they don’t have to work so hard to do their jobs. Now there are some great guys at FBI, working hard – they are the ones who aren’t freaking out right now, because they are comfortable with their skill set, and know they can continue to beat the bad guy. The rest of the bureau could stand to learn that they’re not entitled to everything they ask for just to make their job easier.
The FBI is probably asking Apple to be willing to brute force their own PIN codes, or perhaps run a dictionary attack against passcodes, to break the encryption on their own devices. There’s really no other way to help the FBI at this point, other than to either back-door the iPhone or to weaken its encryption enough that it can be brute forced or attacked. Both of these choices hurt your privacy, and they hurt our national security. It would also be a PR disaster for Apple for us to learn that they were attacking their own customers’ data.
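Back-of-the-envelope arithmetic shows what such a brute-force attack would actually cost. The ~80 ms per-guess figure below is the per-attempt cost Apple has cited for its on-device key derivation; treat it as illustrative rather than a measurement, since real-world attacks also contend with escalating delays and wipe-after-ten-attempts settings:

```python
# How long does a worst-case brute force of the passcode keyspace take,
# assuming ~80 ms of key-derivation work per guess? (Illustrative figure.)
SECONDS_PER_GUESS = 0.08

def worst_case(keyspace: int) -> str:
    """Format the worst-case search time in a human-readable unit."""
    seconds = keyspace * SECONDS_PER_GUESS
    for unit, size in [("years", 31_557_600), ("days", 86_400),
                       ("hours", 3_600), ("minutes", 60)]:
        if seconds >= size:
            return f"{seconds / size:,.1f} {unit}"
    return f"{seconds:,.1f} seconds"

print("4-digit PIN:        ", worst_case(10 ** 4))   # roughly 13 minutes
print("6-digit PIN:        ", worst_case(10 ** 6))   # roughly 22 hours
print("6-char alphanumeric:", worst_case(62 ** 6))   # on the order of 144 years
```

The arithmetic is the whole point: a short numeric PIN falls quickly to this kind of attack, which is exactly why being asked to run one against your own customers is a weakening of the product, not a neutral favor.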
While perhaps a heartfelt utopian ideal, the fact is that demanding back doors for law enforcement is selfish. It’s selfish in that they want a backdoor to serve their own interests. Non-law-enforcement types, such as Orin Kerr, a law professor who wrote a piece in the Washington Post supporting FBI backdoors (and later changed his mind), are being selfish by demanding that others give up their privacy to make them feel safer. This is the absolute opposite of a society where law enforcement serves the public interest. What Kerr, and anyone supporting law enforcement back doors, really wants is a society that caters to their fears at the expense of others’ privacy.
While we individually choose to trust the law enforcement we come in contact with, government only works if we inherently and collectively distrust it at a public level. Our public policies and standards should distrust those we have put in a position to watch over us. Freedom only works when the power is balanced toward the citizens; providing the government with the ability to invade our 1st, 4th, and 5th Amendment rights is an invitation to lose control of our own freedom. Deep inside, this argument is not one of public safety, but a massive power grab away from the people’s right to privacy. Whether or not everyone involved realizes it, it is privacy itself that is at stake.
Our founding fathers were aware that distrusting government was essential to freedom, and that’s why they used encryption. In fact, because of their own example in concealing correspondence, one can make an even stronger case supporting encryption as an instrument of free speech. The Constitution is the highest law of the land – it’s above all other laws. Historically, our founding fathers guarded all instruments that protect our freedom as beyond the law’s reach: the press, firearms, assembly. These things provided information, teeth, and consensus. Modern encryption is just as essential to our freedom as firearms are; it is the teeth that guarantee our freedom of speech, and our freedom from fear to speak and communicate.
Encryption today is still just as vital to free speech as it was in the 1700s, and just as vital to freedom itself. What’s at stake here is so much bigger than solving a crime. The excuse of making us safer has been used for hundreds of years as a means to take away freedoms. Don’t be so naive as to think that this time, it’s any different.