As I sit here, trying to be the good parent by reviewing my daughter’s text messages, I have to assume my kids are miserable having a father with DFIR skills. I’m able to dump her device wirelessly using a pairing record and a tool I designed to talk to the phone: I generate a wireless desktop backup, decrypt it, and this somehow makes my kids safer. What bothers me, as a parent, is the incredible trove of forensic artifacts I find in my children’s data every month: deleted messages, geolocation information, even drafts and thumbnails that had all been deleted months ago. Sometimes thousands of messages. In 2008, I wrote a small book with O’Reilly titled *iPhone Forensics* that detailed this forensic mess. Apple has made some improvements since then, but the overall situation has gotten worse. The biggest reason the iPhone is so heavily targeted for forensics and intelligence is the very wide data recovery surface it provides: it’s a forensics gold mine.
To Apple’s credit, they’ve done a stellar job of improving the security of iOS devices in general (that is, the “front door”), but they, just like every other manufacturer, are still dancing on the lip of the volcano. Aside from the risk of my device being hacked, there’s a much greater issue at work here. Whether it’s a search warrant, an unlawful traffic stop in Michigan, someone stealing my backups, an ex-lover, or just leaving my phone unlocked at my desk by accident, it only takes a single violation of the user’s privacy to obtain an entire lengthy history of private information that was thought deleted. The same problem is also prevalent on desktop OS X.
While Apple’s file system encryption does a good job of ensuring that deleted files actually get deleted (with only a few minor exceptions), most of your private data lives in databases that are never deleted (unless you blow away your device’s data). Your iMessage content, notes, WiFi history, location history, call list, contacts, and much of the data stored by your third-party apps are all kept as records on the device, inside a container (a SQLite database) that doesn’t get deleted. When you delete an iMessage, the message is only flagged for deletion and hangs out on a free list inside the database, sometimes for months. Even worse, deleting an entire conversation has been shown to insert new record data, leaving a forensic trace that the conversation was deleted. In short, there’s a big problem with forensics exposure in iOS, and it’s not the file system; it’s the database storage.
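You can see this lingering behavior for yourself with any stock SQLite build. Here’s a toy illustration in Python’s built-in `sqlite3` module (the table name and message text are made up for the demo, not the real `sms.db` schema):

```python
import os
import sqlite3
import tempfile

# Create a throwaway database simulating a messaging app's store.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
con = sqlite3.connect(path)
con.execute("PRAGMA secure_delete = OFF")  # SQLite's default behavior
con.execute("CREATE TABLE message (id INTEGER PRIMARY KEY, text TEXT)")
con.execute("INSERT INTO message (text) VALUES (?)",
            ("meet me at the pier at noon",))
con.commit()

# "Delete" the message the way an app would.
con.execute("DELETE FROM message")
con.commit()
con.close()

# The row no longer appears in any query, but its bytes linger in the
# database file's free space until the page is reused or vacuumed.
raw = open(path, "rb").read()
print(b"meet me at the pier" in raw)  # the "deleted" text is still there
```

A forensic examiner doesn’t need the database API at all: carving the raw file is enough to recover the “deleted” row.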
This is a problem Apple should address, especially given how popular the iPhone has become, and how much content is being shoved into these databases by both Apple and third-party app developers. So how can Apple fix it? They already invented the solution. The file system encryption in iOS creates a random key for every file on the device, then wraps that key (using something called an AES key wrap) to protect the file record. Those wrapped keys are encrypted with class keys from the device’s keybag, which are controlled by device policies and protected by the user’s passcode. The same design can be ported to SQLite databases in a way that doesn’t chew up the NAND any more than the file system does. Here’s how it would work:

- Generate a random key for every record (or group of records) written to the database, just as a random key is generated for every file.
- Encrypt the record’s content with that key, then wrap the key with the appropriate class key from the device’s keybag.
- Store the wrapped key alongside the record.
- When the record is deleted, overwrite the wrapped key; without it, the record’s content is unrecoverable even if its ciphertext lingers on the free list.
This would allow SQLite to work the same way it always has, without forcing you to vacuum or overwrite every deleted record: all you have to do is overwrite the key. That’s an easy process with SQLite, and it could protect every single database record on the device.
Such a technique could be made completely transparent to third-party apps (or at least to Apple’s own apps) by integrating it within libsqlite, or it could be exposed through the iOS SDK. The former would benefit the user’s privacy the most, but may present backward compatibility issues, since third-party apps don’t have data migrators the way Apple’s apps do. Either way, it’s doable and necessary.
Apple has worked very hard to reduce the iPhone’s attack surface, but they haven’t yet fully addressed the underlying motivations of an attacker (specifically, the device’s forensic value), and that has left the iPhone a very high value target. This is the oldest and hardest challenge in the book: making sure that deleted data actually gets deleted. Conversations are ephemeral, but the traces of those conversations are not; this directly affects how and why search warrants are executed and why mobile devices are targeted by attackers. If the user believes a conversation to be deleted, keeping forensic traces of it breaks their trust, and the device’s design can ultimately betray the user’s privacy if data is stolen or a forensic image is made. Ephemeral conversations (or other exchanges) should also mean ephemeral data. This is not a strength of iOS today.
Shoring up privacy on the iPhone would certainly make my job harder as a parent… but I’d be happy knowing that my child’s past conversations, location information, and other private data aren’t exposed to a stalker or a hacker if the information ever got out. I think that’s a very reasonable trade-off.