TL;DR Notes on iOS 8 PIN / File System Crypto

Here’s iOS file system / PIN encryption as I understand it. I originally pastebin’d this, but folks thought it was worth keeping around. (Thanks to Andrey Belenko for his suggested edits.)

Block 0 of the NAND is used as effaceable storage and a series of encryption “lockers” are stored on it. This is the portion that gets wiped when a device is erased, as this is the base of the key hierarchy. These lockers are encrypted with a hardware key that is derived from a unique hardware id fused into the secure space of the chip (secure enclave, etc). Only the hardware AES routines have access to this key, and there is no known way to extract it without chip deconstruction.

One specific locker, called BAGI, contains an encryption key that encrypts what’s called the system keybag. The keybag contains a number of encryption “class keys” that ultimately protect files in the file system; they’re locked and unlocked at different times, depending on user activity. This lets developers choose if files should get locked when the device is locked, or stay unlocked after they enter their PIN, and so on. Every file on the file system has its own random file key, and that key is encrypted with a class key from the keybag. The keybag keys are encrypted with a combination of the key in the BAGI locker and the user’s PIN.

There’s another locker in the NAND holding what Apple calls the class 4 key, and what we call the Dkey. The Dkey is not encrypted with the user PIN, and in previous versions of iOS (prior to 8) it was used as the foundation for encryption of any files that were not specifically protected with “data protection”. Most of the file system at the time used the Dkey instead of a class key, by design. Because the PIN wasn’t involved in the crypto (as it is with the class keys in the keybag), anyone with root-level access (such as Apple) could easily open that Dkey locker, and therefore decrypt the vast majority of the file system that used it for encryption. The only files that were protected with the PIN up until iOS 8 were those with data protection explicitly enabled, which did not include a majority of Apple’s files storing personal data. In iOS 8, Apple finally pulled the rest of the file system out of the Dkey locker, and now virtually the entire file system uses class keys from the keybag that *are* protected with the user’s PIN. The hardware-accelerated AES crypto functions allow for very fast encryption and decryption of the entire disk, which has made this technologically possible since the 3GS; for no valid reason whatsoever (other than design decisions), however, Apple chose not to properly encrypt the file system until iOS 8.
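To make that chain a little more concrete, here is a rough, purely conceptual sketch of how the pieces fit together. The unwrap helper, dictionary layout, and PBKDF2 parameters below are stand-ins of my own, not Apple’s actual formats; only the general shape of the hierarchy follows the description above.

```python
# Conceptual sketch of the key hierarchy described above -- NOT Apple's actual
# implementation. aes_unwrap() is a placeholder for the hardware AES engine
# (and the fused UID key only it can use); the wrapping format, derivation
# parameters, and dictionary layout are all assumptions for illustration.
import hashlib

HARDWARE_UID_KEY = b"\x00" * 32   # fused into the SoC; software never sees the real one

def aes_unwrap(kek: bytes, wrapped: bytes) -> bytes:
    """Stand-in for an AES key-unwrap performed inside the hardware engine."""
    raise NotImplementedError("illustrative only")

def tangle_passcode(pin: str, salt: bytes, iterations: int = 50_000) -> bytes:
    # The passcode is stretched (and, on real hardware, entangled with the UID
    # so the derivation can't be offloaded to faster machines).
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, iterations, dklen=32)

def unwrap_file_key(effaceable: dict, keybag: dict, pin: str, file_meta: dict) -> bytes:
    bagi_key = aes_unwrap(HARDWARE_UID_KEY, effaceable["BAGI"])        # locker in NAND block 0
    passcode_key = tangle_passcode(pin, keybag["salt"])
    wrapped_class_key = keybag["class_keys"][file_meta["protection_class"]]
    class_key = aes_unwrap(passcode_key, aes_unwrap(bagi_key, wrapped_class_key))
    return aes_unwrap(class_key, file_meta["wrapped_file_key"])        # per-file random key
```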
Continue reading

Running Invisible in the Background in iOS 8

Since publishing my findings last July, a number of security improvements have been made in iOS 8. Many services that posed a threat to user privacy have since been closed off, and remain open only in beta versions of iOS. One small point I made in the paper was the threat that invisible software poses to the operating system:

“Malicious software does not require a device be jailbroken in order to run. … With the simple addition of an SBAppTags property to an application’s Info.plist (a required file containing descriptive tags identifying properties of the application), a developer can build an application to be hidden from the user’s GUI (SpringBoard). This can be done to a non-jailbroken device if the attacker has purchased a valid signing certificate from Apple. While advanced tools, such as Xcode, can detect the presence of such software, the application is invisible to the end-user’s GUI, as well as in iTunes. In addition to this, the capability exists of running an application in the background by masquerading as a VoIP client (How to maintain VOIP socket connection in background) or audio player (such as Pandora) by adding a specific UIBackgroundModes tag to the same property list file. These two features combined make for the perfect skeleton for virtually undetectable spyware that runs in the background.”

As of iOS 8, Apple has closed off the SBAppTags feature so that applications can no longer use it to hide themselves; however, it looks like there are still some ways to manipulate the operating system into hiding applications on the device. I have contacted Apple with the specific technical details, and they have assured me that the problem has been fixed in iOS 8.3. For now, however, it looks like iOS 8.2 and lower are still vulnerable to this attack. The attack allows software to be loaded onto a non-jailbroken device (which typically requires a valid pairing, or physical possession of the device) that runs in the background, invisible to the SpringBoard user interface.

The presence of a vulnerability such as this should heighten user awareness that invisible software may still be installed on a non-jailbroken device, and would be capable of gathering information that could be used to track the user over a period of time. If you suspect that malware may be running on your device, you can view software running invisibly with a copy of Xcode. Unlike the iPhone’s UI and iTunes, invisible software that is installed on the device will show up under Xcode’s device organizer.
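If you would rather not eyeball it in Xcode, the same red flags can also be checked directly in an app’s Info.plist. Below is a minimal sketch, assuming you have already pulled the .app bundle onto disk (for example, by unzipping the original .ipa); treat any SBAppTags entry as suspicious, whatever its value.

```python
# Quick triage of an app bundle's Info.plist for the red flags described above.
# Assumes the .app bundle is already on disk (e.g. unzipped from an .ipa).
import plistlib, sys
from pathlib import Path

def triage(app_bundle: str) -> None:
    info = plistlib.loads(Path(app_bundle, "Info.plist").read_bytes())
    tags = info.get("SBAppTags", [])
    modes = info.get("UIBackgroundModes", [])
    if tags:
        print(f"SBAppTags present (app may be hidden from SpringBoard): {tags}")
    if set(modes) & {"voip", "audio"}:
        print(f"Persistent background modes declared: {modes}")
    if tags and modes:
        print("Both flags set -- matches the invisible-spyware skeleton described above.")

if __name__ == "__main__":
    triage(sys.argv[1])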

Continue reading

Testing for the Strawhorse Backdoor in Xcode

In the previous blog post, I highlighted the latest Snowden documents, which reveal a CIA project out of Sandia National Laboratories to author a malicious version of Xcode. This Xcode malware targeted App Store developers by installing a backdoor on their computers to steal their private codesign keys.


So how do you test for a backdoor you’ve never seen before? By verifying that the security mechanisms it disables are working correctly. Based on the document, the malware apparently infects Apple’s securityd daemon to prevent it from warning the user prior to exporting developer keys:

“… which rewrites securityd so that no prompt appears when exporting a developer’s private key”

A good litmus test to see if securityd has been compromised in this way is to attempt to export your own developer keys and see if you are prompted for permission.
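For the command-line inclined, here is one way to run that litmus test. This is a sketch using the security(1) tool from memory, so verify the syntax against `man security` on your system; the passphrase and output path are throwaway placeholders.

```python
# A minimal sketch of the litmus test described above: try to export your
# codesigning identities and watch for the keychain permission prompt.
# If securityd has been tampered with as described, the export may complete
# silently with no prompt at all.
import subprocess

# List codesigning identities first, so you know what should be exportable.
subprocess.run(["security", "find-identity", "-v", "-p", "codesigning"], check=True)

# Attempt the export; on a healthy system this should raise a GUI prompt
# asking permission to export the private key from the login keychain.
subprocess.run([
    "security", "export",
    "-k", "login.keychain",
    "-t", "identities",
    "-f", "pkcs12",
    "-P", "test-passphrase",
    "-o", "/tmp/identities-test.p12",
], check=False)
```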

Continue reading

The Implications of CIA’s Jamboree

Early this morning, The Intercept posted several documents pertaining to CIA’s research into compromising iOS devices (along with other things) through Sandia National Laboratories, a major research and development contractor to the government. The documents outlined a number of project talks taking place at a closed government conference referred to as the Jamboree in 2012. The projects listed in the documents included the following.


Rocoto, a chip-like implant that would likely be soldered to the 30-pin connector on the main board, and act like a flasher box that performs the task of jailbreaking a device using existing public techniques. Once jailbroken, a chip like Rocoto could easily install and execute code on the device for persistent monitoring or other forms of surveillance. Upon firmware restore, a chip like Rocoto could simply re-jailbreak the device. Such an implant could have likely worked persistently on older devices (like the 3G mentioned), however the wording of the document (“we will discuss efforts”) suggests the implant was not complete at the time of the talk. This may, however, have later been adopted into the DROPOUTJEEP implant, which was portrayed as an operational product in the NSA’s catalog published several months ago. The DROPOUTJEEP project, however, claimed to be software-based, where Rocoto seems to have involved a physical chip implant.


Strawhorse, a malicious implementation of Xcode, where App Store developers (likely not suspected of any crimes) would be targeted, and their dev machines backdoored to give the CIA injection capabilities into compiled applications. The malicious Xcode variant was capable of stealing the developer’s private codesign keys, which would be smuggled out with compiled binaries. It would also disable securityd so that it would not warn the developer that this was happening. The stolen keys could later be used to inject and sign payloads into the developer’s own products without their permission or knowledge, which could then be widely disseminated through the App Store channels. This could include trojans or watermarks, as the document suggests. With the developer keys extracted, binary modifications could also be made at a later time, if such an injection framework existed.

In spite of what The Intercept wrote, there is no evidence that Strawhorse was slated for use en masse, or that it even reached an operational phase.

NOTE: At the time these documents were reportedly created, a vast majority of App Store developers were American citizens. Based on the wording of the document, this was still in the middle stages of development, and an injection mechanism (the complicated part) does not appear to have been developed yet, as there was no mention of it.

Continue reading

Superfish Spyware Also Available for iOS and Android

For those watching the Superfish debacle unfold, you may also be interested to note that Superfish has an app titled LikeThat available for iOS and Android. The app is a visual search tool apparently for finding furniture that you like (whatever). They also have other visual search apps for pets and other idiotic things, all of which seem to be quite popular. Taking a closer look at the application, it appears as though they also do quite a bit of application tracking, including reporting your device’s unique identifier back to an analytics company. They’ve also taken some rather sketchy approaches to how they handle photos so as to potentially preserve the EXIF data in them, which can include your GPS position and other information.

To get started, just taking a quick look at the binary using ‘strings’ can give you some sketchy information. Here are some of the URLs in the binary:
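(For anyone who wants to reproduce this kind of triage themselves, here is a minimal sketch. It assumes you are working from a decrypted copy of the app binary, since App Store binaries ship FairPlay-encrypted.)

```python
# Minimal reproduction of the strings(1)-style triage described above: pull
# printable runs out of the binary and keep anything that looks like a URL.
import re, sys

data = open(sys.argv[1], "rb").read()
printable = re.findall(rb"[\x20-\x7e]{6,}", data)            # same idea as strings(1)
urls = sorted({s.decode() for s in printable if re.search(rb"https?://", s)})
print("\n".join(urls))
```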

Continue reading

Pawn Storm Fact Check

Fortinet recently published a blog entry analyzing the Pawn Storm malware for iOS. There were some significant inaccuracies, however, and since Fortinet seems to be censoring website comments, I thought I’d post my critique here. A few points in the analysis were grossly inaccurate.

First, and of important note, is the researcher’s claim that the LSRequiresIPhoneOS property indicates that iPads are not targeted, and that the malware only runs on the iPhone. Anyone who understands the iOS environment knows that the LSRequiresIPhoneOS tag simply indicates that the application is an iOS application; this tag can be set to true, and an application can still support iPad and any other iOS-based devices (iPod, whatever). I mention this because anyone reading this article may assume that their iPad or iPod is not a potential target, and therefore never check it. If you suspect you could be a target of Pawn Storm, you should check all of your iOS-based devices.
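If you want to check this yourself rather than trust the tag, the key that actually indicates device support is UIDeviceFamily. A quick sketch against an app’s Info.plist follows; the family mapping (1 = iPhone/iPod touch, 2 = iPad) is the standard documented meaning, and the path is whatever bundle you have on disk.

```python
# LSRequiresIPhoneOS only marks the bundle as an iOS app; UIDeviceFamily is
# what indicates which devices it targets. Pass the path to an Info.plist.
import plistlib, sys

info = plistlib.loads(open(sys.argv[1], "rb").read())
families = {1: "iPhone/iPod touch", 2: "iPad"}
print("LSRequiresIPhoneOS:", info.get("LSRequiresIPhoneOS"))
print("Targets:", [families.get(f, f) for f in info.get("UIDeviceFamily", [1])])
```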

Second important thing to note: Most of the information the researcher claims the application gathers can only be gathered on jailbroken devices. This is because the jailbreak process in and of itself compromises Apple’s own sandbox in order to allow applications to continue to run correctly after Cydia has relocated crucial operating system files onto the user data partition. When running Cydia for the first time, several different folders get moved to the /var/stash folder on the user partition. Since this folder normally would not be accessible outside of Apple’s sandbox, the geniuses writing jailbreaks decided to break Apple’s sandbox so that you could run your bootleg versions of Angry Birds. Smart, huh?

Continue reading

What You Need to Know About WireLurker

Mobile security company Palo Alto Networks has released a new white paper titled WireLurker: A New Era in iOS and OS X Malware. I’ve gone through their findings, and also managed to get a hold of the WireLurker samples to examine them first-hand (thanks to Claud Xiao from Palo Alto Networks, who sent them to me). Here’s the quick and dirty about WireLurker: what you need to know, what it does, what it doesn’t do, and how to protect yourself.

How it Works

WireLurker is a trojan that has reportedly been circulated in a number of Chinese pirated software (warez) distributions. It targets 64-bit Mac OS X machines, as there doesn’t appear to be a 32-bit slice. When the user installs or runs the pirated software, WireLurker waits until it has root, and then gets installed into the operating system as a system daemon. The daemon, built on libimobiledevice, sits and waits for an iOS device to be connected to the desktop, then abuses the trusted pairing relationship your desktop has with the device to read its serial number, phone number, iTunes Store identifier, and other identifying information, which it then sends to a remote server. It also attempts to install malicious copies of otherwise benign-looking apps onto the device itself. If the device is jailbroken and has afc2 enabled, a much more malicious piece of software gets installed onto the device, which reads and extracts identifying information from your iMessage history, address book, and other files on the device.
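To get a sense of how much a trusted pairing gives up, here is a benign sketch of the read-only half of what the daemon does, using the same libimobiledevice tooling (it assumes the libimobiledevice command-line tools are installed on the desktop).

```python
# Illustration of what a trusted pairing exposes: once your desktop holds a
# valid pairing record, any process running on it can quietly query the
# connected device for identifying information, no prompt on the phone.
import subprocess

def lockdown_value(udid: str, key: str) -> str:
    out = subprocess.run(["ideviceinfo", "-u", udid, "-k", key],
                         capture_output=True, text=True)
    return out.stdout.strip()

devices = subprocess.run(["idevice_id", "-l"], capture_output=True, text=True)
for udid in devices.stdout.split():
    for key in ("SerialNumber", "PhoneNumber", "UniqueDeviceID", "DeviceName"):
        print(udid, key, lockdown_value(udid, key))
```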

WireLurker appears to be most concerned with identifying the device owners, rather than stealing a significant amount of content or performing destructive actions on the device. In other words, WireLurker seems to be targeting the identities of Chinese software pirates.

Continue reading

Preliminary Findings on Whisper

At the suggestion of @kashhill, I did a brief analysis of the Whisper iOS application, which appears to be at the height of controversy with respect to user privacy. My preliminary observations follow. Note that I am only looking at the technical aspects of the application, and make no political conclusions about the motivations of the company. I do not see any horribly underhanded malicious code in the application, although it is a large application and my analysis was brief. That said, the Whisper app does not appear to be a social networking application with analytics; it appears to be an analytics and user-acquisition application that also happens to have a social networking component. With this come a few concerns about privacy and anonymity.

Continue reading

Disk Analyzer: Zero Free Space on Your iOS Device


Interested in the low-level statistics of your iOS device’s disk, such as inode consumption and other file system metrics? Disk Analyzer allows you to view and work with your device’s used and free space and partition statistics. This simple little tool provides all the information about your device’s disk in a clean, user-friendly display. An ideal tool for businesses and enterprises.

In addition to analyzing your disk space, Disk Analyzer provides an advanced tool that can overwrite the free space on your device. Turn on Advanced Options in Settings to activate this feature, and a “Zero Free Space” button will appear in the application.
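For the curious, the general idea behind a free-space wipe is simple enough to sketch. This is not Disk Analyzer’s actual code, just the textbook approach on any writable volume: fill the free space with zeroes until the volume is full, then delete the filler.

```python
# The general idea behind a "zero free space" pass (not Disk Analyzer's
# implementation): write zero-filled data until the volume reports ENOSPC,
# then delete the filler. Blocks that previously held deleted file contents
# get overwritten along the way.
import errno, os, tempfile

def zero_free_space(path: str, chunk: int = 8 * 1024 * 1024) -> None:
    filler = tempfile.NamedTemporaryFile(dir=path, delete=False)
    try:
        while True:
            filler.write(b"\x00" * chunk)
            filler.flush()
    except OSError as e:
        if e.errno != errno.ENOSPC:
            raise
    finally:
        filler.close()
        os.unlink(filler.name)   # free the space again once it has been overwritten
```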

Now Available! Click Here to view in iTunes

How App Store Apps are Hacked on Non-Jailbroken Phones

(And Why Self-Expiring Messaging Apps Aren’t Trustworthy)

This brief post will show you how hackers are able to download an App Store application, patch the binary, and upload it to a non-jailbroken device using its original App ID, without the device being aware that anything is amiss – this can be done with a $99 developer certificate from Apple and [optionally] an $89 disassembler. Also, with a $299 enterprise enrollment, a modified application can be loaded onto any iOS device, without first registering its UDID (great for black bag jobs and the intelligence community).

Now, it’s been known for quite some time in the iPhone development community that you can sign application binaries using your own dev certificate. Nobody’s taken the time to write up exactly how people are doing this, so I thought I would explain it. This isn’t considered a security vulnerability, although it could certainly be used to load a malicious copycat application onto someone’s iPhone (with physical access). This is more a byproduct of developer signing rights on a device, after it’s been enabled with a custom developer profile. What this should be is a lesson to developers (such as Snapchat, and others who rely on client-side logic) that the client application cannot be trusted for critical program logic. What does this mean for non-technical readers? In plain English, it means that Snapchat, as well as any other self-expiring messaging app in the App Store, can be hacked (by the recipient) to not expire the photos and messages you send them. This should be a no-brainer, but it seems there is a lot of confusion about this, hence the technical explanation.
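For the technically inclined, the general workflow looks something like the sketch below. The identity name and paths are placeholders, the binary patching step is omitted, and a real run also needs your own provisioning profile and entitlements swapped in before the app will install on a device.

```python
# The general re-signing workflow described above, as a minimal sketch.
import glob, shutil, subprocess, zipfile

IPA, WORKDIR, IDENTITY = "target.ipa", "unpacked", "iPhone Developer: Your Name"

with zipfile.ZipFile(IPA) as z:            # 1. unpack the .ipa (it is just a zip)
    z.extractall(WORKDIR)

# 2. patch the (decrypted) binary inside Payload/<App>.app/ with your
#    disassembler of choice -- omitted here.

app = glob.glob(f"{WORKDIR}/Payload/*.app")[0]
subprocess.run(["codesign", "-f", "-s", IDENTITY, app], check=True)   # 3. re-sign

shutil.make_archive("resigned", "zip", WORKDIR)                        # 4. repack
shutil.move("resigned.zip", "resigned.ipa")
# 5. deploy to a device provisioned for your certificate (e.g. via Xcode).
```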

For a developer, putting your access control on the client side is taboo. Most developers understand that applications can be “hacked” on jailbroken devices to manipulate the program, but very few realize it can be done on non-jailbroken devices too. There are numerous jailbreak tweaks to get unlimited skips in Pandora, prevent Snapchat messages from expiring, and even add favorites to your mentions in TweetBot. The ability to hack applications is why (the good) applications do it all server-side. Certain types of apps, however, are designed in such a way that they depend on client logic to enforce access controls. Take Snapchat, for example, whose expiring messages require that the client make photos inaccessible after a certain period of time. These types of applications put the end-user at risk in the sense that they are more likely to send compromising content to a party that they don’t necessarily trust – thinking, at least, that the message has to expire.

Continue reading

Private Photo Vault: Not So Private

One of the most popular App Store applications, Private Photo Vault (Ultimate Photo+Video Manager), claims over 3 million users and that your photos are “100% private”. The application, however, stores its data files without any additional protection or encryption beyond that of any other file stored on the iPhone. With access to an unlocked device, a pair record from a seized desktop machine, or possibly even just a copy of a desktop or iCloud backup, all of the user’s stored images and video can be recovered and read in cleartext.
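A rough sketch of what “no additional protection” means in practice: once you’ve extracted the app’s container (from an unencrypted backup, or by using a pair record), the stored media carves right out by magic bytes. The path argument here is a placeholder for wherever you extracted the container.

```python
# Walk an extracted app container and flag cleartext images by magic bytes.
import sys
from pathlib import Path

MAGIC = {b"\xff\xd8\xff": ".jpg", b"\x89PNG": ".png"}

for p in Path(sys.argv[1]).rglob("*"):
    if not p.is_file():
        continue
    head = p.open("rb").read(4)
    for magic, ext in MAGIC.items():
        if head.startswith(magic):
            print(f"cleartext image: {p} ({ext})")
```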

Continue reading

Counter-Forensics: Pair-Lock Your Device with Apple’s Configurator

Last updated for iOS 8 on September 28, 2014

As it turns out, the same mechanism that provided iOS 7 with a potential back door can also be used to help secure your iOS 7 or 8 device should it ever fall into the wrong hands. This article is a brief how-to on using Apple’s Configurator utility to lock your device down so that no other devices can pair with it, even if you leave your device unlocked, or are compelled into unlocking it yourself with a passcode or a fingerprint. By pair-locking your device, you’re effectively disabling every logical forensics tool on the market by preventing it from talking to your iOS device, at least without first being able to undo this lock with pairing records from your desktop machine. This is a great technique for protecting your device from nosy coworkers, or cops in some states that have started grabbing your call history at traffic stops.

With iOS 8’s new encryption changes, Apple will no longer service law enforcement warrants, meaning that logical acquisition through a valid pairing is now one of just a few reliable ways left to dump forensic data from your device (which often contains deleted records and much more than you see on the screen). Whatever the reason, pair-locking will likely leave the person dumbfounded as to why their program doesn’t work, and you can easily just play dumb while trying not to snicker. This is an important step if you are a journalist, diplomat, security researcher, or another type of individual that may be targeted by a hostile foreign government. It also helps protect you legally, so that you don’t have to be put in contempt of court for refusing to turn over your PIN. The best thing about this technique is that, unlike my previous pairlock technique, it doesn’t require jailbreaking your phone. You can do it right now with that shiny new device.
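Once you’ve pair-locked the device, you can sanity-check the result from a second machine (one that has never paired with it) using the libimobiledevice tools; on a properly locked device, the fresh pairing attempt below should simply fail instead of popping the “Trust This Computer?” dialog.

```python
# Sanity-check a pair-locked device from a machine that has never paired with it.
import subprocess

for cmd in (["idevicepair", "list"],        # pairing records this host knows about
            ["idevicepair", "validate"],    # is there a valid pairing with the device?
            ["idevicepair", "pair"]):       # attempt a fresh pairing -- should fail
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(" ".join(cmd), "->", (result.stdout or result.stderr).strip())
```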

Continue reading

How to Help Secure Your iPhone From Government Intrusions

There’s been a lot of confusion about Apple’s recent statements in protecting iOS 8 data, supposedly stifling law enforcement’s ability to do their job. FBI boss James Comey has publicly criticized Apple, and essentially blamed them for the next hundred children who get kidnapped. While Apple’s new security improvements have made it a lot harder to get to certain types of data, it’s important to note that there are still a number of techniques that can be employed against iOS 8, with varying levels of success. Most of these are techniques that law enforcement is already doing. Some are part of commercial forensics tools such as Oxygen and Cellebrite. The FBI is undoubtedly aware of them. I’ll outline some of the most common ones here.

I’ve included some tips for those of us who are concerned about data security. Security researchers, journalists, law-abiding activists, diplomats, and many other types of high-profile individuals should all be practicing good data security, especially when abroad. Foreign governments are just as capable of performing the same forensics techniques as our own government, and there is an overwhelming amount of information suggesting that all of these classes of individuals have been targeted by foreign governments.

Continue reading

The Politics Behind iPhone Encryption and the FBI

Apple’s new policy about law enforcement is ruffling some feathers with the FBI, and has been a point of debate among the rest of us. It has become a debate because it’s been viewed as just that – a policy – rather than what it really is, which is a design change. With iOS 8, Apple has finally brought their operating system up to what most experts would consider “acceptable security”. My tone here suggests that I’m saying all prior versions of iOS had substandard security – that’s exactly what I’m saying. I’ve been hacking on the phone since they first came out in 2007. Since the iPhone first came out, Apple’s data security has had a dismal track record. Even as recently as iOS 7, Apple’s file system left almost all user data inadequately encrypted (or protected), and often riddled with holes – or even services that dished up your data to anyone who knew how to ask. Today, what you see happening with iOS 8 is a major improvement in security, by employing proper encryption to protect data at rest. Encryption, unlike people, knows no politics. It knows no policy. It doesn’t care if you’re law enforcement, or a criminal. Encryption, when implemented properly, is indiscriminate about who it’s protecting your data from. It just protects it. That is key to security.

Up until iOS 8, Apple’s encryption didn’t adequately protect users because it wasn’t designed properly (in my expert opinion). Apple relied, instead, on the operating system to protect user data, and that allowed law enforcement to force Apple to dump what amounted to almost all of the user data from any device – because it was technically feasible, and there was nobody to stop them from doing it. From iOS 7 and back, the user data stored on the iPhone was not encrypted with a key that was derived from the user’s passcode. Instead, it was protected with a key derived from the device’s hardware… which is as good as having no key at all. Once you booted up any device running iOS 7 or older, much of that user data could be immediately decrypted in memory, allowing Apple to dump it and provide a tidy disk image to the police. Incidentally, it also allowed a number of hackers (including criminals) to read it.

Continue reading

iOS 8 Protection Mode Bug: Some User Files At Risk of Exposure

Apple’s recent security announcement suggested that they no longer have the ability to dump your content from iOS 8 devices:

“On devices running iOS 8, your personal data such as photos, messages (including attachments), email, contacts, call history, iTunes content, notes, and reminders is placed under the protection of your passcode. Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data. So it’s not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8.”

It looks like there are some glitches in this new encryption scheme, however, and some of the files being stored on your iOS 8 device are not getting encrypted in this way. If you copy files over to your device using iTunes’ “File Sharing” feature or sync videos that appear in the “Home Videos” section of iOS, these files are not getting placed under the protection of your passcode. Theoretically, Apple could dump these in Cupertino, if given your locked iPhone.

Continue reading

Your iOS 8 Data is Not Beyond Law Enforcement’s Reach… Yet.

In a recent announcement, Apple stated that they no longer unlock iOS (8) devices for law enforcement.

“On devices running iOS 8, your personal data such as photos, messages (including attachments), email, contacts, call history, iTunes content, notes, and reminders is placed under the protection of your passcode. Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data. So it’s not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8.”

This is a significantly pro-privacy (and courageous) posture Apple is taking with their devices, and while about seven years late, is more than welcome. In fact, I am very impressed with Apple’s latest efforts to beef up security all around, including iOS 8 and iCloud’s new 2FA. I believe Tim Cook to be genuine in his commitment to user privacy; perhaps I’m one of the few who can see just how gutsy this move with iOS 8 is.

It’s important to take a minute, however, to note that this does not mean that the police can’t get to your data. What Apple has done here is create for themselves plausible deniability in what they will do for law enforcement. If we take this statement at face value, what has likely happened in iOS 8 is that photos, messages, and other sensitive data, which was previously only encrypted with hardware-based keys, is now being encrypted with keys derived from a PIN or passcode. No doubt this does improve security for everyone, by marrying encryption to the PIN (something they ought to have been doing all along). While it’s technically possible to brute force a PIN code, that doesn’t mean it’s technically feasible, and thus lets Apple off the hook in terms of legal obligation. Add a complex passcode into the mix, and it gets even uglier, forcing an attacker to choose from a number of dictionary-style attacks to get into your encrypted data. By redesigning the file system in this fashion (if this is the case), Apple has afforded themselves the ability to say, “the phone’s data is encrypted with a PIN or passphrase, and so we’re not legally required to hack it for you guys, so go pound sand”. I am quite impressed, Mr. Cook! That took courage… but it does not mean that your data is beyond law enforcement’s reach.
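Some rough, back-of-the-envelope arithmetic illustrates the possible-versus-feasible point. This assumes roughly 80 milliseconds per passcode guess (the ballpark Apple describes for its passcode key derivation), and ignores the escalating lockout delays and the optional erase-after-ten-failures setting, both of which make things far worse for the attacker.

```python
# Worst-case on-device brute-force time at an assumed ~80 ms per guess.
SECONDS_PER_GUESS = 0.080

def worst_case(keyspace: int) -> str:
    seconds = keyspace * SECONDS_PER_GUESS
    return f"{seconds / 3600:.1f} hours ({seconds / 86400 / 365:.2f} years)"

print("4-digit PIN:        ", worst_case(10 ** 4))    # roughly 13 minutes
print("6-digit PIN:        ", worst_case(10 ** 6))    # roughly 22 hours
print("6-char a-z0-9 code: ", worst_case(36 ** 6))    # roughly 5.5 years
```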

Continue reading

An Open Letter to Tim Cook and Apple’s Security Team


You may not know me, but you probably know my research over the years. I’ve been researching security on Apple devices since 2007, when the iPhone first came out, and even helped put together the very first jailbreaks. I’ve assisted law enforcement and military with forensics tools and support on iDevices, and had already started helping to make our world a much better place before Apple even had a law enforcement process. Additionally, I’ve written several books on iPhone ranging from development, to security, to forensics. Throughout my time researching Apple, I’ve found many vulnerabilities that affect the privacy of your customers (including me!), and have presented findings at numerous security and forensics conferences, including Black Hat, Hackers on Planet Earth (HOPE), Mobile Forensics World, Techno Security, HTCIA, and others. I never asked you to feature my books in your store (even when mine were the only iPhone books), never asked for free products or invites to anything, and never felt entitled to anything. I love Apple products, and that’s why it’s been a fun experience to tinker with them, and it feels good to know that I’ve played a small, but consistent role in seeing their security improve over time.

You know what’s not fun? When I work very hard on a research paper, go to the trouble of submitting it to a scientific journal, and pay out of my own pocket to travel to a conference to present my findings, only to have Apple silently sweep the vulnerabilities I’ve discovered under the rug without ever disclosing their existence, the patches you’ve made, or giving the researcher proper credit in your security release notes. Today, you released your security notes for iOS 8, and guess what wasn’t in them? Almost all of the things you fixed in Beta 5, which came directly from my research paper. Shortly after my research made national news, Apple fixed a number of these serious vulnerabilities that – at best – were the product of horribly sloppy engineering. Not small issues, either, mind you – issues that allowed for persistent wireless surveillance of iOS devices, wireless interception of packet data, and bypassing of the consumer’s backup encryption password to scrape highly sensitive consumer data (including SMS, the photo album, the geolocation database, and more) from the device, using a number of undisclosed services that Apple had never told the public even existed and that were running on all 600 million consumer devices, in spite of the fact that numerous commercial law enforcement forensics tools were actively exploiting these services to dump highly sensitive content from consumers’ mobile devices.

Continue reading

Is Apple’s new 2FA Really Secure? (Answer: It’s Pretty Solid)

I’ve recently updated my TL;DR regarding the recent celebrity iCloud hacks; it now summarizes Apple’s latest changes to improve their two-factor authentication (2FA). Apple has implemented not just a band-aid, but a very good security solution to protect iCloud accounts, by completely reinventing their own 2-step validation (sorry, I couldn’t resist). As a result, users who have activated this feature will need to provide a one-time validation code in order to access their iCloud account from a web browser, or to provision iCloud from an iOS device. As my TL;DR suggests, this new technical measure would have prevented the celebrity iCloud hacks. So are Apple’s new techniques really secure, even in light of the very technically un-savvy users who fall victim to iCloud phishing attacks?

While Apple has done their part to improve the security of iCloud, less-than-savvy users can still screw it up; first of all, by not having the feature turned on in the first place. Apple’s two-step validation process is opt-in, and therefore it’s important to make sure that users know about and understand the benefits of enabling this feature. In my opinion, Apple should force users to have this feature on if they enable Photo Stream or iCloud Backups, as they are likely to keep sensitive content in the cloud without necessarily knowing it.

So you’re more savvy than that. You’ve already activated the new 2FA on your iCloud account. Are you truly safe from future phishing attacks?

Continue reading

Apple Should Have Abandoned NFC and Acquired LoopPay Instead

Is it OK to admit that NFC exists now? Apple’s latest iPhone models now incorporate the near-field communications technology that’s been around in Android phones for a few years… and a little too late, according to many experts. Over a year ago, KPMG ran a story arguing that NFC had already run its course and was obsolete, lacking widespread adoption in the mobile industry (ironically, they removed this story after the iPhone 6 launch). Companies like PayPal have also tabled the idea of NFC and instead focused on convincing businesses to accept non-POS forms of electronic payments. Amid the widespread abandonment of NFC, in fact, many new and promising technologies have crept up in its place. Apple’s move to take this dinosaur and incorporate it into their bleeding-edge line of products was an antiquated one in light of what they could have done, and of the convenience they could have provided to consumers had they instead looked at alternative technologies.

Continue reading

Apple Addresses iOS Surveillance and Forensics Vulnerabilities

After some preliminary testing, it appears that a number of vulnerabilities reported in my recent research paper and subsequent talk at HOPE/X have been addressed by Apple in iOS 8. The research outlined a number of risks of wireless remote surveillance, deep logical forensics, and other types of potential privacy intrusion fitting certain threat models, such as those of high-profile diplomats or celebrities, targets of surveillance, or similar.

Given that Apple has dropped the NDA for iOS 8, it appears that I can write freely about the improvements they’ve made to address the vulnerabilities I’ve outlined in my paper. Here’s a summary of what’s been fixed, what risks still remain, and some steps you can take to help protect the data on your device.

Continue reading