Zdziarski — DFIR, security, reverse engineering, photography, theology, funky bass guitar. All opinions are my own.
Apple . Forensics . Security

Apple Confirms “Backdoors”; Downplays Their Severity

On July 23, 2014 by Jonathan Zdziarski

Apple responded to allegations of hidden services running on iOS devices with this knowledge base article. In it, they outlined three of the big services that I covered in my talk. So again, Apple has, in a traditional engineering sense, admitted to having backdoors on the device, specifically for their own use.

A backdoor simply means an undisclosed mechanism that bypasses some of the front-end security to make access easier for whoever it was designed for (OWASP has a great presentation on backdoors, where they are defined like this). It’s an engineering term, not a Hollywood term. In the case of file relay (the biggest undisclosed service I’ve been barking about), backup encryption is bypassed, as are basic file system and sandbox permissions, and a separate interface exists simply to copy a number of different classes of files off the device upon request; something that iTunes (and end users) never even touch. In other words, this is completely separate from the normal interfaces on the device that end users talk to through iTunes or even Xcode. Some of the data Apple can get is data the user can’t even get off the device, such as the user’s photo album that’s synced from a desktop, screenshots of the user’s activity, geolocation data, and other privileged personal information that the device protects even its own users from accessing. This weakens privacy by completely bypassing the end-user backup encryption that consumers rely on to protect their data, and it gives the customer a false sense of security, believing their personal data will be encrypted if it ever comes off the device.

Perhaps people misunderstand the term “backdoor” due to the stigma Hollywood has given it, taking it to mean a conspiracy with the NSA. I have never accused these hidden access methods of being intended for anything malicious, or for government use, and I’ve made repeated statements that I haven’t accused Apple of working with the NSA. Sure, that’s possible… but there’s no evidence of it. That doesn’t mean, however, that our government or a foreign government can’t take advantage of these backdoors to access the same information. This affects you on your local networks, and especially when traveling internationally, going through customs, or anywhere else someone may have a privileged position to access your devices. The threat model I used in my paper was a government entity that has been found (via leaked documents) to penetrate a target’s desktop computer and then use that privileged position to penetrate the target’s iPhone, by enabling 38 features that very closely resemble those of file relay. This all happened long before iOS 7’s trust dialog, when you gave any hardware you plugged in full access to your device simply by plugging it in, whether it was a malicious charger, an alarm clock, or anything else. Other threat models that work here involve people you know (but don’t necessarily trust), or people you don’t know who may target you. The threat model everyone is paranoid about is the “complete stranger” scenario: someone stealing your locked device. That model does NOT work here, and it is one of the big reasons I’ve been telling people not to panic from the get-go. Still, the others are quite plausible. The infosec community has maintained a level-headed but serious posture with regard to this research; the only ones discounting it are people approaching this from an IT perspective who don’t understand threats, persistence, and the underlying technology we’re talking about.
What concerns me the most is that Apple appears to be completely misrepresenting some of these services (especially file relay), and not addressing the issues I raised about others.

Let’s start with pcapd. I mentioned in my talk that pcapd has many legitimate uses, and I have no qualms with Apple using pcapd to troubleshoot issues on users’ devices. Packet capture has been documented for developers for a couple of years, but there was no explanation for its presence on every device that wasn’t in developer mode. The problem I have is with its implementation. In iOS, pcapd is available on every iOS device out there, and can be activated on any device without the user’s knowledge. You don’t have to be enrolled in an enterprise policy, and you don’t have to be in developer mode. What makes this service dangerous is that it can be activated wirelessly, and does not ask the user for permission to activate it… so it can be employed for snooping by third parties in a privileged position.
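For context on what a service like pcapd hands over: packet captures in the standard libpcap format, which any off-the-shelf tool can read. The sketch below is not Apple code; it just builds and parses a pcap global header per the public libpcap file-format spec, to show how little stands between a capture stream and readable traffic.

```python
import struct

# libpcap global file header (public format, not Apple-specific):
# magic, version major/minor, timezone offset, sigfigs, snaplen, linktype
PCAP_MAGIC = 0xA1B2C3D4
LINKTYPE_ETHERNET = 1

# Build a little-endian header like the one at the front of any capture.
header = struct.pack("<IHHiIII", PCAP_MAGIC, 2, 4, 0, 0, 65535, LINKTYPE_ETHERNET)

# Parsing it back is all a consumer needs to start decoding packets.
magic, major, minor, _tz, _sigfigs, snaplen, linktype = struct.unpack("<IHHiIII", header)
print(hex(magic), f"v{major}.{minor}", snaplen, linktype)
```

Everything after this 24-byte header is raw packet records, which is why a capture quietly taken from a device is immediately useful to whoever receives it.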

Now let’s talk about file relay. If, by diagnostic data, you mean the user’s complete photo album, their SMS, Notes, Address Book, geolocation data, screenshots of the last thing they were looking at, and a ton of other personal data – then sure… but this data is far too personal in nature to ever justify diagnostics. In fact, diagnostics is almost the complete opposite of this kind of data. You will find some diagnostic data in the mix somewhere, but this service goes way beyond the data Apple has a need or a right to look through. And once again, the user is never prompted to give their permission to dump all of this data, or notified in any way on-screen. Apple insists AppleCare gets your consent, but this must be verbal consent, as it is certainly not technological consent. What’s more, if this service really is just for diagnostic use, you’d think that it would respect backup encryption, so that everything coming off the phone is encrypted with the user’s backup password. When I take my laptop to Apple for repairs, I have to provide the password. But Apple seems to have admitted to the mechanics behind file relay, which skip around backup encryption to get to much the same data. In addition to this, it can be dumped wirelessly, without the user’s knowledge. So why does this need to be the case? It doesn’t. File relay is far too sloppy with personal data, and serves up a lot more than “diagnostics” data.
Lastly, house arrest. I have no qualms with this either; in fact, iTunes and Xcode do use this service to access the documents inside a user’s sandbox, as I mentioned in my talk. As I also mentioned, however, it can be used to access the stateful information on the device that should never come off the phone – Library, Caches, Preferences, etc. This is where most of the personal data from every application is stored, including OAuth tokens (which are just as good as having the passwords to your accounts), private conversations, friends lists, and other highly personal data. The interface is wide open to all of this – far beyond just the “Documents” folder that iTunes needs to access new Pages files. This is not a backdoor, but rather privileged access that really doesn’t need to be there (or at least could be engineered differently).

Now consider data protection. The pairing record that is used to access all of this data includes an escrow bag, which contains a backup copy of your keybag keys for unlocking data protection encryption. So again, we’re back to the fact that with any valid pairing, you have access to all of this personal data – whether that was Apple’s intention or not – and many adversaries are taking advantage of this.
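To make the escrow-bag point concrete, here is a minimal sketch. The field names (HostID, EscrowBag, etc.) are modeled on publicly observed desktop pairing records, which are stored as property lists; the values are made up, and this is an illustration, not Apple’s actual format. The point: whoever can read this one file off a paired host holds the material needed to talk to the device’s services.

```python
import plistlib

# Hypothetical pairing record, modeled on the plist pairing records a
# desktop stores after trusting a device. Field names reflect publicly
# observed records; all values here are dummies for illustration.
pairing_record = {
    "HostID": "A1B2C3D4-0000-0000-0000-000000000000",    # identifies the paired host
    "SystemBUID": "FFFFFFFF-0000-0000-0000-000000000000",
    "HostCertificate": b"-----BEGIN CERTIFICATE-----...",
    "RootCertificate": b"-----BEGIN CERTIFICATE-----...",
    "EscrowBag": b"\x00" * 16,  # backup copy of keybag keys (dummy bytes)
}

blob = plistlib.dumps(pairing_record)  # roughly what sits on disk on the host

# Anyone who copies that one file parses it trivially; no PIN and no
# backup password stand between them and the escrow bag.
stolen = plistlib.loads(blob)
print("EscrowBag" in stolen)  # prints True
```

That is why “copying one file from a trusted machine” keeps coming up in my threat models: the record is a bearer credential, and it never expires on its own.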

Now I hear the argument pop up from a few people who don’t understand how all of this works: “of course you can dump personal info after you’ve paired; it’s supposed to sync your data.” Well, no. The trust dialog (the only pairing security there is) was an afterthought, added last year after another researcher showed how easily you could hijack an iOS 6 device simply by plugging it into a malicious charger. In fact, Apple added backup encryption to iOS specifically because they realized people’s devices were pairing with hardware the user didn’t trust. If pairing were meant to be a means to security, there would be no need for backup encryption at all. Your device certainly shouldn’t be making your personal information available wirelessly to every machine you’ve ever paired with since you last wiped your phone (or to anyone who compromises one of those machines); in fact, many users DON’T trust their work computer or other computers at all, and use backup encryption to PREVENT their personal data from being exposed. Never mind how easy it is to watch someone enter their PIN, or how often phones get left lying around.

Any good security expert will tell you that good authentication is two-factor: something you have and something you know. Pairing records alone are only half of the security; the backup password is the other half (something you know), so when you bypass it, you’ve broken security. These mechanisms completely bypass this encryption, and therein lies the problem. Without backup encryption, your entire life’s worth of personal data relies solely on the difficulty of stealing a pairing record, or even worse – on that four-digit PIN that people watch you type in day in and day out. And if you’re smart enough to use a complex passphrase, you’ve probably also set the phone not to lock for 5 or 15 minutes, to keep it usable. No, Apple knows that good security is two-factor, and that’s why you have backup encryption.
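The two-factor argument reduces to a toy model. This is purely illustrative – the function names and logic are mine, not Apple’s implementation: a backup-style interface demands both a valid pairing record (something you have) and the backup password (something you know), while a relay-style interface consults only the pairing record, collapsing the scheme to one factor.

```python
# Toy model of the two-factor point; names and logic are illustrative only.
VALID_PAIRINGS = {"host-1234"}       # something you have (a pairing record)
BACKUP_PASSWORD = "correct horse"    # something you know

def backup_interface(pairing_id: str, password: str) -> bool:
    """Backup-style access: requires BOTH factors."""
    return pairing_id in VALID_PAIRINGS and password == BACKUP_PASSWORD

def relay_interface(pairing_id: str) -> bool:
    """Relay-style access: the password factor is never consulted."""
    return pairing_id in VALID_PAIRINGS

# A stolen pairing record alone fails against the backup interface...
print(backup_interface("host-1234", "wrong guess"))   # False
# ...but is all it takes through the relay-style interface.
print(relay_interface("host-1234"))                   # True
```

The second function is exactly what “bypassing backup encryption” means in practice: the stronger check exists, but a parallel interface simply never invokes it.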

In addition to downplaying the services themselves, Apple has stated that the user must “explicitly grant consent” for these services to be used. This is not the case. Until recently, the user had no idea these services even existed on the device. There is no dialog asking the user to allow the packet sniffer to run, or to allow access to photos/contacts/SMS/etc. for AppleCare (the dialogs you’re used to seeing third-party apps present are not shown when these services are accessed). That consent simply doesn’t exist. The only consent is pushing that “trust” button, which (unbeknownst to the user) grants carte blanche access to the mobile device – wirelessly, indefinitely, and bypassing the backup encryption that the user believes is protecting their data from unwanted eyes. Many people push the trust button just because they’re sick of being nagged by hardware that has no business asking for trust. Customers don’t necessarily understand that you shouldn’t have to push “trust” to charge your phone. This allows juice jacking almost as easily as before iOS 7. As I’ve also written, even without pushing trust, the pairing relationship can be abused in a number of ways: by the very simple process of copying one file from a trusted machine, by watching a user type their PIN behind their back (or from a security camera overhead), by dusting the phone for the four dirtiest parts of the screen, or even by waiting for someone to leave their device lying around unlocked for a few seconds. At that point, I can seize the device, easily pair with it myself, and dump everything – even if the user has a backup password. This shouldn’t be the case.

I give Apple credit for acknowledging these services, and for at least trying to give an answer to people who want to know why they are there – prior to this, there was no documentation about file relay whatsoever, or its 44 data services for copying off personal data. They appear to be underestimating its capabilities, however, in downplaying them, and this concerns me. I wonder if the higher-ups at Apple are really aware of how much non-diagnostic personal information it copies out, wirelessly, bypassing backup encryption. All the while, I suspect they’ll quietly fix many of the issues I’ve raised in future versions. At least I hope so. It would be wildly irresponsible for Apple not to address these issues, especially now that the public knows about them.
Lastly, please remember my talk was titled “Identifying Backdoors, Attack Points, and Surveillance Mechanisms in iOS Devices”, NOT “Apple Conspires with NSA for Backdoors”. I have outlined some services I believe are backdoors by definition (such as file relay), and Apple has all but confirmed this by stating that their purpose is for Apple to access your data (thank you, Apple, for acknowledging that). I have also outlined many things in my talk that are not backdoors, but are attack points and (enterprise) surveillance mechanisms that could be taken advantage of. The pcapd and house arrest services certainly make tasty attack points for an attacker, and should be fixed to limit their risk. Backdoors aren’t secret conspiracies, but they can be dangerous if misused. As I’ve stated before: DON’T PANIC. I have never suggested this was a conspiracy. As usual, the media has completely derailed the intention of my talk.

All Content Copyright (c) 2000-2022 by Jonathan Zdziarski, All Rights Reserved