With iOS 7 and the new 5s come a few new security mechanisms, including a snazzy fingerprint reader and a built-in "trust" mechanism to help prevent juice jacking. Most people aren't aware, however, that alongside all this new consumer security come new bypasses designed to give enterprises access to corporate devices. These bypasses are in your phone's firmware whether it's company owned or not, and they are likely also within reach of others, such as government agencies or malicious hackers. One mechanism in particular appears to override both the passcode lock screen and the fingerprint lock, granting enterprises access to their devices while locked. But at what cost to the overall security of consumer devices?
While Apple showed off its new fingerprint reader publicly, the significance of the counter-juice-jacking mechanism has gone largely unnoticed. This long-overdue security mechanism simply pops up a window requiring the user to trust the host the device is connected to before it's allowed to pair. This is a good step forward in terms of pairing security, and ensures that any computer attempting to establish a trusted relationship with the device must be explicitly authorized by the user. Its long-term effectiveness is questionable, however: there have been recent reports of car chargers requiring that you push "Trust" before they will charge, training millions of iPhone users to mindlessly push "Trust" for anything.
Why is this important? To understand, you first have to understand pairing [2]. A pairing is a trusted relationship with another device, in which a computer is granted privileged, trusted access to the iPhone. To have the level of control needed to download personal data, install applications, or perform other such tasks on an iOS device, the machine it's connected to must be paired with the device. This is done through a very simple procedure in which the desktop and the phone create and exchange a set of keys and certificates. Once paired, these keys remain stored on the device indefinitely, until you restore or otherwise wipe the phone. A pairing record is like a skeleton key to an iOS device. With it, someone with the right know-how (or even some good open source tools) can download all of your personal data [2, 6, 7], install invisible applications onto your non-jailbroken phone that run in the background [3, 6, 8], activate the device's built-in packet sniffer to monitor your network traffic [4], hijack the APN to route all cellular traffic through a proxy [5], and access and download any personal data from any application's sandbox [2, 6]. All of this can be done either across USB or over WiFi, without any visual indication to the user. Much of it can also be done while the device is locked, fingerprint reader or not, as long as you have a pairing record, and data can be acquired from the phone regardless of whether backup encryption is turned on.
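To make the skeleton-key point concrete, here is a minimal sketch (Python, standard library only) of reading the pairing record a paired host keeps on disk. The path convention and key names below are based on the records that open source tools such as libimobiledevice [2] work with; treat the helper and its exact key set as illustrative, not authoritative.

```python
import plistlib

# On OS X, pairing records are stored by the host as property lists under
# /var/db/lockdown/<device UDID>.plist. Illustrative sketch: the key names
# below are the ones commonly seen in these records.
def read_pairing_record(path):
    """Parse a lockdownd pairing record and return the credential
    material inside it -- the 'skeleton key' to the device."""
    with open(path, "rb") as f:
        record = plistlib.load(f)
    # Whoever holds these keys can open lockdownd services on the device
    # without triggering any new trust prompt.
    interesting = ("HostID", "HostCertificate", "HostPrivateKey",
                   "RootCertificate", "DeviceCertificate", "EscrowBag")
    return {k: record[k] for k in interesting if k in record}
```

Anyone who copies such a plist off your desktop, whether via malware or physical access, carries away everything needed to talk to the phone as a trusted host.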
This kind of access to an iPhone is no doubt within the crosshairs of nosy government agencies as well. German news outlet Der Spiegel ran an article [1] this month citing leaked NSA documents that boasted of the agency's capabilities in hacking iPhones as early as 2009. As the article describes it, the NSA allegedly hacks into a subject's desktop machine and then runs additional "scripts" that give it access to a number of additional "features" on the subject's iPhone; these are likely some of the hidden services running on the device that most consumers aren't aware of, such as AFC, House Arrest, File Relay, and PCAP, among others [2, 6, 7]. From the article:
“The documents state that it is possible for the NSA to tap most sensitive data held on these smart phones, including contact lists, SMS traffic, notes and location information about where a user has been. … In the internal documents, experts boast about successful access to iPhone data in instances where the NSA is able to infiltrate the computer a person uses to sync their iPhone. Mini-programs, so-called “scripts,” then enable additional access to at least 38 iPhone features.”
In iOS 6 and lower, anyone whose computer you've ever connected to your phone likely saved such a pairing record, gaining indefinite access to do these things with your phone. Your own desktop computer also saves a copy of this pairing record so that iTunes can talk to your phone to sync, install applications, and so on. With Apple's new trust mechanism, plugging an iOS device into someone else's computer (or a malicious charger, alarm clock, or other device) now displays a confirmation screen requiring the user to trust the machine before granting this privileged access. If you tell it no, then no soup for you!
But the catch is that too much security is bad, for enterprises at least. With new features such as the 5s's fingerprint-based locking mechanism, it's becoming exceedingly difficult to simply "turn over a password" to your employer, so Apple clearly had to find a way around these locking mechanisms so that uncooperative employees couldn't prevent a business from accessing data on corporately owned devices. With iOS 7 came what appears to be exactly that: a bypass of the device locking and pairing security mechanisms, which overrides the device's passcode / fingerprint lock and user trust checks completely, allowing the device to be paired, synced, and possibly screen-unlocked. This allows the device to be paired both without a user trust prompt and while locked with a passcode or fingerprint. While this feature remains undocumented, it is necessary for Apple's MDM to acquire data from employee-owned iOS devices. Its likely purpose is to handle special circumstances, so that an employee's data can be dumped after they leave the company, or if they are incapacitated or unwilling to provide their fingerprint or password to unlock the device. It's also possible that this feature won't be available to all enterprises, or may not be so "in your face" obvious, but rather integrated at a lower level. Nevertheless, the ability to establish a pairing while locked opens wide the privileges available to an MDM administrator, regardless of what controls may be in the GUI. Such pairing record data could also be used to perform forensic recovery of the device with commercial software, or to perform other, more nefarious tasks using open source tools.
Enrolling a device into an MDM profile can apparently happen straight out of the box now. Apple's new over-the-air (OTA) supervision and automatic enrollment for iOS 7's MDM [9] would appear to allow enterprise- or government-owned devices to be configured out of the box with a set of restrictions upon activation. If Apple maintains a database of unique hardware identifiers for its larger enterprise customers, a device could automatically enroll with Apple's servers every time it's activated. Additionally, employees bringing their own devices into an enterprise may enroll in its MDM profile, exposing their own devices to this security bypass mechanism.
So to summarize, there appear to be two ways to apply this kind of configuration to an iOS 7 device: through enrolling the device with an enterprise MDM (using an existing paired connection), or over-the-air through Apple’s servers, when the phone is activated. Additional mechanisms may exist, but have not been discovered.
This security bypass is tied to the Managed Configuration (MC) portions of the operating system, which handle mobile device management (MDM) for an enterprise; with Apple's new OTA enrollment, this also appears to be under Apple's control. The actual settings for this are stored in a class on the phone named MCCloudConfiguration. The managed configuration framework includes a daemon named teslad, which has direct hooks into Apple's servers to load managed configuration data (the configuration containing, among other restrictions, the pairing security bypass). When the phone is first set up, the setup program calls the daemon, which in turn downloads a cloud configuration certificate from https://iprofiles.apple.com/resource/certificate.cer and performs a number of tasks to authenticate and load a configuration from a service named Absinthe, hosted on Apple's servers. If you didn't catch the irony, Absinthe is also the name of a jailbreak for iOS.
It's worth noting that leaked documents have already shown the NSA's capability to forge certificates and effectively break this kind of encryption [10]. A successful man-in-the-middle attack combined with certificate forgery is well within the reach of agencies like the NSA, potentially compromising this entire system. Code already exists publicly to emulate an Apple policy server, which could be used to change the device's policy [11]. This may not even be necessary, though, as Apple may have left a backdoor around SSL in the form of a suspicious switch. The subroutine that validates the SSL session with Apple's servers first checks for a configuration directive named MCCloudConfigAcceptAnyHTTPSCertificate, and if it is set, bypasses the SSL validation check entirely, allowing any fake server to masquerade as Apple's Absinthe server. This could be leftover debugging code, but it could still be taken advantage of.
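To illustrate the shape of that suspicious switch, here is a small sketch of the validation logic as it appears to behave. Only the MCCloudConfigAcceptAnyHTTPSCertificate directive name comes from the binary; the function, its parameters, and Python itself are illustrative stand-ins for the Objective-C the framework actually uses.

```python
# Illustrative model of the SSL validation shortcut described above.
# 'config' stands in for the device's managed configuration dictionary.
def should_trust_server(cert_chain_valid: bool, config: dict) -> bool:
    """Return True if the session with the cloud config server proceeds."""
    # The suspicious switch: if this directive is set, certificate
    # validation is skipped entirely and ANY server is accepted.
    if config.get("MCCloudConfigAcceptAnyHTTPSCertificate"):
        return True
    # Normal path: the certificate chain must actually validate.
    return cert_chain_valid
```

The danger is the first branch: with the directive set, an invalid chain is accepted just the same as a valid one, which is exactly the property a spoofed policy server would need.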
The teslad daemon has the ability to centrally load a managed configuration onto a device from Apple's servers. Apple's Setup program attempts to set up a cloud configuration when the device is first activated. A configuration can also be loaded manually by enrolling in an enterprise's MDM policy. Once a configuration is installed, a check-in mechanism is invoked via APNS (Apple Push Notification Service) to apply new management changes [11]. The profile can later be updated remotely through a mechanism that pulls a new cloud configuration from a URL. HTTP-based managed configuration makes for a delicious attack surface for the NSA, or for your local neighborhood hacker.
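For a sense of what that check-in traffic looks like on the wire, here is a sketch of the device's initial check-in message, based on the protocol documented in the Black Hat MDM paper [11]. The field names follow that paper; the helper function and the values used are placeholders.

```python
import plistlib

# Sketch of the first MDM check-in phase ("Authenticate"), per the
# protocol described in the "iOS MDM Protocol" paper. The device POSTs
# an XML plist like this to the server's CheckIn URL when a management
# profile is installed.
def build_checkin(udid: str, topic: str) -> bytes:
    message = {
        "MessageType": "Authenticate",  # first of the check-in phases
        "UDID": udid,                   # the device's unique identifier
        "Topic": topic,                 # APNS topic from the push certificate
    }
    return plistlib.dumps(message)
```

Because this is just a plist over HTTPS, anyone who can impersonate the server (see the certificate-validation switch above) is well positioned to tamper with the management channel.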
Based on this, if I were to make some educated guesses about how this could be exploited, I would posit that an agency like the NSA (or anyone capable of certificate forgery) could attack a device upon activation, or any time after it has been enrolled in an MDM policy, whenever it checks in, changing the policy so that the device can be paired while locked. With access to a compromised desktop (either through malware or, as Der Spiegel described, NSA targeting), a malicious attacker could load their own MDM configuration to enroll the device, or simply steal the pairing record from your desktop to access your device wirelessly while locked.
Interestingly enough, iOS 7 also includes an internal mechanism to reset the pairing data for the entire phone, so that all trusted relationships are erased, allowing the device’s owner to re-establish the security of the device. Unfortunately, this mechanism can only be triggered on the device itself, and it doesn’t look like anything is yet wired up to it.
While it is likely that Apple has added this feature exclusively for legitimate use by enterprises, even that has serious implications: with today's BYOD culture, employees may be unknowingly making their personally owned devices forensically accessible to a company's internal investigations team (as well as law enforcement, with the enterprise's consent) simply by enrolling them in the corporate MDM policy. Additionally, new employees who are issued devices may be permitted to retain personal information on their corporate device without first being informed that the device could, at any time, be subject to a thorough search that bypasses its security.
It seems as though iOS 7’s new MDM was designed to make all of the mom-and-pop shop “corporate mobile protection” solutions obsolete.
In addition to potential abuse by the enterprise, an agency seeking to commit espionage could set up its own MDM profile and enroll the device from a compromised desktop machine, using the device pairing from that machine; such feats appear to be what the NSA has been up to already. The benefit of this approach would be to leverage an otherwise short-term connection with the desktop to enroll the device itself into an MDM.
It is speculative, though worth mentioning, that law enforcement agencies could potentially be given access to this mechanism either by Apple or by the enterprise, if they had knowledge of the subject device. This would of course require participation from one of those parties, but the technical capabilities appear to make it possible. While I am merely speculating at this point, given Apple's recent patent filing to allow restrictions to be wirelessly pushed to devices in secure government facilities [12], it's conceivable that such restrictions overlap with the same managed configuration interfaces. If Apple has developed the capability to push a camera restriction to devices, then it may also have developed the capability to push security bypasses, for purposes such as InfoSec enforcement at military installations, or under subpoena. Of course, civilians will likely never have access to such features, if they exist, which is why we must continue to look for them in the code running on our personal devices.
Given recent articles about Apple being deluged by requests to image mobile devices for law enforcement [13], providing limited law enforcement access to such a bypass could be beneficial for Apple, by providing a mechanism to remotely unlock a device for a specific purpose, after which it can be forensically acquired by existing commercial (or internal) tools. The benefit for Apple would be to lighten the load and cost involved in manually processing subpoenas for data acquisition, on which Apple has reportedly been months behind [13]. Again, this is entirely speculative, but it would not surprise me in the least, especially given how "persuasive" our federal government can be toward private industry.
The lockdownd process is responsible for performing all pairing and authentication of new connections to an iOS device before allowing new services to be spawned [2]. Think of it as an authenticated inetd. Previous versions of iOS would deny pairing of locked devices with the error PasswordProtected. Two new branches have been added to iOS 7's code, however, that bypass this lock check and also skip asking the user to trust the machine.
When a new device attempts to pair, just before the device is tested to see whether it's locked, a check is made against Apple's MDM through a call to the MCProfileConnection class's hostMayPairWithOptions method. Depending on the MDM policy, this check results in one of four possible actions.
This one check provides results for three different tests. These three variables indicate whether pairing is allowed at all, whether pairing security should be completely bypassed, and whether pairing should require a challenge/response. If the MDM is set up to allow a lock and trust bypass, the block of code that performs these checks is skipped entirely: both the user trust prompt and the device lock test are bypassed, allowing pairing to continue even if a passcode is set. It's actually the same path taken by devices that don't support screen-lock security (devices with no SpringBoard user interface). The logic, in pseudocode, works this way:
    if (allow_pairing == false) {                /* MDM prevents all pairing */
        error(PasswordProtected);
    }

    if (allow_pairing_while_locked || device_has_no_springboard_gui) {
        goto skip_device_lock_and_trust_checks;  /* Skip security */
    }

    /* Pairing Security */
    if (device_is_locked == true) {
        if (setup_has_completed) {
            if (user_never_pushed_trust) {
                error(PasswordProtected);
            }
        }
    }

    /* Bypass ... */
    skip_device_lock_and_trust_checks:
        ... pairing process continues (validate host challenge, etc.)
At its very best, the device security bypass is an undocumented MDM feature allowing enterprises to access any enrolled (or over-the-air enrolled) iOS MDM device. Even this, however, creates a significant threat to the security of the many iOS users working for companies with a BYOD policy. Because there is not yet a jailbreak available for iOS 7, actually engaging this bypass to play with it isn't likely to happen for a while. Figuring out how to load this setting into MDM will also take some time.
Apple would do well to begin separating consumer firmware from enterprise firmware, offering a hardened version of its operating system to consumers. This bypass, and other enterprise bypasses introduced into iOS over the years, threatens to weaken the overall security of the device for the majority of consumers, who never enroll in an enterprise environment. While Apple's management software may well have sufficient access controls, the underlying mechanisms allow a much broader range of capabilities to those with the right tools or knowledge. That crowd is much larger than you'd expect, given the amount of commercial forensics software and open source tooling available.
It is possible to patch this out of iOS, but again, that requires jailbreaking. When jailbreaks become available for iOS 7, there may be hope of protecting consumers against such bypasses being taken advantage of. In the meantime, employees should be aware that enrolling a personal device into a corporate MDM policy could potentially grant the employer the ability to bypass the device's security and view personal data. If you're concerned about privacy from your own government (or a foreign one), you may also be opening your device up to a potential security threat if you are ever targeted.
~
[1] Privacy Scandal: NSA Can Spy on Smart Phone Data; Der Spiegel; September 7, 2013; http://www.spiegel.de/international/world/privacy-scandal-nsa-can-spy-on-smart-phone-data-a-920971.html
[2] libimobiledevice: a cross-platform software library and tools to communicate with iOS devices natively; http://www.libimobiledevice.org
[3] Mactans: Injecting Malware into iOS Devices via Malicious Chargers; Billy Lau, Yeongjim Jang, Chengyu Song, Tielei Wang, Pak Ho Chung, Paul Royal; Georgia Institute of Technology; Black Hat 2013; https://media.blackhat.com/us-13/US-13-Lau-Mactans-Injecting-Malware-into-iOS-Devices-via-Malicious-Chargers-WP.pdf
[4] Practical iOS Apps Hacking; Mathieu Renard; GreHack 2012; http://reverse.put.as/wp-content/uploads/2011/06/GreHack-2012-paper-Mathieu_Renard_-_Practical_iOS_Apps_hacking.pdf
[5] Hacking and Securing iOS Applications (APN hijacking); http://my.safaribooksonline.com/book/-/9781449325213/hijacking-traffic/apn_hijacking
[6] Hacking Apple Accessories to Pown iDevices; Mathieu Renard; Sogeti; http://www.ossir.org/paris/supports/2013/2013-07-09/ipown-redux.pdf
[7] Hacking iOS Applications; Mathieu Renard; Hack.lu 2012; http://archive.hack.lu/2012/Mathieu%20RENARD%20-%20Hack.lu%20-%20Hacking%20iOS%20Applications%20v1.0%20Slides.pdf
[8] ideviceinstaller; https://github.com/libimobiledevice/ideviceinstaller
[9] iOS 7 and Business; Apple; http://www.apple.com/ios/business/
[10] How US and UK Spy Agencies Defeat Internet Encryption; The Guardian; September 5, 2013; http://www.theguardian.com/world/2013/sep/05/nsa-gchq-encryption-codes-security
[11] The iOS MDM Protocol; David Schuetz; Black Hat USA 2011; http://media.blackhat.com/bh-us-11/Schuetz/BH_US_11_Schuetz_InsideAppleMDM_WP.pdf
[12] Apparatus and methods for enforcement of policies upon a wireless device; Apple Inc.; U.S. patent; http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&u=/netahtml/PTO/search-adv.htm&r=36&p=1&f=G&l=50&d=PTXT&S1=(20120828.PD.+AND+Apple.ASNM.)&OS=ISD/20120828+AND+AN/Apple&RS=(ISD/20120828+AND+AN/Apple)
[13] Apple deluged by police demands to decrypt iPhones; CNET; May 2013; http://news.cnet.com/8301-13578_3-57583843-38/apple-deluged-by-police-demands-to-decrypt-iphones/