Mobile security company Palo Alto Networks has released a new white paper titled WireLurker: A New Era in iOS and OS X Malware. I’ve gone through their findings, and also managed to get hold of the WireLurker malware samples to examine them first-hand (thanks to Claud Xiao from Palo Alto Networks, who sent them to me). Here’s the quick and dirty about WireLurker: what you need to know, what it does, what it doesn’t do, and how to protect yourself.
How it Works
WireLurker is a trojan that has reportedly been circulated in a number of Chinese pirated software (warez) distributions. It targets 64-bit Mac OS X machines, as there doesn’t appear to be a 32-bit slice. When the user installs or runs the pirated software, WireLurker waits until it has root and then installs itself into the operating system as a system daemon. The daemon uses libimobiledevice; it sits and waits for an iOS device to be connected to the desktop, then abuses the trusted pairing relationship your desktop has with it to read the device’s serial number, phone number, iTunes Store identifier, and other identifying information, which it then sends to a remote server. It also attempts to install malicious copies of otherwise benign-looking apps onto the device itself. If the device is jailbroken and has afc2 enabled, a much more malicious piece of software gets installed onto the device, which reads and extracts identifying information from your iMessage history, address book, and other files on the device.
WireLurker appears to be most concerned with identifying the device owners, rather than stealing a significant amount of content or performing destructive actions on the device. In other words, WireLurker seems to be targeting the identities of Chinese software pirates.
WireLurker uses enterprise provisioning to install software on non-jailbroken iOS devices, but this is only one piece of it. Apple can revoke the enterprise certificate to prevent installation on iOS 8 devices; however, WireLurker can still read information from the device without it. This is because the information is queried by the Mac desktop when your iPhone is plugged into it, by abusing that trusted relationship. Also, if you have a jailbroken iPhone running afc2 (a terribly insecure service allowing root file system access to the device), then a mobile substrate library is copied onto the device to infect the system. This happens regardless of whether or not WireLurker still has a valid enterprise profile.
Much of the content is downloaded over the network, so it’s conceivable that if Apple revokes one certificate, additional certificates could be substituted and new copies of the software inserted.
Practical steps you can take: ensure your device is not running afc2 (and, ideally, is not jailbroken at all, as jailbreaking weakens your device’s overall security), run Palo Alto Networks’ WireLurker Detector, and install Little Snitch on your desktop so that you can identify any rogue outgoing connections. If you suspect WireLurker may have installed a malicious application on your iOS device, you can also look for an enterprise provisioning profile in Settings > General > Profiles. Of course, it’s also a good idea not to install pirated software, which WireLurker appears to travel with. Once it’s installed on your device, it’s much more difficult to identify or control. The mobile substrate library used to infect jailbroken devices is copied to /var/mobile/Library/MobileSubstrate/DynamicLibraries/sfbase.dylib on the device (or similar).
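The kind of artifact scan the detection steps above describe can be sketched in a few lines of Python. This is not the actual Palo Alto Networks detector; only the sfbase.dylib path comes from the analysis above, and the desktop path shown is a hypothetical placeholder.

```python
#!/usr/bin/env python3
"""Sketch of a WireLurker-style artifact scan (illustrative only)."""
import os

# Paths associated with an infection. The /var/mobile path is the
# jailbroken-device dylib cited above; the LaunchDaemons entry is a
# hypothetical example of a desktop persistence artifact.
SUSPECT_PATHS = [
    "/var/mobile/Library/MobileSubstrate/DynamicLibraries/sfbase.dylib",
    "/Library/LaunchDaemons/com.example.update_daemon.plist",  # hypothetical
]

def find_artifacts(paths):
    """Return the subset of the given paths that exist on this system."""
    return [p for p in paths if os.path.exists(p)]

if __name__ == "__main__":
    hits = find_artifacts(SUSPECT_PATHS)
    if hits:
        print("Possible infection; found:", hits)
    else:
        print("No known artifacts found.")
```

A real detector would also check file hashes and running daemons, since paths alone are trivial for malware to change.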
While your own Mac may not be infected with WireLurker, it’s possible others (in your school, college, at work, or public computers) are, so it’s important not to trust any devices other than your own. To help prevent this from accidentally happening, you may wish to pair lock your device using these instructions.
The bigger issue here is not WireLurker itself; WireLurker appears to be in its infancy, and is mostly a collection of scripts, property lists, and binaries all duct-taped together on the desktop, making it easy to detect. The real issue is that the design of iOS’ pairing mechanism allows for more sophisticated variants of this approach to easily be weaponized. I’ve previously written about how non-jailbroken devices can be infected with modified binaries, due to Apple’s lack of codesign pinning. I’ve also discussed at length how malicious software can abuse the pairing records of a desktop machine to install malware on an iOS device. While WireLurker appears fairly amateur, an NSA, a GCHQ, or any other sophisticated attacker could easily incorporate a much more effective (and dangerous) attack like this.
A Call for Design Changes
What can Apple do to help prevent this? I have a number of ideas, ranging from easy fixes to difficult design changes.
On the easy side:
1. Have the phones do a better job of prompting the user before installing applications.
User education is the biggest problem, and Apple has a poor track record of helping its users make smarter decisions about security. Apps that aren’t signed by Apple (e.g. enterprise apps) present only a simple OK prompt, and don’t warn the user that the app is not signed or authorized by the App Store. A more thorough explanation of what the user is about to approve would likely prevent a number of users from allowing the software to install on iOS in the first place.
2. Disable “Enterprise” app installation entirely without an “Enterprise Mode”
The vast majority of non-enterprise users will never need a single enterprise app installed, and any attempt to install one should fail. So why doesn’t Apple lock this capability out unless it’s explicitly enabled? This could look like a switch in Settings, or even just a prompt so that the user understands they’re accepting applications that are not sanctioned by the App Store. Again, there is a user education component here, but just like a “developer mode”, it would make sense for non-supervised devices to have an “enterprise mode” or “ad-hoc mode” that has to be explicitly enabled before any third party software can be installed.
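The gate being proposed here is simple enough to sketch. The signer categories and the opt-in flag below are hypothetical, since iOS has no such switch today; this is just the decision logic an “enterprise mode” would imply.

```python
# Sketch of an "enterprise mode" install gate (hypothetical design).
APP_STORE = "app-store"
ENTERPRISE = "enterprise"
AD_HOC = "ad-hoc"

def install_allowed(signer_type, enterprise_mode_enabled=False):
    """App Store apps always install; enterprise and ad-hoc apps
    require the user to have explicitly enabled enterprise mode."""
    if signer_type == APP_STORE:
        return True
    if signer_type in (ENTERPRISE, AD_HOC):
        return enterprise_mode_enabled
    return False  # unknown signer: reject outright
```

With the flag off by default, a WireLurker-style enterprise payload would simply fail to install on the vast majority of devices.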
3. Access Controls and/or Encryption for Pair Records
I stated this in my iOS Backdoors / Attack Points paper (http://www.zdziarski.com/blog/wp-content/uploads/2014/08/Zdziarski-iOS-DI-2014.pdf) as a point of attack: pairing records are not protected in any way on the desktop, and are world-readable. Any application on a user’s desktop can access the pairing record and gain completely trusted privileges with any iOS device connected either directly by USB or via WiFi. Thankfully, Apple finally closed off a number of ways to dump data wirelessly in iOS 8; however, the desktop threats demonstrated by WireLurker still exist.
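Auditing this exposure on your own Mac is straightforward to sketch: pairing records live as plists under /var/db/lockdown on OS X, and the concern above is that their permission bits let any local process read them. This is a minimal illustrative check, not a hardening tool.

```python
import os
import stat

LOCKDOWN_DIR = "/var/db/lockdown"  # OS X pairing-record location

def world_readable_records(directory=LOCKDOWN_DIR):
    """List pairing-record plists that any local process can read."""
    findings = []
    try:
        names = os.listdir(directory)
    except OSError:
        return findings  # directory missing or unreadable
    for name in sorted(names):
        if not name.endswith(".plist"):
            continue
        path = os.path.join(directory, name)
        mode = os.stat(path).st_mode
        if mode & stat.S_IROTH:  # "other" read bit set: world-readable
            findings.append(path)
    return findings
```

Any path this returns is a trusted pairing relationship that malware running as an ordinary user could piggyback on.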
Apple should manage access to “Trusted Pairing Relationships” with devices the same way it manages access permissions for contacts and geolocation. An application should have to ask for permission to access this privileged data. This would allow iTunes and Xcode to use it, but not others. This can be done with a broker daemon, encryption, and the keychain. If the pair records were encrypted with a key stored in the keychain (or better yet, with the user’s backup password), then even applications obtaining root access would still need the user’s authorization.
Alternatively, this could be managed in a way that locks out any third party application from piggybacking on these trusted relationships. The pair record could be encrypted with a unique key created by iTunes / Xcode, and stored on the keychain. This would prevent third party tools like libimobiledevice from being able to access content on the device at all (at least without having to go to the device directly to pair, which would alert the user with another trust dialog). This, too, could even be managed with a port lock.
On the more difficult side:
1. While WireLurker actually changed the bundle identifier for its iOS-side malware, the wildcard stayed the same. A better version of this attack would use the same identifier as the existing apps; I’ve demonstrated this in the article mentioned earlier on my website, explaining how App Store apps can be hacked on non-jailbroken phones. Apple currently has no mechanism to pin a bundle identifier (e.g. com.facebook.*) to a specific developer certificate. This means that anyone can strip the signature from an app, modify the app’s binary, re-sign it with their own certificate, and it will run on the phone as if it were the original application, complete with APNS, upgrade support, access to all files and configuration data in the sandbox, and so on. Apple could add codesign pinning to developer certificates by including bundle wildcards or identifiers in the certificate, then have the OS enforce a match against the app’s bundle identifier. This would pin the bundle identifier so that an app has to be signed with a specific entity’s certificate in order to run under it. Applications, too, can be coded to transmit the bundle identifier to their server so the server can verify it, providing some (but not complete) additional security.
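The pinning check described above reduces to a simple lookup at load time. The certificate names and wildcard table below are hypothetical; in the real proposal the wildcard would be embedded in the developer certificate itself and enforced by the OS.

```python
import fnmatch

# Hypothetical pinning table: signing identity -> pinned bundle-ID wildcard.
# In the proposed design, this mapping lives inside the developer cert.
CERT_PINS = {
    "Facebook Inc. (ABCDE12345)": "com.facebook.*",
    "Example Corp (ZYXWV98765)": "com.example.app",
}

def signature_allowed(signer, bundle_id):
    """Allow launch only if the signer's pinned wildcard matches the
    app's bundle identifier; unknown signers match nothing."""
    pattern = CERT_PINS.get(signer)
    return pattern is not None and fnmatch.fnmatch(bundle_id, pattern)
```

Under this check, an app re-signed with an attacker’s certificate would fail to launch under com.facebook.*, even though its signature is otherwise valid.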
2. In addition to codesign pinning, Apple could come up with a mechanism (perhaps through entitlements) to have the operating system restrict access to specific hostnames to specific bundle identifiers. For example, allowing network access to privateapi.facebook.com only for the bundle identifier com.facebook.messenger, or what have you. This wouldn’t directly prevent siphoning data off to remote servers, but it would cause masquerading applications to suspiciously fail to connect to protected hosts.
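This host-entitlement idea can also be sketched as a policy lookup. The hostname and bundle identifiers below are the illustrative ones from the example above; the policy table itself is a hypothetical construct, as no such entitlement exists in iOS.

```python
# Hypothetical policy: protected hostname -> bundle IDs entitled to reach it.
PROTECTED_HOSTS = {
    "privateapi.facebook.com": {"com.facebook.messenger"},
}

def connection_allowed(bundle_id, hostname):
    """OS-side check made at connection time. Hosts absent from the
    policy are unprotected and reachable by any app."""
    entitled = PROTECTED_HOSTS.get(hostname)
    if entitled is None:
        return True  # unprotected host
    return bundle_id in entitled
```

A cloned app carrying a different bundle identifier would be refused by the OS before the connection is ever made, making the masquerade visible.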
There are many other ideas rolling around in my head, but those are the first that come to mind.
It would greatly behoove Apple to address this situation with more than a certificate revocation; I’m not scared of WireLurker, but I am concerned that this technique could be weaponized in the future, and become a viable means of attack on public and private sector machines. It could easily be attached in transit to any software download crossing unencrypted HTTP, such as an Adobe Flash installer or other software download. Social engineering would also help make juicy targets out of people likely to click on links from IT departments or install software on their Mac. There are a number of potentially more dangerous uses for WireLurker, and unfortunately many of them would go unnoticed by Apple until it’s too late to revoke a certificate. It would be a much better solution to address the underlying design issues that make this possible.