Recovering Photos From Bad Storage Cards

A Guide for Photographers, Not Geeks

Most photographers have had at least one heart-attack moment when they realize all of the photos they’ve taken on a shoot (or a vacation) are suddenly gone, and there’s nothing on the camera’s storage card. Perhaps you accidentally formatted the wrong card, or the card just somehow got damaged. If you’re a professional photographer, there’s a good chance you’re also not a forensic scientist or a hard-core nerd (although it’s OK to be all three!). That minor detail doesn’t mean, however, that you can’t learn to carve data off of a bad storage card and save yourself a lot of money on data recovery. While there are many aspects of forensic science that are extremely complicated, data carving isn’t one of them, and I’ll even walk you through how to do it on your Mac in this article, with a little bit of open source software and a few commands. If you’re scared of your computer, don’t worry. This is all very easy, even though it looks a bit intimidating at first. You can test your skills using any old storage card you might have on hand. It doesn’t have to be damaged, although you might be surprised by how much of the data you thought was deleted is still sitting on it!

First, let’s talk about how your storage card works. When you plug your storage card into your computer, your computer looks for a list of files on the card; this is kind of like a Rolodex of all the files your camera has stored. This “catalog” basically says, “OK, this file is this big, and it starts here”. You can think of it like the table of contents of a book. When you format a storage card, most of the time it’s just this table of contents that gets deleted; the actual bits and bytes from the photos you took aren’t erased (because that would take too long). The same can be true when the file system becomes damaged; in most cases, it’s just the file listing that gets blown up somehow, making it appear as though there are no files on the card. In more extreme cases, physical damage can corrupt the data in one part of the card while the data on the rest of the card is still recoverable; your computer just needs to be told to look past the damaged regions instead of simply giving you an error message.
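To make the idea concrete, here is a rough sketch of what a data carver does (this is illustrative Python, not the open source tools we’ll actually use later, and the “card.img” file name is just a made-up placeholder for a raw dump of your card): it ignores the file listing entirely and scans the raw contents of the card for the byte signatures that mark the start and end of a JPEG, saving out everything it finds in between.

    # Minimal JPEG carver sketch: scan a raw card image for JPEG start/end
    # markers and write out whatever is found between them. Real carving
    # tools (PhotoRec, scalpel, foremost) are far more robust; this only
    # illustrates the idea. "card.img" is a hypothetical raw dump of a card.

    SOI = b"\xff\xd8\xff"   # JPEG start-of-image marker
    EOI = b"\xff\xd9"       # JPEG end-of-image marker

    def carve_jpegs(image_path, out_prefix="carved"):
        with open(image_path, "rb") as f:
            data = f.read()  # reads the whole image; fine for a small test card,
                             # a real tool would stream instead
        count = 0
        start = data.find(SOI)
        while start != -1:
            end = data.find(EOI, start)
            if end == -1:
                break
            count += 1
            with open(f"{out_prefix}_{count:04d}.jpg", "wb") as out:
                out.write(data[start:end + 2])   # include the EOI marker
            start = data.find(SOI, end + 2)
        return count

    if __name__ == "__main__":
        print(carve_jpegs("card.img"), "candidate photos recovered")

Real carving tools do the same thing far more carefully, handling fragmented files, raw formats such as CR2 and NEF, and damaged regions of the card.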

Continue reading “Recovering Photos From Bad Storage Cards”

Journal Paper Published

The International Journal of Digital Forensics and Incident Response has formally accepted and published my paper titled Identifying Back Doors, Attack Points, and Surveillance Mechanisms in iOS Devices. The paper is a compendium of the services and mechanisms used by many law enforcement agencies, and by modern open source forensic techniques, to create a forensic copy of personal data from iOS devices. It also covers existing mechanisms designed for enterprises, such as fingerprint and PIN bypasses, which could potentially be misappropriated for surveillance of a target by anyone with the right technical capabilities.

Note: I don’t make any money off of this paper; the journal charges for copies of all of its papers, but unless there’s something they haven’t told me, the author doesn’t receive any commission. I wrote this strictly in the interest of science and knowledge.

The Fan Club Effect

I’ve known for a long time that fan clubs affect my selection of a particular product or technology, and I’ve been trying to articulate just how they affect the thought process involved in that selection. My recent experience in the world of photography has helped me work through it enough to write about.

I generally remain neutral about the technologies I get involved with, as I believe each technology has its own place and purpose. I’ve learned this holds true of computer languages, operating systems, and nearly everything else in life. It is interesting, though, to watch the fan clubs of the various camps and the impact they have on neutrality and public opinion. In many cases, having such zealous fans actually works against a manufacturer. This, too, holds true of all things, ranging from computer languages to cameras.

Continue reading “The Fan Club Effect”

Why Not To Shoot at f/22 Anymore

Every “professional” photography book I’ve read treats it as gospel that you have to shoot landscapes at f/22 in order to ensure that the foreground and background are both in focus. Special thanks to these guys for teaching millions of photographers to create blurry photos. The culprit is lens diffraction, and it explains why shooting at f/22 (or at anything much narrower than around f/11) is likely giving you soft photos when sharpness is what you’re trying to achieve.

Check out http://fstoppers.com/what-is-lens-diffraction-on-dslr-camera for a more in-depth explanation. Most of my shots are around f/5.6 – f/11 these days, and they’ve turned out much sharper, even with a foreground in the frame. Most lenses are at their sharpest only two or three stops down from their largest aperture.
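If you want the back-of-the-envelope math (assuming green light at roughly 550 nm as a reference wavelength), the diameter of the Airy disk, the blur spot that diffraction spreads every point of light into, grows linearly with the f-number N:

    d ≈ 2.44 × λ × N

At f/8 that works out to about 2.44 × 0.55 µm × 8 ≈ 10.7 µm; at f/22 it balloons to roughly 29.5 µm, several times the size of the pixels on a typical DSLR sensor. Once the blur spot is larger than a pixel, stopping down further costs you sharpness no matter how good your focus is.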

How to Tolerate DxO by Hacking MakerNotes and EXIF Tags

DxO Optics Pro was a purchase I immediately regretted making, once I realized that it intentionally restricts you from selecting which lens optics module you’d like to adjust your photo with. It would take all of five minutes of programming to let the user decide, but for whatever stupid reason, if you’re using a lens they don’t support, or if you’re adjusting a photo that you’ve already adjusted in a different program, DxO becomes relatively useless.

I’ve figured out a couple of easy ways to hack the tags in a raw image file to “fake” a different lens. This worked for me; I make no guarantees it will work for you. In my case, I have a Canon 8-15mm Fisheye, which isn’t supported by DxO. The fixed 15mm Fisheye is, however, and since I only ever shoot at 15mm, I’d like to use the fixed lens’s module for correction. As it turns out, the module does a decent job once you trick DxO into thinking you actually used that lens.
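As a sketch of the general approach, you can use exiftool to rewrite the lens identification tags on a copy of the raw file. Treat the details below as illustrative rather than gospel: which tag DxO actually keys off of (EXIF LensModel, Canon MakerNotes LensType, or something else) varies by camera and DxO version, and the lens string shown is my guess at Canon’s naming, so experiment on copies.

    # Sketch: rewrite lens identification tags on a COPY of a raw file so that
    # DxO loads its fixed 15mm fisheye module. Requires exiftool on your PATH.
    # Tag names and the lens string are illustrative; experiment to find the
    # tag(s) your version of DxO actually reads.

    import shutil
    import subprocess

    def fake_lens(src, dst, lens_model="EF15mm f/2.8 Fisheye"):
        shutil.copy2(src, dst)                       # never touch the original
        subprocess.run(
            ["exiftool", "-overwrite_original",
             f"-LensModel={lens_model}",             # EXIF lens description
             dst],
            check=True,
        )
        # Show what the copy now claims about its lens
        subprocess.run(["exiftool", "-LensModel", "-LensType", dst], check=True)

    if __name__ == "__main__":
        fake_lens("IMG_0001.CR2", "IMG_0001_fixed15.CR2")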

Continue reading “How to Tolerate DxO by Hacking MakerNotes and EXIF Tags”

Forensics Tools: Stop Miscalculating iOS Usage Analytics!

There are some great forensics tools out there… and also some really crummy ones. I’ve found an incredible amount of wrong information in the often 500+ page reports some of these tools crank out, and oftentimes the accuracy of that data is critical to one of the cases I’m assisting with at the time. Tool validation is critical to the healthy development cycle of a forensics tool, and unfortunately many companies don’t do enough of it. If investigators aren’t doing their homework to validate the information (and subsequently provide feedback to the software manufacturer), the consequences could mean an innocent person goes to jail, or a guilty one goes free. No joke. This could have happened a number of times had I not caught certain details.

Today’s reporting fail is with regard to the application “usage” information stored in iOS in the ADDataStore.sqlitedb file. At least a couple of forensics tools are misreporting this data by as much as 26 hours or more.
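If you’re validating a tool’s output, you don’t have to take the report’s word for it: the file is just a SQLite database, and you can dump it yourself and compare the raw values against what the tool claims. Here is a minimal, schema-agnostic inspection sketch (it deliberately makes no assumptions about the tables inside ADDataStore.sqlitedb, since those change between iOS versions):

    # Dump every table in a copy of ADDataStore.sqlitedb (or any SQLite
    # database recovered from an iOS file system image) so the raw values can
    # be compared against a forensics tool's report. Schema-agnostic on
    # purpose: table names and layouts vary across iOS versions.

    import sqlite3
    import sys

    def dump(db_path):
        con = sqlite3.connect(db_path)
        cur = con.cursor()
        tables = [r[0] for r in cur.execute(
            "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
        for table in tables:
            print(f"== {table} ==")
            cols = [c[1] for c in cur.execute(f"PRAGMA table_info('{table}')")]
            print("\t".join(cols))
            for row in cur.execute(f"SELECT * FROM '{table}'"):
                print("\t".join(str(v) for v in row))
        con.close()

    if __name__ == "__main__":
        dump(sys.argv[1] if len(sys.argv) > 1 else "ADDataStore.sqlitedb")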

Continue reading “Forensics Tools: Stop Miscalculating iOS Usage Analytics!”

Thoughts on iMessage Integrity

Recently, Quarkslab exposed design flaws[1] in Apple’s iMessage protocol, demonstrating that Apple does, despite its vehement denial, have the technical capability to intercept private iMessage traffic if it so desired, or were coerced to under a court order. The iMessage protocol is touted to use end-to-end encryption; however, Quarkslab revealed in their research that the asymmetric keys generated to perform this encryption are exchanged through key directory servers centrally managed by Apple, which makes it possible to inject substitute keys and eavesdrop on conversations. Similarly, the group revealed that certificate pinning, a very common and easy-to-implement certificate chain security mechanism, was not implemented in iMessage, potentially allowing malicious parties to perform MiTM attacks against iMessage in the same fashion. While the Quarkslab demonstration required physical access to the device in order to load a managed configuration, a MiTM attack is also theoretically possible by any party capable of either forging, or ordering the forgery of, a certificate through one of the many certificate authorities built into the iOS TrustStore, whether through a compromised certificate authority or by court order. A number of such abuses have recently plagued the industry and made national news[2, 3, 4].
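For readers unfamiliar with what certificate pinning actually buys you, here is a small illustrative sketch (generic TLS client code, not Apple’s; the host name and fingerprint are hypothetical placeholders): instead of trusting any certificate that chains to one of the dozens of CAs in the trust store, the client hard-codes the fingerprint of the certificate it expects and refuses to talk if the server presents anything else, even a forged certificate that a CA would happily validate.

    # Illustrative certificate pinning check: connect over TLS, then compare
    # the server certificate's SHA-256 fingerprint against a hard-coded
    # ("pinned") value before sending any data. The host and fingerprint
    # below are placeholders; substitute the real values for your server.

    import hashlib
    import socket
    import ssl

    PINNED_HOST = "example.com"
    PINNED_SHA256 = "0" * 64   # placeholder fingerprint

    def verify_pin(host=PINNED_HOST, port=443):
        ctx = ssl.create_default_context()   # normal CA validation still applies
        with socket.create_connection((host, port)) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                der = tls.getpeercert(binary_form=True)
                fingerprint = hashlib.sha256(der).hexdigest()
                if fingerprint != PINNED_SHA256:
                    raise ssl.SSLError(
                        f"certificate fingerprint mismatch: {fingerprint}")
                print("pin verified; a forged but CA-valid cert would fail here")

    if __name__ == "__main__":
        verify_pin()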

Continue reading “Thoughts on iMessage Integrity”

Fingerprint Reader / PIN Bypass for Enterprises Built Into iOS 7

With iOS 7 and the new 5s come a few new security mechanisms, including a snazzy fingerprint reader and a built-in “trust” mechanism to help prevent juice jacking. Most people aren’t aware, however, that with so much new consumer security also come new bypasses designed to give enterprises access to corporate devices. These are in your phone’s firmware whether it’s company-owned or not, and their security mechanisms are likely also within the reach of others, such as government agencies or malicious hackers. One mechanism in particular appears to bypass both the passcode lock screen and the fingerprint lock, granting enterprises access to their devices while locked. But at what cost to the overall security of consumer devices?

Continue reading “Fingerprint Reader / PIN Bypass for Enterprises Built Into iOS 7”

Injecting Reveal With MobileSubstrate

Reveal is a cool prototyping tool that lets you perform runtime inspection of an iOS application. At the moment, its functionality revolves primarily around user interface design, allowing you to manage user interface objects and their behavior. It is my hope that in the future, Reveal will expand into a full-featured debugging tool, allowing pen-testers to inspect and modify instance variables in memory, instantiate new objects, invoke methods, and generally hack on the runtime of an iOS application. For now, it’s still a pretty cool user interface design aid. Reveal is designed to be linked with your project, meaning you have to have the source code of the application you want to inspect. This is a quick little instructional on how to link the Reveal framework with any existing application on your iOS device, so that you can inspect it without source.

Continue reading “Injecting Reveal With MobileSubstrate”

How Juice Jacking Works, and Why It’s a Threat

How ironic that only a week or two after I wrote an article about pair locking, we would see this talk coming out of Black Hat 2013, demonstrating how juice jacking can be used to install malicious software. The talk is getting a lot of buzz in the media, but many security guys like me are scratching our heads wondering why this is being treated as “new” news. Granted, I can only make statements based on the abstract of the talk, but all signs seem to point to this being a regurgitation of the same type of juice jacking talks we saw at DefCon two years ago. Nevertheless, juice jacking is not only technically possible, but has been performed in the wild for a few years now. I have my own juice jacking rig, which I use for security research, and I have also retrofitted my iPad Mini with a custom forensics toolkit capable of performing a number of similar attacks against iOS devices. Juice jacking may not be anything new, but it is definitely a serious consideration for potential high-profile targets, as well as for those serious about data privacy.

Continue reading “How Juice Jacking Works, and Why It’s a Threat”

Free Download: iOS Forensic Investigative Methods

Given the vast amount of loose knowledge now out there in the community, and the increasing number of commercial tools available to conduct both law enforcement and private sector acquisition of an iOS device, I’ve decided to make my law enforcement guide, “iOS Forensic Investigative Methods”, freely available to all. The manual contains a lot of useful low-level information about the iPhone and the different artifacts commonly found while performing a forensic analysis. Chapter 3 requires a set of tools to actually image the device (which are not presently available to the public); however, there are a number of commercial and open source tools that can be substituted here to acquire a disk image, and anyone with a little experience should be able to figure out how to get a copy of their own device’s disk. There is plenty of knowledge in this manual to teach you the basics of where to find information on the device, such as various caches and other data, and what kind of evidence to look for when conducting investigations. It may also give the informal geek a peek down the rabbit hole to see just what kind of data their device stores. I decided to release this for the betterment of technical knowledge in the community, so enjoy!

PDF: iOS Forensic Investigative Methods

iOS Counter Forensics: Encrypting SMS and Other Crypto-Hardening

As I explained in a recent blog post, your iOS device isn’t as encrypted as you think. In fact, nearly everything except your email database and keychain can be (and often is) recovered by Apple under subpoena (your device is either shipped or flown to Cupertino, and a copy of its hard drive contents is provided to law enforcement). Depending on your device model, a number of existing forensic imaging tools can also be used to scrape data off of your device, some of which have been pirated in the wild. Lastly, a number of jailbreak and hacking tools and private exploits can get access to your device even if it is protected with a passcode or a PIN. The only thing protecting your data on the device is the tiny sliver of encryption applied to your email database and keychain. This encryption (unlike the encryption used on the rest of the file system) makes use of your PIN or passphrase to encrypt the data in such a way that it is not accessible until you first enter it after a reboot. Because nearly every working forensics or hacking tool for iOS 6 requires a reboot in order to hijack the phone, your email database file and keychain are “reasonably” secure under this better form of encryption.

While I’ve remarked in the past that Apple should incorporate a complex boot passphrase into their iOS full disk encryption, as they do with FileVault, it’s fallen on deaf ears, and we will just have to wait for Apple to add real security to their products. It’s also beyond me why Apple decided that your email was the only information a consumer should find “valuable” enough to encrypt. Well, since Apple doesn’t have your security in mind, I do… and I’ve put together something you can do to protect the remaining files on your device. This technique will let you turn on the same type of encryption used on your email index for other files on the device. The bad news is that you have to jailbreak your phone to do it, which reduces the overall security of the device. The good news is that the trade-off might be worth it. When you jailbreak, not only can unsigned code run on the device, but App Store applications running inside the sandbox gain access to much more personal data than they previously had, due to certain sandbox patches required to make the jailbreak work. This makes me uneasy, given the amount of spyware that’s already been found in the App Store… so you’ll need to be careful what you install if you’re going to jailbreak. The upside is that, by protecting other files on your device with Data-Protection encryption, forensic recovery becomes highly unlikely without knowledge of (or brute forcing) your passphrase. Files protected this way are married to your passphrase, so even with physical possession of your device, it’s unlikely they’d be recoverable.

Continue reading “iOS Counter Forensics: Encrypting SMS and Other Crypto-Hardening”

Introduction to iOS Binary Patching (Part 1)

Part of my job as a forensic scientist is to hack applications. When working high-profile cases, it’s not always simple to extract data right off of the file system; this is especially true if the data is encrypted or obfuscated in some way. In such cases, it’s sometimes easier to clone the file system of a device and perform what some would call “forensic hacking”; there are often many flaws within an application that can be exploited to convince the application to unroll its own data. We also perform a number of red-team pen-tests for financial/banking, government, and other customers working with sensitive data, where we (under contract) attack the application (and sometimes the servers) in an attempt to test the system’s overall security. More often than not, we find serious vulnerabilities in the applications we test. In the time I’ve spent doing this, I’ve seen a number of applications whose encryption implementations were riddled with holes, allowing me to attack the implementation rather than the encryption itself (which is much harder).

There are a number of different ways to manipulate an iOS application. I wrote about some of them in my last book, Hacking and Securing iOS Applications. The most popular (and expedient) method involves using tools such as Cycript or a debugger to manipulate the Objective-C runtime, which I demonstrated in my talk at Black Hat 2012 (slides). This is very easy to do, as the entire runtime funnels through only a handful of C functions. It’s quite simple to hijack an application’s program flow, create your own objects, or invoke methods within an application. Oftentimes, tinkering with the runtime is more than enough to get what you want out of an application. The worst example of security I demonstrated in my book was one application that simply decrypted and loaded all of its data with a single call to the application’s login function, [OneSafeAppDelegate userIsLogged:]. Manipulating the runtime will only get you so far, though. Tools like Cycript only work well at a method level. If you’re trying to override some logic inside of a method, you’ll need to resort to a debugger. Debugging an application gives you more control, but it is also an interactive process; you’ll need to repeat your steps every time you want to manipulate the application (or write some fancy scripts to do it for you). Developers are also getting a little trickier these days about implementing jailbreak detection and counter-debugging techniques, meaning you’ll have to fight through some additional layers just to get into the application.

This is where binary patching comes in handy. One of the benefits of binary patching is that the changes to the application logic can be made permanent within the binary. By changing the program code itself, you’re effectively rewriting the application. It also lets you get down to the machine instruction level and manipulate registers, arguments, comparison operations, and other granular logic. Binary patching has historically been used to break applications’ anti-piracy mechanisms, but it is also quite useful in the fields of forensic research and penetration testing. If I can find a way to patch an application to give me access to evidence it wouldn’t otherwise surrender, I can copy that binary back to the original device (if necessary) to extract a copy of the evidence for a case, or provide the investigator with a device carrying a permanently modified version of the application that they can use for a specific purpose. For our pen-testing clients, I can provide a copy of their own modified binary, accompanied by a report demonstrating how their application was compromised and how they can strengthen its security for what will hopefully be a more solid production release.
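To give a feel for what a binary patch actually is, here is a generic byte-patching sketch. The byte values are hypothetical placeholders (in practice you would pull the real instruction bytes out of a disassembler), and an iOS binary would additionally need to be re-signed or run on a jailbroken device after patching; the mechanics, though, usually boil down to finding an instruction sequence and overwriting it in place, for example swapping a conditional branch for a no-op.

    # Generic binary patching sketch: find a byte sequence in an executable
    # and overwrite it in a copy. The hex values below are placeholders; use
    # the actual instruction bytes from your disassembler. Patched iOS
    # binaries also need re-signing or a jailbroken device to run.

    import shutil

    ORIGINAL = bytes.fromhex("00f0a0e1")   # placeholder: instruction to patch out
    PATCHED  = bytes.fromhex("00f020e3")   # placeholder: replacement instruction

    def patch(src, dst, original=ORIGINAL, patched=PATCHED):
        shutil.copy2(src, dst)                     # always patch a copy
        with open(dst, "r+b") as f:
            data = f.read()
            offset = data.find(original)
            if offset == -1:
                raise ValueError("pattern not found; wrong binary or version?")
            f.seek(offset)
            f.write(patched)
            print(f"patched {len(patched)} bytes at offset {offset:#x}")

    if __name__ == "__main__":
        patch("MyApp", "MyApp.patched")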

Continue reading “Introduction to iOS Binary Patching (Part 1)”

On Expectation of Privacy

Many governments (including our own, here in the US) would have their citizens believe that privacy is a switch (that is, you either reasonably expect it, or you don’t). This has been demonstrated in many legal tests, and abused in many circumstances ranging from spying on electronic mail to drones in our airspace monitoring the movements of private citizens. But privacy doesn’t work like a switch – at least it shouldn’t for a country that recognizes privacy as an inherent right. In fact, privacy, like other components of security, works in layers. While the legal system might have us believe that privacy is switched off the moment we step outside, the intent of our Constitution’s Fourth Amendment (and our basic right, with or without it hard-coded into the Constitution) suggests otherwise; in fact, the Fourth Amendment was designed in part to protect the citizen in public. If our society can be convinced that privacy is a switch, however, then a government can make the case for flipping off that switch in any circumstance it wants. Because no one can ever practice perfect security, it’s easier for a government to simply draw a line at our front door. The right to privacy in public is one that is being very quickly stripped from our society by politicians and lawyers. Our current legal process for dealing with privacy misses one core component that adds dimension to privacy: scope. Scope of privacy is present in many forms of logic that we naturally express as humans. Everything from computer programs to our natural technique for conveying third-grade secrets (cupping our hands over our mouths) demonstrates that we have a natural expectation of scope in privacy.

Continue reading “On Expectation of Privacy”

Your iOS device isn’t as encrypted as you think

This should help clear up the common misconception that data is encrypted and secured in iOS. While it’s true that iOS does sport an encrypted file system, that file system is virtually always unlocked from the moment the operating system boots up, as the OS (and your applications) need access to it. Even when the device is locked with your PIN or passphrase, the encrypted file system is readable to the operating system – what this means is that your data is NOT encrypted using an encryption that depends on your password – at least for the most part. Apple adds a second layer of encryption, called Data-Protection, on top of this file system. Apple’s Data-Protection encryption can protect a file while the device is locked by encrypting it with a key that is only available once you’ve entered your PIN or passphrase. While a PIN can be brute forced, a passphrase is much stronger.

So what’s the problem? Well, even in the latest versions of iOS, the only files protected with this secondary encryption are your mail index, the keychain itself, and third-party application files specifically tagged (by the developer) as protected with Data-Protection. Virtually everything else (your contacts, SMS, Spotlight cache, photos, and so on) remains unprotected. To demonstrate this, I’ve put together a small recipe you can run on your own jailbroken device to bypass the lock screen. You can then use the GUI to browse through all of the data on the device without ever providing your PIN. The only things you won’t be able to access are the files I’ve just mentioned. This lock screen bypass isn’t really a vulnerability in and of itself; it’s just one of many ways I can demonstrate that you don’t need a passphrase to view the vast majority of the data on your phone.

Continue reading “Your iOS device isn’t as encrypted as you think”

On Mental Health: How Medical Privacy Laws Destroyed Dad

I don’t normally write about such personal topics as family illness, but I hope that those who have gone through a similarly dark corridor in their lives – whether as a result of government control or just plain ignorant doctors – will know they are not alone in their frustration. I also hope to explain to the generally oblivious public, and to incompetent lawmakers, the consequences of their actions.

Continue reading “On Mental Health: How Medical Privacy Laws Destroyed Dad”

OnStar Reverses Privacy Decision: Or Did They?

OnStar today announced the reversal of its original decision to keep a customer’s data connection to their vehicle active after they cancel service. The verbiage in the press release is ambiguous, however, and raises the question of whether OnStar is going to amend that specific portion of its new terms and conditions, or scrap the new terms and conditions entirely.

If OnStar is only modifying this portion of its updated terms and conditions, then a major problem still exists: namely, the updated T&C, scheduled to go into effect in December 2011, would still grant OnStar broad new rights to collect GPS positioning information on active customers “for any purpose, at any time”, and would still reserve for OnStar the right to sell access to this data to third parties.

Continue reading “OnStar Reverses Privacy Decision: Or Did They?”

OnStar Begins Spying On Customers’ GPS Location For Profit?

I canceled the OnStar subscription on my new GMC vehicle today after receiving an email from the company about its new terms and conditions. While most people, I imagine, would hit the delete button upon receiving something as exciting as new terms and conditions, being the nerd sort, I decided to have a personal drooling session and read it instead. I’m glad I did. OnStar’s latest T&C has some very unsettling updates, which include the ability to collect your GPS location information and speed “for any purpose, at any time”. They have also apparently granted themselves the ability to sell this and other personal information to third parties, including law enforcement. To add insult to a slap in the face, the company insists it will continue collecting and selling this personal information even after you cancel your service, unless you specifically shut down the data connection to the vehicle after canceling. This could mean that if you buy a used car with OnStar, or even a new one that has already been activated by the dealer, your location and other information may be tracked by OnStar without your knowledge, even if you’ve never done business with OnStar.

Continue reading “OnStar Begins Spying On Customers’ GPS Location For Profit?”