As he unveiled the iPhone 6 and the iOS 8 operating system, CEO Tim Cook made a very bold claim: Apple's new encryption system means the company can't read the data on our phones, even if the police try to compel it to do so. Law enforcement had a predictable freak-out in the wake of the announcement. Privacy activists cheered. But what does Apple's new encryption policy really mean for phone users' privacy?
It depends on your adversary, and the kind of information you want to protect. The new encryption system will guard against a variety of potential snoops, including bad boyfriends, criminal hackers, and other jerks. It may even protect phones from unwarranted police searches. But that's sort of beside the point if you want to protect your text messages and records of your movements. After all, as Marcy Wheeler says, lots of the information police are most interested in is stored by our cell phone providers: who we call, where we go, and when. Cops routinely obtain this information from our providers without warrants, and there's nothing Apple can or will do about that.
On the other hand, AT&T and Verizon don't store the photos we take and keep locally on our devices. To my knowledge, these companies don't store detailed information on our app use, or the contents of our emails when they are stored locally on our phones. So as long as we don't use iCloud, those kinds of private things are a lot more secure today than they were last week.
But there are other issues at play here that complicate Apple's announcement, among them the border search and police use of cell phone sniffers.
While the Supreme Court's 2014 decision in Riley v. California affords US persons the right to keep our cell phones private from law enforcement absent a warrant, even subsequent to arrest, it says nothing about the lawless border regions. The federal government claims we have no Fourth Amendment rights within 100 miles of any land or sea border, including at airports. For years now, DHS officials have used that totalitarian power to conduct warrantless searches of phones and laptops, bringing them into back rooms where they presumably sometimes try to make copies of people's hard drives. All of this happens with minimal—if any—oversight.
One of the tools the feds likely use in those back rooms at airports and border crossings throughout the United States is made by a corporation called Cellebrite, which sells phone data extraction tools to every level of law enforcement. These devices, one of which is called the "UFED", allow police or federal agents to directly siphon the contents of cell phones, sometimes even bypassing encryption locked with a user PIN. The UFED page on Cellebrite's website boasts that the technology can perform "Physical extraction and decoding while bypassing pattern lock / password / PIN from Android devices including Samsung Galaxy S family, LG, HTC, Motorola, and more." Can the device break Apple's new encryption, or get past the passcode with a brute-force-style attack? It's too early to tell. But the company says law enforcement should rest easy with its purchase because the spy firm makes "[f]requent updates to ensure compatibility with new phones as they enter the market."
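To see why a four-digit PIN is so fragile against this kind of offline brute-force attack, here's a toy sketch in Python. This is not Apple's or Cellebrite's actual scheme, and every name and constant in it is illustrative; what makes real iOS passcodes hard to guess is that the key derivation is additionally entangled with a hardware-bound device key and rate-limited with escalating retry delays, so guessing can't be run off the device at full speed:

```python
import hashlib

DEVICE_SALT = b"illustrative-device-salt"  # hypothetical; real salts are per-device
ITERATIONS = 1_000  # real systems use far more key-derivation iterations

def derive_key(pin: str) -> bytes:
    """Derive an encryption key from a numeric PIN (toy model)."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), DEVICE_SALT, ITERATIONS)

def brute_force(target_key: bytes) -> str:
    """Try all 10,000 four-digit PINs until one reproduces the key."""
    for guess in range(10_000):
        pin = f"{guess:04d}"
        if derive_key(pin) == target_key:
            return pin
    return ""

victim_key = derive_key("0042")
print(brute_force(victim_key))  # prints 0042 almost instantly
```

Ten thousand candidates is a trivially small search space no matter how slow the hash, which is why a key that depends only on a short PIN offers little protection once an attacker can run the derivation off the device.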
That raises the question: Is Tim Cook telling the truth when he says that Apple can't decrypt our phones? If there is truly no "backdoor" into the phones, could the new encryption system block technologies like Cellebrite's UFED? If so, it's understandable that lots of federal agents are very upset right now. A huge trove of "free", no-warrant-required surveillance information was just snatched from their grasp.
But then there's the cell phone sniffer, known as a Stingray or IMSI catcher. These devices enable law enforcement (or anyone else) not only to surreptitiously track the location of cell phones, but also to intercept and even modify their communications. Will Apple's encryption system stop the FBI from sucking up the contents of your communications if you're sending sexts to your boyfriend while agents are parked down the street with an IMSI catcher targeting your phone? No.
So is Apple's announcement a good thing for privacy, overall? Definitely. Does it protect your private cell phone information from being disclosed to the police? For many types of sensitive information, not at all. For photos and other files stored locally, potentially—as long as there isn't a cop outside with a Stingray, waiting for you to start taking pictures or sending emails.
Building digital privacy protections into technologies is exactly what companies should be doing, and Apple deserves credit for this bold move. But technical measures alone won't cut it.
If we want to make sure our private lives stay private at the border and everywhere else, we need to change US surveillance law at the federal and state level to reflect basic Fourth Amendment values. If cops can simply file a subpoena to obtain our call and location records, they don't need to bother with searching our physical phones. And if the FBI or our local police departments can just haul out a Stingray to suck up the contents of our devices, Apple's privacy maneuver won't mean much for the sanctity of our private lives. While the company deserves credit for doing its part, these are political problems that Apple alone cannot fix.