Privacy SOS

In its war against encryption, the FBI has found a backdoor to get around Congress

For years, the FBI has begged Congress to address what it says is a “going dark” problem. In July 2015, FBI director James Comey told the Senate Judiciary Committee that law enforcement needs broader powers to force companies to build “backdoors” into their encrypted systems. Members of Congress and security experts alike disagree, arguing that it’s impossible to build a backdoor for the FBI without leaving the technologies we rely upon vulnerable to attack by malicious hackers, thieves, foreign governments, and others who seek to exploit our personal information.

Now, what had long been primarily a war of words has burst into the courtroom, with precedent-setting stakes. On Tuesday, the FBI secured an order from a federal magistrate judge in California compelling Apple to write new software that will enable the FBI to download data from an encrypted iPhone. (Here’s a great technological breakdown of the issues; here’s a great legal breakdown. Read the government’s motion to force Apple to hack its products here; the judge’s order here; and Tim Cook’s letter to Apple customers here.)

The phone in question belonged to one of the San Bernardino shooters. It’s unclear exactly what data the FBI needs from the phone that it couldn’t get from Verizon, Facebook, or other providers, but its arguments to the court nonetheless prevailed. Apple has only a few more days to challenge the ruling, which relies on an 18th-century law called the All Writs Act. That statute allows courts to issue orders in situations where Congress hasn’t spoken. So while Jim Comey and his FBI failed to get Congress to authorize broad new government hacking powers that weaken digital security, the bureau has, for now, succeeded in gaining a court’s authorization to do the same, ironically based on a statute that gives courts authority to act where Congress hasn’t. (The statute makes no mention of situations in which Congress has deliberately declined to act on an issue.)

The stakes are incredibly high. As my colleague Alex Abdo told Democracy Now! this morning, in the video below, this legal battle isn’t just about the contents of one iPhone. Indeed, it’s about far more than the contents of every iPhone. If the government emerges from this fight victorious, it will mean that local police and FBI officials can force companies to hack their own products on behalf of the government. That would be devastating not just for US and world technology users, but also for US technology companies, which could no longer guarantee secure products to their users.


Here are the top five reasons the courts should reject the government’s position and vacate this judge’s order to Apple:

1. The order is unconstitutional.

As ACLU attorney Abdo argues in a Time op-ed, ordering Apple to write new software to make its own products insecure violates the company’s First Amendment rights. Code is speech. The government cannot compel speech.

[T]he government’s legal theory is unbounded and dangerous. The government believes it has the legal authority to force Apple into government service, even though the company does not actually possess the information the government is after. Of course, historically, the government has sought and obtained assistance from tech companies and others in criminal investigations—but only in obtaining information or evidence the companies already have access to.

The difference between those cases and Apple’s is a radical one. If Apple and other tech companies—whose devices we all rely upon to store incredibly private information—can be forced to hack into their customers’ devices, then it’s hard to imagine how any company could actually offer its consumers a secure product. And once a company has been forced to build a backdoor into its products, there’s no way to ensure that it’s only used by our government, as opposed to repressive regimes, cybercriminals or industrial spies.

2. “Either everyone gets security or no one does.”

Security expert Bruce Schneier says the government is wrong when it claims this order is only about one iPhone. In truth, the order, if allowed to stand, would set a precedent that could undermine the security of all American technologies.

Either everyone gets security or no one does. Either everyone gets access or no one does. The current case is about a single iPhone 5c, but the precedent it sets will apply to all smartphones, computers, cars and everything the Internet of Things promises. The danger is that the court’s demands will pave the way to the FBI forcing Apple and others to reduce the security levels of their smart phones and computers, as well as the security of cars, medical devices, homes, and everything else that will soon be computerized. The FBI may be targeting the iPhone of the San Bernardino shooter, but its actions imperil us all.

3. The FBI could probably hack the phone on its own. What it wants is legal precedent to undo commercial security technologies.

Buzzfeed’s Sheera Frenkel spoke with a number of former US national security officials who say the NSA, and probably the FBI, have the tools necessary to hack into this iPhone themselves. The agencies went to court for an order forcing Apple to do the hacking not because they need access to information on this particular phone, but because they want precedent allowing them to force companies to break their encryption technologies in routine criminal investigations as well. After all, hacking into Apple products is expensive, even for powerful, well-funded agencies like the NSA and FBI. It’s much easier for the government to simply make corporations do that work for it.

In interviews with BuzzFeed News Wednesday, the former officers with the FBI and NSA acknowledged that U.S. intelligence agencies have technology that has been used in past intelligence-gathering operations to break into locked phones. The question, they added, was whether it was worthwhile for the FBI to deploy that technology, rather than setting a precedent through the courts.

“There are capabilities that the U.S. government has, that are used for intelligence collecting only and that wouldn’t be used for a criminal matter because they would have to come up in open court,” said Austin Berglas, a former Assistant Special Agent in charge of the FBI’s New York Cyber Branch who is now head of Cyber Investigations and Incident Response at the private consultancy firm K2 Intelligence.

4. We trust automatic updates to keep us secure. If allowed to stand, this order would mean we couldn’t trust those updates—nor our devices.

Security updates are automatically pushed to computers, phones, and internet-of-things devices, including medical implants and cars. We rely on those updates to keep us secure from hackers and thieves. If the FBI gets its way in this case, we will no longer be able to trust the updates that arrive signed by Microsoft, Apple, or Google.

Nicholas Weaver explains:

The request to Apple is accurately paraphrased as “Create malcode designed to subvert security protections, with additional forensic protections, customized for a particular target’s phone, cryptographically sign that malcode so the target’s phone accepts it as legitimate, and run that customized version through the update mechanism”.  (I speak of malcode in the technical sense of “code designed to subvert a security protection or compromise the device”, not in intent.)

The same logic behind what the FBI seeks could just as easily apply to a mandate forcing Microsoft, Google, Apple, and others to push malicious code to a device through automatic updates when the device isn’t yet in law enforcement’s hand.  So the precedent the FBI seeks doesn’t represent just “create and install malcode for this device in Law Enforcement possession” but rather “create and install malcode for this device”.

Let us assume that the FBI wins in court and gains this precedent.  This does indeed solve the “going dark” problem as now the FBI can go to Apple, Cisco, Microsoft, or Google with a warrant and say “push out an update to this target”.  Once the target’s device starts running the FBI’s update then encryption no longer matters, even the much stronger security present in the latest Apple devices. 
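The trust model Weaver describes comes down to a single check on the device: install anything that bears a valid vendor signature, reject everything else. The sketch below is a toy illustration of that check, not Apple's actual mechanism; real update channels use asymmetric signatures (such as RSA or ECDSA) verified against a public key baked into the device, and the shared HMAC key and names here are hypothetical stand-ins.

```python
import hmac
import hashlib

# Toy model of a signed-update check. Real vendors use asymmetric
# signatures verified with a public key burned into the device; an
# HMAC shared key stands in here purely for illustration.
VENDOR_KEY = b"hypothetical-vendor-signing-key"

def sign_update(payload: bytes) -> bytes:
    """What the vendor's build system does: sign the update blob."""
    return hmac.new(VENDOR_KEY, payload, hashlib.sha256).digest()

def device_accepts(payload: bytes, signature: bytes) -> bool:
    """What the device does: install only if the signature verifies."""
    expected = hmac.new(VENDOR_KEY, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

update = b"security patch 1.2.3"
sig = sign_update(update)
assert device_accepts(update, sig)          # a signed update installs
assert not device_accepts(b"tampered", sig) # anything else is rejected
```

The point of the sketch is that the device asks only one question: did the vendor sign this? It has no way to ask whether the signed code is a security patch or the “malcode” Weaver describes, which is why a court order compelling the vendor to sign turns the update channel itself into the attack vector.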

5. These powers start with extreme cases, but will end up impacting everyone.

As with the USA Patriot Act’s “sneak and peek” warrants, the use of secretive cell site simulators called stingrays, and no-knock SWAT raids, emergency powers that government agencies first justify by pointing to extreme situations inevitably trickle down to low-level enforcement and often become deeply ingrained in domestic police practice. What’s justified here in the name of protecting national security from terrorists will ultimately be used to arrest people for using and selling drugs, or for even more minor offenses, like refusing to pay your student loans. This case isn’t about one phone, and it’s not about terrorism. It’s about giving the government the power to force technology companies to do its work for it: hacking into our private machines and handing the government full control of them. It starts with this one phone in the hands of the FBI, but it will not end there.

Update (2/19/16): I appeared on WGBH’s Greater Boston last night to discuss these issues. See the video below.


© 2024 ACLU of Massachusetts.