The FBI has been in the news recently with its campaign to force technology companies to modify their encryption methods to make it easier to snoop on users’ messages. According to the FBI, this is a commonsense change: after all, criminals rely on these encrypted services to communicate secretly, and the Constitution grants the government the power to seize information (like letters and documents) when a lawful warrant is presented.
So why not give the FBI the ability to read encrypted documents?
If we fail to take these steps, the FBI warns, criminals and terrorists will continue to ‘go dark’ – to move their traffic to anonymous messaging platforms that cannot be effectively surveilled. This, they say, leaves the balance of power in the hands of bad people. The government of the United Kingdom has made similarly dire warnings.
According to FBI director James Comey:
“ISIL’s M.O. is to broadcast on Twitter, get people to follow them, then move them to Twitter Direct Messaging” to evaluate if they are a legitimate recruit, he said. “Then they’ll move them to an encrypted mobile-messaging app so they go dark to us.”
The FBI isn’t making idle chatter, either. The next battle in the privacy war is being fought, largely in secret, at Apple, Inc. Its CEO, Tim Cook, has been making increasingly pointed pronouncements about user privacy, including the following:
“[T]here were rumors and things being written in the press that people had backdoors to our servers. None of that is true. Zero. We would never allow that to happen. They would have to cart us out in a box before we would do that.”
Rumors have been circulating for a while that the FBI is attempting to pressure the company to add “back-doors” to the encryption in its products and services (like the iPhone and iMessage). Now, there’s some public evidence that this is occurring. The Department of Justice has issued a court order to Apple, demanding that it turn over real-time message logs between two suspects in a gun-and-drug-crime case. Apple has refused, saying that even it cannot crack user encryption – after all, that’s the point.
The FBI and the DoJ have responded that they are considering taking Apple to court, presumably to obtain an order compelling the company to backdoor its encryption. ZDNet warns that if this happens, Apple could be forced to capitulate. It wouldn’t be the first time something like this has happened: in 2014, Yahoo was finally permitted to discuss its secret FISA court battle over PRISM, the government surveillance program later revealed by Edward Snowden. When Yahoo refused to turn over user information, it faced huge, secret fines – $250,000 per day, doubling every month. For context, the daily fine would have exceeded the world’s entire $74 trillion GDP in just over two years.
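That “just over two years” figure is easy to sanity-check. Here is a quick back-of-the-envelope calculation, assuming a fine starting at $250,000 per day that doubles each month, measured against a roughly $74 trillion world GDP:

```python
# Sanity check: how many months until a $250,000/day fine,
# doubling monthly, exceeds world GDP (~$74 trillion)?
daily_fine = 250_000
world_gdp = 74_000_000_000_000

months = 0
while daily_fine <= world_gdp:
    daily_fine *= 2   # the fine doubles every month
    months += 1

print(months, round(months / 12, 1))  # about 29 months, i.e. just over 2.4 years
```

Exponential doubling is what makes the threat so potent: the fine is modest for the first year, then becomes civilization-scale shortly after the second.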
Even ignoring the various moral and constitutional concerns with this sort of thing, there are also plenty of technical problems with the proposal. Since the FBI began pushing for their encryption backdoors, a number of security researchers have come forward to point out some basic flaws in the whole concept.
For starters, virtually any mainstream security expert will tell you that the “secure backdoor” Comey wants does not exist. The director of the Federal Trade Commission has said outright that the proposal is a bad idea. There is no way to build a backdoor into a strong encryption scheme without seriously weakening its overall security: a cryptosystem that admits only the “good guys” simply does not exist, and one cannot be willed into existence.
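The structural problem is easy to see in miniature. Below is a toy sketch (deliberately NOT real cryptography – a one-time-pad-style XOR cipher) of a messaging system with a mandated key escrow, the mechanism backdoor proposals typically amount to. The escrow store becomes a single point of failure: one breach of it decrypts every message ever sent:

```python
# Toy illustration only -- an XOR "cipher" with mandated key escrow.
# The point: the escrow database is a single point of failure for ALL traffic.
import secrets

escrow_store = {}  # the mandated backdoor: every session key is deposited here


def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(d ^ k for d, k in zip(data, key))


def send_message(msg_id: str, plaintext: bytes) -> bytes:
    key = secrets.token_bytes(len(plaintext))  # fresh random session key
    escrow_store[msg_id] = key                 # backdoor requirement
    return xor_bytes(plaintext, key)           # ciphertext goes over the wire


# Two "secure" messages are sent.
c1 = send_message("m1", b"meet at noon")
c2 = send_message("m2", b"wire the funds")

# An attacker who steals the escrow store -- one breach -- reads everything.
stolen = dict(escrow_store)
assert xor_bytes(c1, stolen["m1"]) == b"meet at noon"
assert xor_bytes(c2, stolen["m2"]) == b"wire the funds"
```

Without the escrow line, losing one session key exposes one message; with it, losing one database exposes them all. That asymmetry is what researchers mean when they say a backdoor damages the security of the whole system, not just the targeted conversation.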
TechDirt has a pretty wonderful article ripping this idea apart. A choice quote:
“[I]t’s fairly stunning that Comey keeps insisting that those bright minds in Silicon Valley can sprinkle some magic pixie dust and give him what he wants, but at the same time claims it’s too difficult for the FBI to actually quantify how big a problem encryption is for its investigations. Furthermore, he can’t even provide a single real world example for where encryption has been a real problem.”
Even if such a system could be created, there’s another serious flaw: we’d be giving that key to the government. The same government that can’t even keep its own personnel records safe, and went months, if not years, without noticing it had been hacked. Edward Snowden, a contractor, walked out with details of the NSA’s most sensitive dealings. How many hours do you think it would take for the Chinese government to obtain a copy of the key? How many days until it shows up on the dark net and any half-competent hacker can access anyone’s bank account? The computer security standards within the American government aren’t even close to good enough to entrust them with the security of the entire Internet.
There is a larger issue, however, that goes straight to the heart of the whole argument: namely, that these sorts of backdoors don’t actually resolve the problem the FBI is supposedly concerned about. The FBI’s justification hinges on serious threats – not drug deals and purse-snatchers, but terrorists and human traffickers. Unfortunately, these are the people who will be least affected by these backdoors.
Terrorists and criminals are not ignorant of encryption. Those who use encryption do so on purpose, to avoid surveillance. It’s naive to think they won’t notice when the FBI succeeds in getting some kind of backdoor. Terrorists and arms dealers aren’t just going to keep using a backdoored iMessage – they’ll switch to encrypted chat programs developed in other countries, or to open-source software whose security they can verify. The ones left vulnerable to the backdoors will be those too uninformed to use better computer security – that is, petty criminals and most law-abiding citizens.
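This is the crux: the building blocks of strong encryption are public knowledge and cannot be recalled. As a simplified, unaudited illustration (a toy design, not something to actually protect secrets with), here is a working stream cipher built from nothing but Python’s standard library, using SHA-256 in counter mode to generate a keystream:

```python
# Toy sketch: a stream cipher built from a public hash function (SHA-256 in
# counter mode), using only the standard library. Not an audited design --
# the point is that strong-crypto building blocks are freely available to all.
import hashlib


def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream by hashing key || counter."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def encrypt(key: bytes, plaintext: bytes) -> bytes:
    ks = keystream(key, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))


decrypt = encrypt  # XOR stream cipher: the same operation both ways

key = b"a shared secret no third party holds"
ciphertext = encrypt(key, b"attack at dawn")
assert decrypt(key, ciphertext) == b"attack at dawn"
assert ciphertext != b"attack at dawn"
```

If a motivated amateur can assemble something like this in twenty lines, a motivated adversary can certainly obtain (or audit) real, backdoor-free encryption from outside US jurisdiction. Mandated backdoors only bind the software that ordinary people use by default.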
Cryptographic backdoors are much more useful for snooping on your grandmother than they are for snooping on ISIL, and the FBI very likely knows this. So take their dire warnings about terrorists with a grain of salt about the size of Utah.
The FBI will argue of course, that there are no privacy issues raised for law-abiding citizens. After all, they would still need a warrant in order to access this information, just as they would to search your house or filing cabinet. However, these are not the uninformed, innocent days before the Snowden leaks.
We know about the existence of secret courts that issue secret warrants based on secret evidence. Those courts don’t refuse to issue warrants, because that’s not their purpose. They are rubber stamps in all but name. Asking us to trust our privacy to such a system is actively insulting.
Here’s security expert Bruce Schneier expressing a similar view on his blog, Schneier on Security:
“Imagine that Comey got what he wanted. Imagine that iMessage and Facebook and Skype and everything else US-made had his backdoor. The ISIL operative would tell his potential recruit to use something else, something secure and non-US-made. Maybe an encryption program from Finland, or Switzerland, or Brazil. Maybe Mujahedeen Secrets. Maybe anything.”
A History of Surveillance
This is not the first time government agencies have tried something like this. In the 1990s, the Clinton administration attempted to force the tech industry to install surveillance hardware in their devices – the so-called “Clipper chip,” which would allow government agencies to circumvent strong encryption.
In that case, too, vulnerabilities were found in the system, making the devices less secure, and allowing criminals to circumvent it in any case. The proposal was defeated. In an analysis called “Keys Under Doormats: Mandating insecurity by requiring government access to all data and communications,” more than a dozen security researchers express the view that the new proposal would be even worse. From the abstract:
“Twenty years ago, law enforcement organizations lobbied to require data and communication services to engineer their products to guarantee law enforcement access to all data. After lengthy debate and vigorous predictions of enforcement channels going dark, these attempts to regulate the emerging Internet were abandoned.
In the intervening years, innovation on the Internet flourished, and law enforcement agencies found new and more effective means of accessing vastly larger quantities of data. Today we are again hearing calls for regulation to mandate the provision of exceptional access mechanisms.
In this report, a group of computer scientists and security experts, many of whom participated in a 1997 study of these same topics, has convened to explore the likely effects of imposing extraordinary access mandates. We have found that the damage that could be caused by law enforcement exceptional access requirements would be even greater today than it would have been 20 years ago.”
Too Much for Too Little
To sum up: encryption backdoors are a bad idea, technically and practically. They do not solve law enforcement’s big problems, but they do create new ones for consumers and anyone else reliant on security. Forcing them on the industry will be expensive beyond belief, and we will get almost nothing in return. They’re a bad idea, proposed in bad faith. And, with any luck, the growing clamor of voices against the idea will put a stop to them once again.
What do you think? Should the government have the power to compromise encryption? Let us know your thoughts in the comments!
Image Credits: blocking the doorway by Mopic via Shutterstock