When asked directly if the FBI wants a backdoor, [Amy] Hess [Asst. Director of FBI’s Science & Technology branch] dodged the question and did not describe in detail what actual solution the FBI is seeking.
“We are simply asking for information that we seek in response to a lawful order in a readable format,” Hess responded, while also repeating that the Bureau supports strong encryption. “But how that actually happens should be the decision of the provider.”
When pressed again, Hess said that it would be okay for the FBI not to have a key to decrypt data, if the provider “can get us that information by maintaining the key themselves.”
That’s asking the impossible — for a great many reasons. First and foremost, compromised encryption is compromised encryption. It can be exploited by criminals and other unwanted entities just as certainly as it can assist law enforcement agencies in obtaining the information they’re seeking. There’s no way around this fact. You cannot have “good guys only” encryption.
But beyond that, even if the FBI manages to get what it wants, it will do so at the expense of general computing. Requiring built-in backdoors or key escrow would dismantle the very systems they're meant to access. Computer scientist Jonathan Mayer delivers a long, detailed hypothetical involving the Android platform and how the FBI's desired access would fail — and do severe collateral damage — every step of the way. (via Boing Boing)
First off, if Google gives the FBI the backdoors it wants, that only nails down Google. But Google also distributes thousands of third-party apps through its Play store. And these apps may not contain the subverted encryption the FBI is looking for. Now, Google has to be in the business of regulating third-party apps to ensure they meet the government’s standard for compromised encryption.
The obvious answer is that Google can’t stop with just backdooring disk encryption. It has to backdoor the entire Android cryptography library. Whenever a third-party app generates an encrypted blob of data, for any purpose, that blob has to include a backdoor.
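The escrow scheme Mayer describes has a simple, and simply dangerous, structure. A toy sketch of what a backdoored crypto library would look like, using a SHA-256 keystream as a stand-in for a real cipher (everything here, including `ESCROW_KEY` and the function names, is illustrative and not any actual Android API):

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy XOR stream cipher (SHA-256 in counter mode). NOT real crypto;
    just a runnable stand-in so the escrow structure below is concrete."""
    out = bytearray()
    for block in range((len(data) + 31) // 32):
        pad = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        chunk = data[block * 32:(block + 1) * 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

# Hypothetical government escrow key, baked into the platform library.
ESCROW_KEY = secrets.token_bytes(32)

def escrowed_encrypt(user_key: bytes, plaintext: bytes) -> dict:
    """Every blob carries a second copy of the message key, wrapped
    for the escrow holder: the backdoor inside each encrypted blob."""
    msg_key = secrets.token_bytes(32)
    return {
        "ciphertext": keystream_xor(msg_key, plaintext),
        "wrapped_for_user": keystream_xor(user_key, msg_key),
        "wrapped_for_escrow": keystream_xor(ESCROW_KEY, msg_key),  # the backdoor
    }

def escrow_decrypt(blob: dict) -> bytes:
    """Whoever holds ESCROW_KEY, lawfully or not, reads every blob."""
    msg_key = keystream_xor(ESCROW_KEY, blob["wrapped_for_escrow"])
    return keystream_xor(msg_key, blob["ciphertext"])

blob = escrowed_encrypt(secrets.token_bytes(32), b"meet at noon")
assert escrow_decrypt(blob) == b"meet at noon"
```

The point of the sketch is structural: one escrow key decrypts every blob from every user, so a single leak, insider, or hostile intelligence service breaking it is equivalent to breaking the whole platform.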
That move might work, but it only covers apps that use Google's encryption library; other apps may ship their own. Then what? The government has a few options, all of them carrying horrendous implications.
One option: require Google to police its app store for strong cryptography. Another option: mandate a notice-and-takedown system, where the government is responsible for spotting secure apps, and Google has a grace period to remove them. Either alternative would, of course, be entirely unacceptable to the technology sector—the DMCA’s notice-and-takedown system is widely reviled, and present federal law (CDA 230) disfavors intermediary liability.
At this point, Mayer suggests, the "solution" has already left the realm of political feasibility. Would the FBI really push this far to obtain encryption backdoors? The Bureau itself seems unsure how far it's willing to go, and many officials (like the one quoted above) seem to believe it merely needs to be insistent enough, and techies will conjure some magical computing solution that preserves the protective qualities of encryption while letting the government open the door and look around any time it wants to.
So, even if the FBI is willing to travel this very dark road, littered with untold collateral damage, it still can't ensure the phones it encounters will open at its command. Because phone users could still acquire apps from other sources, the government's reach would extend only as far as the heavily policed official app store (and other large competitors' app stores). Now what? More government power and less operational stability.
The only solution is an app kill switch. (Google’s euphemism is “Remote Application Removal.”) Whenever the government discovers a strong encryption app, it would compel Google to nuke the app from Android phones worldwide. That level of government intrusion—reaching into personal devices to remove security software—certainly would not be well received. It raises serious Fourth Amendment issues, since it could be construed as a search of the device or a seizure of device functionality and app data. What’s more, the collateral damage would be extensive; innocent users of the app would lose their data.
Even if the government were willing to take it this far, that still doesn't eradicate apps it can't crack. (Though backdooring only the most-used apps may be all it's really looking to achieve…) App creators could sidestep Google's government-walled garden and mandated kill switch by assigning users random identifiers and moving most of the app's functionality (messaging, for example) to a website, out of reach of app removal tools and government intervention. To stop this, the US government would need to do the previously unimaginable:
In order to prevent secure data storage and end-to-end secure messaging, the government would have to block these web apps. The United States would have to engage in Internet censorship.
Robert Graham at Errata Security makes similar arguments in his post on the subject, but raises a couple of other points that are interesting in the horrific-train-wreck sense of the word. While the government may try to regulate the internet, it can't (theoretically) touch services hosted in foreign countries. (Although it may soon be able to hack away at them with zero legal repercussions…)
Such services could be located in another country, because there are no real national borders in cyberspace. In any event, such services aren’t “phone” services, but instead just “contact” services. They let people find each other, but they don’t control the phone call. It’s possible to bypass such services anyway, by either using a peer-to-peer contact system, or overloading something completely different, like DNS.
Like crypto, the entire Internet is based on the concept of end-to-end, where there is nothing special inside the network that provides a service you can regulate.
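Graham's aside about "overloading" DNS can be made concrete: arbitrary data, such as a peer's address and key fingerprint, can be packed into DNS-legal hostname labels, so an ordinary lookup against a cooperating nameserver doubles as a contact channel with no "phone service" anywhere to regulate. A minimal sketch of the encoding side, under the assumption of a zone (`example.com` here as a placeholder) whose nameserver the peers control:

```python
import base64

MAX_LABEL = 63  # RFC 1035: each dot-separated DNS label is at most 63 bytes

def encode_rendezvous(payload: bytes, zone: str = "example.com") -> str:
    """Pack arbitrary bytes into DNS-legal labels under a controlled zone.
    Merely querying the resulting name delivers the payload to that
    zone's nameserver, piggybacking on ordinary DNS resolution."""
    b32 = base64.b32encode(payload).decode().rstrip("=").lower()
    labels = [b32[i:i + MAX_LABEL] for i in range(0, len(b32), MAX_LABEL)]
    return ".".join(labels + [zone])

def decode_rendezvous(name: str, zone: str = "example.com") -> bytes:
    """Reverse the encoding on the nameserver side."""
    data = name[: -len(zone) - 1].replace(".", "").upper()
    data += "=" * (-len(data) % 8)  # restore stripped base32 padding
    return base64.b32decode(data)

contact = b"203.0.113.7:4433|fp=9f86d081"
name = encode_rendezvous(contact)
assert decode_rendezvous(name) == contact
```

Blocking this means blocking DNS lookups by content, which is exactly the internet censorship Mayer warns about.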
The FBI likely has no desire to take its fight against encryption this far. The problem is that it thinks its “solution” to encryption is “reasonable.” But it isn’t.
The point is this. Forcing Apple to insert a “Golden Key” into the iPhone looks reasonable, but the truth is the problem explodes to something far outside of any sort of reasonableness. It would mean outlawing certain kinds of code — which is probably not possible in our legal system.
The biggest problem here is that no one arguing for “golden keys,” key escrow, “good guy” backdoors, etc. seems to have any idea what implementing this could actually result in. They think it’s just tech companies sticking it to The Man, possibly because a former NSA sysadmin went halfway around the world with a pile of documents and a suitcase of whistles with “BLOW ME” printed on the side.
But it isn’t. And their continual shrugged assertion that the “smart guys” at tech companies will figure this all out for them is not only lazy, it’s colossally ignorant. There isn’t a solution. The government can’t demand that companies not provide encryption. It’s not willing to ban encryption, nor is it in any position to make that ban stick. It doesn’t know what it needs. It only knows what it wants. And it can’t have what it wants — not because no one wants to give it to them — but because no one can give it to them.
Yes, many tech companies are far more wary of collaborating with the government in this post-Snowden era, but in this case, the tech world cannot give the FBI what it wants without destroying nearly everything surrounding the “back door.” And continually trotting out kidnappers, child porn enthusiasts and upskirt photographers as reasons for breaking cell phone platforms doesn’t change the fact that it cannot be done without potentially harming every non-criminal phone owner and the services they use.