In the wake of the Snowden disclosures, U.S. tech companies faced a crisis of user confidence. Europeans, in particular, were outraged to learn that the companies behind their favorite gadgets and edge services had been turning over troves of personal information to the U.S. government, without regard for individual privacy, all in the name of protecting national security. Under mounting pressure from users, major players like Apple and Google responded by building strong, end-to-end encryption into their devices and services.
These moves were extremely popular with users, who saw no reduction in service quality and had their fears of government surveillance and privacy violations somewhat assuaged, but they were not popular with everyone. Law enforcement and intelligence agencies, in particular, were infuriated that they would no longer be able to obtain users' messages and personal data at will. Thus, in multiple hearings before Congress, FBI Director James Comey repeatedly pleaded for a legislative or regulatory fix that would outlaw end-to-end encryption and force tech companies to install mandatory backdoors and other security loopholes (the kind of vulnerabilities exploited in massive hacks like the one on Apple's iCloud) that the government could hold the keys to and exploit whenever needed.
In theory, such access could help law enforcement and intelligence agencies do their jobs more efficiently and better protect the U.S. from future attacks. But the inconvenient truth is this: there is no such thing as a data security vulnerability that only the "good guys" can access. Installing such vulnerabilities might enable better surveillance, but it would also expose the companies, and their users, to a greatly increased risk of data breaches and other attacks, potentially putting valuable trade secrets, intellectual property, and private personal data in the hands of malicious actors (state-backed or otherwise).
Caught between a rock (weakening national security by allowing strong encryption) and a hard place (protecting national security at the expense of personal privacy and economic security), President Obama initially sat on his hands. Numerous coalition letters and petitions urged him to support strong encryption, which many consider the Internet's equivalent of the Second Amendment, and he eventually agreed not to seek legislation mandating data security vulnerabilities. After months of silence, that decision came as a great relief to many U.S. tech companies, which rightly lauded the move. After all, once the inconvenient truth is accepted as such, the only good answer is to let tech companies use strong encryption to protect their data and the data of their users. Law enforcement and intelligence agencies will find other ways to meet their responsibilities, namely, hiring sophisticated hackers to break through even the strongest forms of encryption and giving them the tools needed to do so (like the exascale supercomputer President Obama recently commissioned). Of course, such a strategy inevitably leads to a computing arms race between rival countries, but, like it or not, that race is already well underway.
However, while those who favor this course of action were pleased to hear President Obama refrain from calling for legislation to mandate backdoors in tech products, that is hardly the end of the story. President Obama (or his successor) may still seek encryption and data security standards with built-in vulnerabilities through other means. In particular, while the Federal Trade Commission has in recent years asked Congress for specific rulemaking authority to develop such standards, the Federal Communications Commission, by way of its recent reclassification of ISPs as Title II common carriers, now has general administrative rulemaking authority that it could use to issue encryption and data security standards governing the devices and services consumers use on the Internet. The FCC is widely expected to launch a privacy rulemaking later this year, and given the shifting sands of the current regulatory landscape, it is anyone's guess how it will end up. We may wind up with mandatory backdoors and other security vulnerabilities after all, with or without legislation.
Tom Struble is Policy Counsel at TechFreedom and a member of the inaugural class of Foundry Fellows at the Internet Law & Policy Foundry. Struble obtained his J.D. from the George Washington University Law School, and B.A.s in Psychology and Political Science from the University of Kansas. He resides in Washington, DC and is an active member of the District of Columbia Bar.