The Decryption Dilemma
May 2022
Examining Whether the Government Should Have the Right to Obtain Decryption Keys
Technology is the new overlord. Across the cornucopia of platforms, services, software, and digital resources that billions access and rely on daily, cyberspace’s importance has all but eclipsed its real-life alternative. These titans of industry – Microsoft, Alphabet, Meta, Amazon, Apple and more – flaunt market capitalizations and revenues of scarcely fathomable scale, revolutionizing every fundamental aspect of the human experience and distilling it to a state of permanent digital being.
Yet, despite the sheer magnitude of their presence, giant tech companies are perpetually under fire from governments. Whether the issue at hand concerns censorship, disinformation, or cultural conflict, tech companies have exited their adolescence to become that which the government seeks to regulate as much as possible – a fate inevitable for any technological phenomenon.
What makes this dynamic even more fascinating is that the citizenry of, say, Facebook or Google knows no boundaries. Granted, the companies are American, thanks to the freedoms that allowed them to flourish, but they comprise offices and users from every corner of the world. Tech giants are obligated to serve the interests of the countries in which they operate only insofar as those interests align with those of their users and customers: when all is said and done, they are first and foremost businesses.
The quagmire of a relationship between tech companies, their users, and government agencies comes to a climactic head when the following question is posited: Should governments be able to request decryption keys on protected devices or platforms, even in the presence of probable cause for a search?
While this paper does not draw on an advanced knowledge of legal history or precedent, this question is emblematic of some of the predominant themes of our zeitgeist. Civil liberties groups, allying themselves with a John Perry Barlow-esque, sovereignty-free conception of cyberspace, exhort tech companies and individuals not to hand over decryption keys to government actors; simultaneously, compelling arguments surrounding national security interests and the preclusion of threats can be made for why the government should have access to decryption keys.
While both these claims have validity, for reasons outlined in the following sections, the government should not have ubiquitous, consistent access to decryption keys, and should only be granted them as a last resort under the stringency of “strict scrutiny”.
In understanding why the government should not have access to decryption keys, one must begin by articulating the government’s goal. The ideological skirmish originates with the following proposition, which has gained urgency as cyberspace has become impossible to avoid: those who manufacture encryption software on any device should program in a “backdoor” through which the government can access the encrypted, protected information.
The origins of this proposal in political rhetoric trace back to the development of the “Clipper Chip” in 1993: a tool manufactured by the National Security Agency (NSA) which would give the Clinton administration access to protected digital voice communications. Pioneering encryption and internet experts of the time voiced vehement opposition to the Clipper Chip, arguing that its installation would set a dangerous precedent of government surveillance. Who is to say that the government should have access to privileges concerning citizens’ own information that the people themselves do not have? Why should the government enjoy the ability to spy on its citizens while it itself enjoys a paramount level of encryption? Moreover, the government could not force individuals to install Clipper Chips onto existing devices, so how would it account for the plethora of technology already in use? To no great surprise, the Clipper Chip was quietly withdrawn three years later amidst the vitriol. The episode was nonetheless essential in crystallizing the debate over whether the government should have access to encrypted, protected data, and is forever enshrined in the annals of cyberspace.
The progression of thought surrounding why the government should not have access merits examining key cases pertaining to search and seizure within cyberspace. The foundational, historical bedrock for what is eligible for search and seizure by the government is the Fourth Amendment of the U.S. Constitution:
“The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.”
As the Amendments predate any remote notion of hyper-advanced technology by centuries, it is a fascinating exercise to apply their language to the cyber realm. There is much to dissect: in the digital world, what constitutes “papers and effects”? How do the physical locations named – “houses” and “the place to be searched” – translate to the nebulous, inchoate cyber realm? Moreover, does the efficacy, or lack thereof, of the Fourth Amendment in cyberspace empower the government to access decryption keys?
One case that brought these inquiries to the forefront was Carpenter v. United States (2018). Examining Carpenter also allows us to better understand the privacy provisions in place regarding data, as well as a crucial legal phenomenon known as the “third-party doctrine”.
In Carpenter, the issue at hand was whether the “warrantless search and seizure of cell-phone records, which include the movements and locations of the cell phone users, violate the Fourth Amendment”. It emerged following the apprehension of four men connected to a string of armed robberies at mobile phone stores; as the men were arrested, the Federal Bureau of Investigation (FBI) requested that a judge grant them transactional, location-based records belonging to the men under the Stored Communications Act, including those of the eponymous Timothy Carpenter. Carpenter challenged this measure, citing a violation of the Fourth Amendment.
The Supreme Court of the United States, led by Chief Justice John Roberts, issued a 5-4 majority ruling concluding that the Fourth Amendment had indeed been violated and that the actions of the FBI constituted unreasonable search and seizure. Henceforth, the government would need (a) a search warrant that is (b) supported by probable cause to access information of this sensitivity. Furthermore, the Court declined to extend the aforementioned “third-party doctrine”, which holds that when consumers voluntarily cede their data to third-party enterprises and services, their expectation of “reasonable privacy” has been forfeited. While Associate Justices Clarence Thomas and Samuel Alito filed dissenting opinions, arguing that the records were the property of the cell phone carriers and that Congress should adjudicate such instances with the third-party doctrine front of mind, an intriguing precedent surrounding what can be searched and seized took shape. One line in the ruling particularly stands out: “Expectations of privacy in this age of digital data do not fit neatly into existing precedents.”
Understanding the Carpenter case is vital to the question of whether companies and citizens should hand over encrypted data to the government, as it underscores that clear limitations, definitions, and expectations for how search warrants operate in the digital epoch are necessary, and that the provisions of the Fourth Amendment still resonate in cyberspace. Furthermore, when you upload a facsimile of your life and personal information onto a technology company’s servers, the third-party doctrine holds that your expectation of privacy diminishes.
Many sources assert that, should the potency of Carpenter and the Fourth Amendment be attenuated by technology companies implementing a “backdoor” for government access to encrypted information, a Pandora’s box of thorny issues will follow. Firstly, cybersecurity experts believe that the government’s digital infrastructure is not robust enough to be trusted with such delicate power, and that should malicious actors become cognizant that government agencies possess decryption keys, those agencies will be relentlessly targeted. A landmark 2015 report on cryptography, covered by the New York Times and co-authored by notable cyber experts – including crypto progenitor Whitfield Diffie, who testified to Congress regarding the Clipper Chip in the 1990s – finds that “such [government access] will open doors through which criminals and malicious nation-states can attack the very individuals law enforcement seeks to defend”. The rate at which the complexity of cybercrime is proliferating poses a massive threat, especially as countries like Russia and China refuse to punitively clamp down on cybercrime groups within their borders.
On a similar note, a backdoor to encrypted data is just that – a back door, open to more advanced nefarious actors as well. David Gewirtz of ZDNet, an adviser to many government cybersecurity programs, excoriated this possibility: “anytime you build a flaw for the good guys…it leaves a gaping hole where the bad guys can get in…it’s far better for the tech industry to build as solid and powerful products as they can, without any built-in breach points, and then work with law enforcement constructively when there are tangible clues”. Gewirtz’s perspective is valid: why should tech companies ironically weaken their security standards for the sake of law enforcement, exposing unintended vulnerabilities that would be a haven for vindictive hackers? This would make for a deeply problematic scenario for all involved, replete with reputational tarnish.
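To make the concern concrete, consider a deliberately toy sketch of a “built-in breach point”. Everything here is hypothetical and illustrative – the SHA-256 XOR cipher is not a real product’s cryptography, and the MASTER_KEY constant stands in for whatever escrowed secret a backdoor scheme would require. The point it demonstrates: once data is also wrapped for a master-key holder, anyone who obtains that single key can read everything.

```python
import hashlib

def _keystream(key: bytes, n: int) -> bytes:
    """Toy SHA-256 counter keystream (illustrative only, not a real cipher)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    """XOR stream cipher: the same call both encrypts and decrypts."""
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))

MASTER_KEY = b"hypothetical-escrowed-master-key"  # the built-in flaw

def backdoored_encrypt(user_key: bytes, plaintext: bytes):
    """Seal the data under the user's key, AND seal a copy of the user's
    key under the master key -- that second item is the backdoor."""
    return xor_crypt(user_key, plaintext), xor_crypt(MASTER_KEY, user_key)

def backdoor_decrypt(ciphertext: bytes, wrapped_key: bytes) -> bytes:
    """ANY holder of MASTER_KEY -- lawful agency or criminal -- can run this."""
    user_key = xor_crypt(MASTER_KEY, wrapped_key)  # unwrap the user's key
    return xor_crypt(user_key, ciphertext)         # then read the data
```

The security of every user in such a scheme rests on one constant: stealing the master key once is enough to decrypt everyone’s traffic, which is precisely the “gaping hole” critics describe.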
Furthermore, can the government be trusted to understand the technical complexities of decryption keys without consulting the diametrically opposed private sector? For starters, the government would have to coerce tech giants like Facebook and Apple into abandoning “forward secrecy”, which entails the creation of a unique, one-time key to unlock each session’s encryption – a “master key” for all-time use does not exist, and justifiably so: it would be aggressively targeted by adversarial actors.
Forward secrecy is perceived as a marvel in the tech world: because fresh keys are generated for every session and discarded afterward, the compromise of one session’s key exposes nothing about any other. It is analogous to having a separate, complex password for every occasion a user logs into a Gmail account. As tech companies continue to develop forward secrecy’s capabilities, the prospect of the FBI or NSA having unfettered, universal backdoor access becomes ever slimmer.
The issue of encryption sophistication and its centralization surfaces once more with the notion of “key escrow”, wherein a copy of an individual’s decryption key lives on a third party’s servers, hypothetically permitting the government to access it via the tech company by means of a subpoena. This would, however, require tech companies, in subservience to the government, to create “one bank of master keys…Who protects those keys?...it would require significant security to protect”.
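A minimal sketch makes the quoted “bank of master keys” concrete – all names here are hypothetical illustrations, not any real escrow design. Every escrowed key funnels into one vault, so a breach of that vault, or a single misjudged warrant, exposes every depositor at once.

```python
import secrets

class EscrowAgent:
    """Hypothetical third party holding a copy of every user's key.
    The dict below is the 'one bank of master keys' -- a single point
    of failure: whoever breaches it can decrypt every user's data."""

    def __init__(self):
        self._vault = {}  # user_id -> escrowed key copy

    def deposit(self, user_id: str, key: bytes) -> None:
        """Tech company deposits a copy of the user's key at escrow time."""
        self._vault[user_id] = key

    def release_under_subpoena(self, user_id: str, warrant_ok: bool) -> bytes:
        """Release a key only against a valid warrant -- the entire scheme
        hinges on this one check (and on the vault never leaking)."""
        if not warrant_ok:
            raise PermissionError("no valid warrant")
        return self._vault[user_id]

agent = EscrowAgent()
alice_key = secrets.token_bytes(32)
agent.deposit("alice", alice_key)
assert agent.release_under_subpoena("alice", warrant_ok=True) == alice_key
```

Note the asymmetry with forward secrecy: escrow deliberately preserves the long-lived key that forward secrecy deliberately destroys, concentrating exactly the risk the quoted expert warns would “require significant security to protect”.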
A prevailing theme is salient: many of the measures that government agencies would like to have in place would undermine the very point of privacy and encryption. Moreover, encouraging tech companies to build backdoors at a time when hacker groups are causing consternation around the globe does not seem like a prudent plan. Finally, the technicalities of encryption keys do not lend themselves to centralized access, and tech companies will struggle to reconcile their obligations to their users with the clamor of enraged government agencies.
That being said, there are situations of grave importance wherein granting the government access seems appropriate. The tragic December 2015 mass shooting in San Bernardino, California swiftly sparked a fervent, bellicose standoff between the FBI and tech behemoth Apple; rhetoric regarding tech companies’ obligation to disclose decryption keys reached a dramatic apex. Following the gruesome incident, in which two shooters embarked on a spree killing 14 people, the FBI ordered Apple to provide assistance in bypassing the encryption software standard to all of its iPhone products. The functionality in question was a feature by which, after too many failed log-in attempts, the device – in a programmed maneuver to avoid disclosing information – erases itself. Apple CEO Tim Cook vociferously opposed the possibility of the California-headquartered company cooperating, proclaiming that doing so would catalyze a perilous precedent: “The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge”.

While Cook’s claims align with the intentions of this paper, situations that can or will endanger several lives may merit limited government access. Terrorist attacks, threats to national security, and affronts to critical infrastructure are unfortunately pervasive threats that require commensurate, high-level intelligence to effectively combat, mitigate, and curtail altogether. Being bereft of decryption keys can leave these agencies in the dark and prompt them to pursue unconventional, time-consuming means to accomplish their goals: the FBI ended up unlocking the desired iPhone thanks to the expertise of a third-party cybersecurity firm named Azimuth.
What does it say about the increasingly fractious, polarized relationship between tech companies and the government that, had Apple not worried about the looming precedent, it could have facilitated justified law enforcement practices in the wake of an atrocious tragedy? Could the lack of access to encrypted devices seriously hinder the operational harmony that law enforcement agencies should enjoy in carrying out their mission? Exasperated voices abound: while search warrants and wiretap orders have precedent following the Carpenter case, officials remain stymied by the elusiveness of tech companies’ software. “We need to see what’s inside here,” said New York Police Department Deputy Counterterrorism Commissioner John Miller on the news program “48 Hours”. “[We need to see it] just like we did in Mumbai, just like we did after 9/11…they’ll tell you, we can’t. [they] designed it to be uncrackable”.
A compelling reason for why tech companies should ameliorate these access concerns is simply that government agencies will eventually figure out how to gain access independently, circumventing the companies themselves. Ira Goldstein, of cybersecurity giant Herjavec Group, alleges that the question is not if governments will attain access, but when – as with the FBI consulting Azimuth following Apple’s pugnacious refusal. “Governments will always find other methods to monitor data in the name of national security”. Goldstein’s words invoke the NSA’s covert PRISM surveillance program, uncovered by Edward Snowden in 2013: if governments do not feel they have the latitude to call upon tech companies to hand over decryption keys, they will likely resort to creating programs unbeknownst to the public, inviting cultural, political, and judicial backlash.
Lastly, tech companies should occasionally cooperate with the government to reinvent a legislative framework that is not punitive to the company or the customer. Currently, the Lawful Access to Encrypted Data Act of 2020 is pending further resolution in the United States Congress; if passed, it would categorically require tech companies to hand over decryption keys swiftly and with little recourse. This is to say that if tech companies refuse to adhere to legislation they had no power to approve, they could be penalized for wanting to safeguard their users. To further positive dialogue between tech companies and government entities, a new legislative framework should be introduced which does not – for the government’s own cybersecurity sake – surrender decryption keys unless the issue at hand concerns grave danger to human life or infrastructure. Judicially, this exchange should only occur after a “strict scrutiny” review analogous to that applied to First Amendment protections – the crisis at hand must rise to a commensurate level of danger for the disclosure of decryption keys even to be considered.
While the government’s need to obtain decryption keys can be partially justified by the inevitable recurrence of dangerous, intense scenarios, and while tech companies should occasionally cooperate with the government on a “strict scrutiny” basis as decided by a court of law with probable cause, ultimately the government should not have consistent, ubiquitous access to decryption keys.
The implementation of backdoors in software programs containing millions of sensitive files, at a time when cybersecurity threats are at an all-time high, is not prudent; and entrusting government servers to house decryption keys only marks the government itself as a feasible target for those same sinister actors. Moreover, the technicalities of encryption are not conducive to centralized, singular access; forward secrecy, which maximizes security, should be embraced and encouraged by private and public sectors alike. The lessons of the Clipper Chip and Carpenter v. United States (2018) help us collectively grapple with precedent, and though the friction between technology companies and government is a perpetual tale, that which is encrypted must always remain sacred. For when privacy is a relic of a time gone by, security must be paramount.