Since the 1990s, U.S. law enforcement has expressed concern about “going dark,” defined as an inability to access encrypted communications or data even with a court order.
Silicon Valley companies are rolling out encrypted products that allow users alone to access their data, and in the wake of the Paris and San Bernardino terrorist attacks, law enforcement officials argue that their fears are being realized.
The FBI is engaged in a public battle with Apple over access to data stored on the iPhone of one of the San Bernardino attackers and cautions that encrypted messaging apps could hinder the bureau’s ability to uncover terror plots.
To prevent future attacks, law enforcement has urged U.S. tech giants to build in “backdoors” or “front doors” to their products — essentially, the technical ability to decrypt communications pursuant to a warrant.
Silicon Valley argues that any solution allowing someone other than the data’s owner to decrypt communications could be exploited by criminals and state actors, weakening security for everyone.
Moreover, the tech world points out that many foreign companies and groups have developed their own encrypted products and services, meaning anti-encryption policies would only hurt the competitiveness of U.S. companies without providing access to much of the suspect communications.
Despite technologists’ insistence that the problem is intractable, U.S. officials maintain there is a workaround and have sought to compel tech firms’ assistance in breaking into encrypted devices.
This debate has been dominated by absolutists. Some cybersecurity experts and privacy advocates are loath to concede that going dark is a problem at all, while many in law enforcement are scornful of what they see as decisions motivated by business interests and remain adamant that anything less than a real-time, on-demand decryption capability is unacceptable.
It does not have to be like this. There are solutions that allow law enforcement to gather the evidence it needs without introducing encryption backdoors. Here are three worthy of consideration:
First, Congress could empower law enforcement to exploit existing security flaws. Put simply, law enforcement should have the ability to hack into a suspect’s smartphone or computer with a court order, such as a warrant.
As some prominent computer security experts have argued, lawful hacking would allow authorities to use existing vulnerabilities to obtain evidence instead of creating new backdoors.
Although this would entail law enforcement adopting the same techniques as criminals, tight judicial oversight would ensure the practice is employed responsibly, much as the restrictions that already govern wiretapping do.
Second, the executive branch should explore developing a national capacity to decrypt data for law enforcement. The challenge of going dark affects state and local law enforcement the most: They are the least likely to have the resources and technical capabilities to decrypt data relevant to investigations. Creating a national decryption capability, housed within the FBI and drawing upon the expertise of the National Security Agency, would provide assistance to state and local law enforcement, similar to what the FBI provides for fingerprint and biometric data.
Third, and most important, law enforcement must improve its tech literacy. Law enforcement was confronted with a problem akin to going dark when, in the 1990s, organized-crime suspects started using disposable phones that hampered wiretaps. Nevertheless, law enforcement adapted, and arrests and prosecutions of organized-crime suspects continued.
Running into an encrypted communication does not necessarily mean an evidence trail will go cold. Encryption can occur in three places: on a device, as data are transmitted, and when they are stored in the cloud. Encryption along one avenue does not mean the other two avenues will also be encrypted.
For example, Apple can access the content of an encrypted iPhone if it has been backed up to iCloud. Recognizing how and when encryption occurs, and the different security offerings of service providers, may help law enforcement find alternative routes to the same evidence. Better tech literacy might have avoided the current Apple-FBI fight: The FBI could have obtained more information from the San Bernardino attacker’s iPhone if it had not hastily ordered the county to reset his iCloud password.
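The point about independent avenues can be sketched as a toy model (this is purely illustrative, not a real protocol or any provider’s actual architecture; the class and function names are invented for the example): a copy of a suspect’s data may be encrypted at some layers and not others, and any unencrypted layer is a potential route to evidence.

```python
from dataclasses import dataclass

# Illustrative model only: the three places data may be encrypted
# (device, transit, cloud) are independent of one another.
@dataclass
class DataCopy:
    on_device: bool   # encrypted at rest on the handset
    in_transit: bool  # encrypted while transmitted (e.g., via TLS)
    in_cloud: bool    # encrypted in the provider's cloud backup

def accessible_avenues(copy: DataCopy) -> list:
    """Return the avenues where plaintext could still be obtained."""
    avenues = []
    if not copy.on_device:
        avenues.append("device")
    if not copy.in_transit:
        avenues.append("transit")
    if not copy.in_cloud:
        avenues.append("cloud backup")
    return avenues

# A locked phone whose backup the provider can still read:
copy = DataCopy(on_device=True, in_transit=True, in_cloud=False)
print(accessible_avenues(copy))  # ['cloud backup']
```

In this toy model, the iCloud scenario above corresponds to the final case: the handset itself is locked, but the cloud backup remains a viable avenue.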
These proposals will not be fully acceptable to technologists or law enforcement. Some in tech will recoil at the idea of the NSA supporting law enforcement, while some in law enforcement will resent the need to keep pace with Silicon Valley’s offerings. But no one-size-fits-all solution is likely. The debate has been going on in one form or another for more than 20 years. It’s time to consider some realistic solutions.