GEORGETOWN CYBERSECURITY LAW INSTITUTE, Washington, DC,
May 23, 2018
Imagine walking through the front doors of your office on a Thursday morning and immediately receiving a note instructing you not to turn on your work computer for an indefinite period of time. On March 22, this very scenario played out in Atlanta's City Hall, as employees were handed printed instructions that stated, in bold, "Until further notice, please do not log on to your computer." At 5:40 that morning, city officials had been made aware that a particular strain of SamSam ransomware had brought municipal services in Atlanta to a halt. This type of ransomware is known for locking up its victims' files with encryption, temporarily changing those file names to "I'm sorry," and giving victims a week to pay a ransom.
Residents couldn't pay for things like water or parking fines. The municipal courts couldn't validate warrants. Police resorted to writing reports by hand. The city stopped taking employment applications. One city council member lost 16 years of data.
Officials announced that the ransom demand amounted to about $51,000, but have not indicated whether the city paid the ransom. Reports suggest, however, that the city has already spent over $2 million on cybersecurity firms who are helping to restore municipal systems. Atlanta also called in local law enforcement, the FBI, DHS, the Secret Service, and independent forensic experts to help assess what occurred and to protect the city's networks in the future.
Taking a somewhat relaxed approach to cybersecurity, as the situation in Atlanta seems to have demonstrated, is clearly risky, but unfortunately, it is not uncommon. As our reliance on digital technology has increased, both private companies and public sector entities have experienced crippling cyberattacks that brought down essential services. Atlanta is but one example of the pervasiveness of connected technologies and the widespread impact on our lives when those technologies no longer function correctly.
We've reached an inflection point: we now depend upon connected technology to accomplish most of our daily tasks, in both our personal and business lives. At least one forecast predicted that over 20 billion connected devices will be in use by 2020. I hardly need tell the audience at a cybersecurity conference about the nature and scope of our cyber vulnerabilities. What's surprising is not the extent of this vulnerability, but that it has manifested itself in ways that haven't yet had dramatic, society-wide effects, although the Atlanta example is surely a good scare. I suspect we all fear that far more crippling and dangerous cyber incidents are likely in our future, since malicious activity is relatively easy and the increasing pace of connected technology simply increases the target size.
So the time has come - indeed, if it has not already passed - to think seriously about some fundamental questions with respect to our reliance on cyber technologies: How much connected technology do we really want in our daily lives? Do we want the adoption of new connected technologies to be driven purely by innovation and market forces, or should we impose some regulatory constraints? These topics are too big for me to confront here today, and some aspects of these questions will be dealt with at this Conference, so I ask you to keep them in mind. For today, I will concentrate on one area where the legal issues raised by these questions coalesce in a significant way - namely, the privacy implications of our increasingly digital lives.
You may be wondering why I, as the NSA General Counsel, chose to discuss the privacy aspects of cybersecurity with you here today. As you probably know, at NSA we have two equally important missions: foreign electronic surveillance (or "signals intelligence," to use the legal term) to provide our country's leaders and the military with the foreign intelligence they need to keep us safe, and a cybersecurity mission, mostly focused on national security systems. I feel NSA can make a contribution to the privacy discussion because we at NSA - as a direct result of our twin missions - are exceptionally knowledgeable about and sensitive to the need to comply with constitutional, statutory, and regulatory protections for the privacy and civil liberties of Americans.
Although we continue to forge ahead in the development of new connected technologies, it is clear that the legal framework underpinning those technologies has not kept pace. Despite our reliance on the internet and connected technologies, we simply haven't confronted, as a US society, what it means to have privacy in a digital age.
If you look at other technologies that were considered both novel and significant, regulations may have lagged behind, but we didn't let the technology get too far out in advance before laws and societal norms caught up. Take, for example, automobiles. By the time they became pervasive and affordable enough for most families to own, we had already started putting in place laws governing how they should be operated, how they should be safely constructed and maintained, how they should be inspected, and how all of these new rules should be enforced on the federal, state, and local levels. And our society figured out the resultant impact on infrastructure. Some of those decisions were intuitively obvious and others weren't.
You could make the same analogies with electricity or radio - in each case our society sorted out, in a reasonably timely fashion over a few decades, fundamental questions about public versus private ownership and the extent of substantive regulation, whether for economic or safety or other factors. But not so with cyber. Has there ever been a technology that has become this pervasive, this ubiquitous, and this impactful so quickly? It's no wonder our societal norms and legal structures - and here I'm mostly referring to our concepts of privacy - have failed to keep pace.
We all recognize that an unprecedented amount of our personal information is now available online. Although this certainly improves efficiency, facilitates transactions, and enables accessibility of stored information, it also means that we've made our information vulnerable. It could be exposed through a breach or hack, manipulated by a bad actor, sold to a third party, or, as in the Atlanta example, simply made unavailable.
Ironically, increased cybersecurity, which necessarily includes network monitoring and intrusion detection, also has privacy implications. In order to effectively detect whether an attack is occurring or has occurred, information security systems and professionals need to see the activity happening on their networks. This can include, for example, monitoring what's being sent or received by individuals using the networks.
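That tension can be made concrete with a toy example. The sketch below is a hypothetical Python fragment with invented names and thresholds (it does not reflect any real monitoring system); it flags unusually large outbound transfers, and in doing so it necessarily inspects who sent how much data, and to where:

```python
# Hypothetical sketch: even a crude intrusion-detection check must look at
# per-user traffic records. All names and thresholds are illustrative
# assumptions, not any real product's API.

from dataclasses import dataclass


@dataclass
class FlowRecord:
    user: str          # the individual whose traffic this is
    dest_host: str     # where the traffic went
    bytes_sent: int    # volume of data transferred


def flag_suspicious(flows, byte_threshold=10_000_000):
    """Flag flows exceeding a crude data-exfiltration threshold.

    Note that the detector necessarily sees who sent what, and where -
    the privacy-relevant fact described above.
    """
    return [f for f in flows if f.bytes_sent > byte_threshold]


flows = [
    FlowRecord("alice", "mail.example.com", 120_000),
    FlowRecord("bob", "unknown-host.example.net", 50_000_000),
]
alerts = flag_suspicious(flows)  # only bob's oversized transfer is flagged
```

Even this trivial detector records identities and destinations as a byproduct of doing its security job - which is precisely the privacy trade-off at issue.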
Privacy in the US is a notion that has traditionally been tied up in the Fourth Amendment, which states the following:
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
The Fourth Amendment grew out of the experiences of early Americans, who objected to the British Crown's use of general warrants to undermine political adversaries by searching for writings or publications that decried the British leadership and its policies. Later, in colonial America, general warrants were again used to gain entry to colonists' homes to search for goods on which taxes had not been paid. As a result, the Fourth Amendment's concept of privacy revolves largely around the notion of being free from physical intrusion by the government - which is understandable, given the Amendment's text and history.
As you know, however, the word "privacy" itself appears nowhere in the text of the Fourth Amendment. Indeed, if you had reviewed the first hundred years' worth of the Supreme Court's many occasions to examine the Fourth Amendment, you would have found cases focusing mostly on whether a governmental intrusion constituted a physical "search" or "seizure," and on whether evidence obtained in violation of the Fourth Amendment could be excluded from a trial. But not a word about a right to privacy as such. Indeed, in a fascinating case clearly reflecting that Fourth Amendment jurisprudence grew out of this very physical sense of searches and seizures - think of federal agents breaking down doors into your bedroom - Chief Justice William Howard Taft said in the 1928 Olmstead case that wiretapping a telephone conversation didn't amount to a search or seizure, since the evidence in that case was obtained simply by "hearing." So Mr. Olmstead, a manager of a bootleg operation during the Prohibition Era, lost his battle to suppress evidence of some incriminating phone calls and went to jail.
You would have had to wait until 1967, in Katz v. United States, to find the Supreme Court explicitly saying that the Fourth Amendment embraced a right to privacy and that the surveillance of a phone call was a "search" within that amendment. The FBI suspected Katz of transmitting gambling information to clients in other states over a payphone in a public phone booth, so they attached an eavesdropping device to the outside of the booth. The Supreme Court held that the Fourth Amendment's protection against unreasonable searches and seizures covered electronic wiretaps of public payphones. Writing for the majority, Justice Potter Stewart held that the Fourth Amendment protects people, not places, and in his concurrence, Justice Harlan fleshed out a test for identifying a reasonable expectation of privacy. This right was further defined in Smith v. Maryland, a 1979 Supreme Court case in which police asked the telephone company to record the numbers dialed from the telephone of Michael Lee Smith, a man suspected of robbing a woman and then placing threatening calls to her. The Supreme Court held in Smith that there is no reasonable expectation of privacy for information (such as the phone numbers you are dialing) that is voluntarily given to third parties (such as telephone companies). Our Fourth Amendment jurisprudence continued to develop in this manner, with courts largely focusing on the type and location of the surveillance taking place, based upon the facts of each particular case, to determine whether a protected privacy interest was implicated. I might add as an aside that almost nowhere in the case law is the real focus on the substance of the communication, except insofar as you get to consider that by reason of where the communication occurred.
The Supreme Court's review of privacy under the Fourth Amendment culminates at present in the Carpenter case, which is currently pending before the Court. Carpenter deals with whether the warrantless search and seizure of historical cell phone records that reveal the location and movement of the cell phone user violate the Fourth Amendment. The Court's opinion in this case could be structured in one of two ways: either it will be narrowly tailored, with the Court recognizing that the question at issue is tied to specific facts about cell phones, or the opinion will serve as a vehicle to make broader rulings about privacy in this area. Regardless of which structure is chosen and even if the Court writes a brilliant opinion - which I fully expect it to do - the Justices are still evaluating this case through the lens of a changing technology. This opinion, when issued, will likely contribute to our privacy jurisprudence for years to come - and yet, it will be based entirely upon the cell phone technology of today. Twenty years ago, in the era of landline telephones, I'm not certain we could have contemplated the ability to extrapolate a person's location or movements from where their telephone traveled. In another twenty years, will we still be using a device similar to today's cell phone? Probably not, and yet we may well be constrained to apply the Carpenter ruling to whatever device or technology we have in our pockets - or perhaps implanted in our brains - at that time.
This case highlights one of the major limitations in applying Fourth Amendment jurisprudence in the digital era to which I wish to draw your attention. Courts are limited to deciding only the case or controversy before them based upon the set of facts presented, and properly so. But because of the manner in which courts must evaluate cyber devices, our privacy laws in this area are generally backward looking, and intuitively, that feels like the wrong approach when addressing a rapidly developing technology. By contrast, that approach does make sense in, say, the area of tort liability. Law students will recall that Mrs. Palsgraf's injuries on a Long Island Railroad platform were said by then-Chief Judge Benjamin Cardozo of the New York Court of Appeals to not be "proximately caused" by the negligence of railway guards. Think about how that legal concept has been applied in the decades since then. Clearly, in the case of tort law, specific cases can yield general principles that are of considerable utility in application to future, but very different, facts.
But I submit that in the case of rapidly developing technology, a case-specific approach, especially one where the legal premise is grounded in the very technology before the court, is inherently problematic. It results in a patchwork quilt of legal precedent about privacy that takes into account only the particular technology directly before the court in each case, which in turn leads to decisions that are sometimes hard to reconcile or are distinguishable only by factors that seem of dubious significance. Most importantly for both the government and the private sector, it yields a set of legal determinations in this area that are, at best, of uneven predictive value: the government needs to know where the lines are drawn, and the private sector equally wants some degree of certainty as to exactly what will and will not be protected.
Even the Supreme Court has begun to recognize the limitations on its ability to set out a legal framework that suitably marries Fourth Amendment doctrine with emerging technology. In 2012, Justice Sotomayor called into question the third party doctrine's continued practicality in her concurrence in United States v. Jones, writing that "the premise that an individual has no reasonable expectation of privacy in information voluntarily disclosed to third parties…is ill suited to the digital age." In Riley v. California, which was decided in 2014, Chief Justice Roberts wrote that comparing a search through a wallet, purse, or address book to a search of a cell phone "is like saying a ride on horseback is materially indistinguishable from a flight to the moon. Both are ways of getting from point A to point B, but little else justifies lumping them together. Modern cell phones, as a category, implicate privacy concerns far beyond those implicated by the search of a cigarette pack, wallet, or a purse." At some point, as the Justices have signaled, the quantity of information that an individual shares online or stores in an electronic device may, in aggregate, provide a composite picture of that person that is qualitatively deeper and more insightful than any individual piece of information.
Beyond just its inability as currently applied to keep pace with technology, the Fourth Amendment also suffers from even greater limitations when it comes to protecting your privacy: it is powerless to protect you against any private sector activity. It simply doesn't apply to private companies. This legal framework is far different from the notion of privacy that has driven European lawmaking and court decisions. Unlike the US, the concept of privacy in Europe focuses instead on the dignity of the person and it very much extends to private sector activity. Traditionally, this has resulted in laxer regulation of government surveillance, but much stricter laws about, for example, data protection, credit reporting, and workplace privacy. Europeans value the right to be left alone and protection from public embarrassment or humiliation. These concepts are so important to Europeans that they were woven into the EU's Charter of Fundamental Rights, which mandates respect for private and family life and protection of personal data.
As cyber technology has progressed, the European concept of privacy has resulted in relatively strict laws in Europe about the handling of electronic information. For example, the General Data Protection Regulation, or GDPR, which takes effect in just a few days, applies to all EU organizations, to any organization seeking to offer goods or services to people in the EU, and to any company holding or processing the personal data of people in the EU. Companies that fail to comply with the GDPR can incur penalties of up to 4% of annual global turnover or €20 million, whichever is greater. The GDPR seeks to strengthen consent rules, requiring that disclosures be clear, intelligible, and easily accessible, and that it be easy for a user to withdraw consent. It ensures the right to be forgotten, which includes erasing data and preventing its further dissemination. The law also mandates privacy by design; data protection must be designed into systems, rather than added on. And if a data breach occurs, companies must notify regulators within 72 hours.
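For concreteness, the GDPR's top-tier fine ceiling works as a simple maximum: the greater of 4% of a company's worldwide annual turnover or €20 million. A minimal Python sketch of that arithmetic (the function name is invented for illustration):

```python
# Illustrative arithmetic only: the GDPR's administrative-fine ceiling for
# the most serious violations is the greater of 4% of worldwide annual
# turnover or EUR 20 million. The function name is a hypothetical label.

def gdpr_fine_cap(annual_turnover_eur: float) -> float:
    """Return the maximum fine (in EUR) for a top-tier GDPR violation."""
    return max(0.04 * annual_turnover_eur, 20_000_000)


# For EUR 100M in turnover, 4% is EUR 4M, so the EUR 20M floor governs.
cap_small = gdpr_fine_cap(100_000_000)
# For EUR 2B in turnover, 4% is EUR 80M, which exceeds the floor.
cap_large = gdpr_fine_cap(2_000_000_000)
```

The floor matters most for smaller firms: below €500 million in turnover, the €20 million figure controls; above it, the 4% figure does.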
Whether we applaud it or not, the European, Japanese and other nations' movement toward comprehensive privacy regulation forces everyone in this digitally connected world to consider how we are going to reconcile different notions of privacy. We've been persistent in scrutinizing government intrusion into our daily lives - which is certainly a worthy focus - but have we done so at the expense of our personal dignity or the integrity of our private information, particularly given the rapid pace of technological development? Some might say that Europe's approach is better suited to manage the privacy challenges posed by the digital age. Let me make crystal clear that the NSA is not, and I am not, advocating for diminished privacy protections or an increased ability to conduct surveillance. Rather, we at NSA feel duty bound to discuss these types of issues, and we'd like to do so transparently and openly to help reach a consensus as to the best approach.
My key point is that I believe we no longer have the luxury of addressing this issue in an ad hoc fashion through our court system, which is largely where our privacy laws have been shaped to date. With Europe pushing ever more aggressive data protection laws, the choice may soon be out of our hands. Companies operating internationally are being forced to adapt their policies and procedures to adhere to regulations implemented in foreign countries. If we want to play a role in shaping those policies to suit our own notions of privacy, we need an overarching effort to address privacy and digital technology here in the US.
The public and private sectors will need to take a holistic approach to addressing privacy concerns associated with our increasing reliance on digital technologies. Similar to the way that new drugs must be reviewed and approved for safety and efficacy before they come to market, perhaps we need laws or regulations requiring review of privacy and cybersecurity safeguards in new connected products before they can be made available to the public. Perhaps, as in Europe, we need stronger notice and consent requirements to regulate how our personal information can be used, shared, or disseminated online. Or perhaps, in some industries, we need to mandate the adoption of low-tech redundancies to safeguard against the loss or manipulation of personal information stored online. This need not necessarily entail government regulation, as industry-generated approaches might be sufficient - but my point is simply that we must have a societal dialogue about how we want to confront the problem.
We also need to consider what privacy means to us here in the US. Because we've emphasized freedom from government surveillance in our current privacy regime, and because the fact-specific legal analysis of that surveillance has focused, as discussed above, on the type and location of the surveillance, the same piece of electronic personal information may be protected from interception by the government, but could be disseminated, sold, or otherwise used by a private company with few, if any, limitations. In only narrow areas, such as HIPAA regulations for health records and Fair Credit Reporting Act requirements for financial records, do our laws focus on the type of information at issue. As I noted earlier about the absence of a focus on the content of communications, we could, for example, have a privacy scheme that was dependent in greater part on the substantive nature of communications rather than how communications are collected. To address these inconsistencies that have grown up around our legal privacy framework, we must evaluate carefully the manner in which private companies rely on connected technology to carry out their business activities. This includes considering not only how and when they collect personal information from customers, whether to store it online, and whether and to whom it should be disseminated, but also whether relying solely upon networked devices and systems is even the right choice for certain activities when particular sensitivities may be involved.
We also can't forget that each one of us bears a great deal of personal responsibility for our own private information. Regardless of what steps the government ultimately takes, we need to maintain awareness of, and exercise some amount of discretion about, how we expose our personal data over the internet.
As you spend the next day thinking about the cybersecurity problem and how to address it, please continue to keep in mind the possible ramifications for privacy and consider how our laws should be structured to account for them. Trends suggest that privacy norms are likely to be the most pressing issue with respect to digital technology over the next decade. There are no longer excuses to be made for being caught flat-footed with respect to the security of our networks and systems, particularly when those systems store or transmit personal information. As many have noted, if we fail to coalesce around an accepted concept of privacy, and that failure delays or impairs an effective cybersecurity program, then Americans' privacy will be put at even greater risk.